## Swedish BERT models for sentiment analysis: sentiment targets
[Recorded Future](https://www.recordedfuture.com/) together with [AI Sweden](https://www.ai.se/en) releases two language models for target/role assignment in Swedish. The two models are based on [KB/bert-base-swedish-cased](https://huggingface.co/KB/bert-base-swedish-cased) and have been fine-tuned to solve a Named Entity Recognition (NER) token classification task.
These are downstream models intended to be used in conjunction with the [Swedish violence sentiment classifier](https://huggingface.co/RecordedFuture/Swedish-Sentiment-Violence) or the [Swedish fear sentiment classifier](https://huggingface.co/RecordedFuture/Swedish-Sentiment-Fear). The models are trained to tag the parts of sentences that have received a positive classification from the upstream sentiment classifier, i.e. the targets that the upstream model has activated on.
The NER sentiment target models work as standalone models, but their recommended application is downstream from a sentence classification model.
The models are trained only on Swedish data and support inference of Swedish input texts only. The models' inference metrics are not defined for non-Swedish inputs; such inputs are considered out-of-domain data.
The current models are supported at Transformers version >= 4.3.3 and Torch version 1.8.0; compatibility with older versions is not verified.
### Fear targets
The model can be imported from the transformers library by running

```
from transformers import BertForTokenClassification, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("RecordedFuture/Swedish-Sentiment-Fear-Targets")
classifier_fear_targets = BertForTokenClassification.from_pretrained("RecordedFuture/Swedish-Sentiment-Fear-Targets")
```

When the model and tokenizer are initialized, the model can be used for inference.
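Once loaded, tagging is a standard token-classification pass. A minimal sketch follows; the example sentence is illustrative, and the tag names are read from the model's own `id2label` config rather than stated on this card.

```
import torch

# Illustrative Swedish input; any Swedish sentence works.
text = "Han var mycket rädd för hunden."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = classifier_fear_targets(**inputs).logits  # (1, seq_len, num_labels)

# Pick the highest-scoring tag per token and map ids to names via the config.
predicted_ids = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
for token, tag_id in zip(tokens, predicted_ids):
    print(token, classifier_fear_targets.config.id2label[tag_id.item()])
```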
#### Verification metrics
During training the Fear target model had the following verification metrics when using "any overlap" as the evaluation metric.
| F-score | Precision | Recall |
|:-------:|:---------:|:------:|
| 0.8361  | 0.7903    | 0.8876 |
### Violence targets
The model can be imported from the transformers library by running

```
from transformers import BertForTokenClassification, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("RecordedFuture/Swedish-Sentiment-Violence-Targets")
classifier_violence_targets = BertForTokenClassification.from_pretrained("RecordedFuture/Swedish-Sentiment-Violence-Targets")
```

When the model and tokenizer are initialized, the model can be used for inference.
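As noted above, the recommended application is downstream from a sentence classifier. The sketch below chains the upstream [Swedish-Sentiment-Violence](https://huggingface.co/RecordedFuture/Swedish-Sentiment-Violence) classifier with the target model; the gating threshold and the way scores are combined are illustrative assumptions, not part of this card.

```
import torch
from transformers import BertForSequenceClassification, BertTokenizerFast

# Upstream sentence classifier (see the Swedish-Sentiment-Violence card).
upstream_tokenizer = BertTokenizerFast.from_pretrained("RecordedFuture/Swedish-Sentiment-Violence")
upstream = BertForSequenceClassification.from_pretrained("RecordedFuture/Swedish-Sentiment-Violence")

def tag_targets_if_violent(text, threshold=0.5):
    # threshold is a hypothetical gating value, not taken from the card.
    enc = upstream_tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        scores = torch.softmax(upstream(**enc).logits, dim=-1)[0]
    # Assumption: the non-"Negative" scores indicate an activated sentence.
    if scores[1:].sum().item() < threshold:
        return None  # upstream did not activate; skip target tagging
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        tag_ids = classifier_violence_targets(**enc).logits.argmax(dim=-1)[0]
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0].tolist())
    return [(tok, classifier_violence_targets.config.id2label[i.item()])
            for tok, i in zip(tokens, tag_ids)]
```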
#### Verification metrics
During training the Violence target model had the following verification metrics when using "any overlap" as the evaluation metric.
| F-score | Precision | Recall |
|:-------:|:---------:|:------:|
| 0.7831  | 0.9155    | 0.8442 |
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"token-classification",
"sv",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"sv"
] | TAGS
#transformers #pytorch #tf #jax #bert #token-classification #sv #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
## Swedish BERT models for sentiment analysis, Sentiment targets.
Recorded Future together with AI Sweden releases two language models for target/role assignment in Swedish. The two models are based on the KB/bert-base-swedish-cased, the models as has been fine tuned to solve a Named Entety Recognition(NER) token classification task.
This is a downstream model to be used in conjunction with the Swedish violence sentiment classifier or Swedish violence sentiment classifier. The models are trained to tag parts of sentences that has recieved a positive classification from the upstream sentiment classifier. The model will tag parts of sentences that contains the targets that the upstream model has activated on.
The NER sentiment target models do work as standalone models but their recommended application is downstreamfrom a sentence classification model.
The models are only trained on Swedish data and only supports inference of Swedish input texts. The models inference metrics for all non-Swedish inputs are not defined, these inputs are considered as out of domain data.
The current models are supported at Transformers version >= 4.3.3 and Torch version 1.8.0, compatibility with older versions are not verified.
### Fear targets
The model can be imported from the transformers library by running
from transformers import BertForSequenceClassification, BertTokenizerFast
tokenizer = BertTokenizerFast.from_pretrained("RecordedFuture/Swedish-Sentiment-Fear-Targets")
classifier_fear_targets= BertForTokenClassification.from_pretrained("RecordedFuture/Swedish-Sentiment-Fear-Targets")
When the model and tokenizer are initialized the model can be used for inference.
#### Verification metrics
During training the Fear target model had the following verification metrics when using "any overlap" as the evaluation metric.
| F-score | Precision | Recall |
|:-------------------------:|:-------:|:---------:|:------:|
| 0.8361 | 0.7903 | 0.8876 |
#### Swedish-Sentiment-Violence
The model be can imported from the transformers library by running
from transformers import BertForSequenceClassification, BertTokenizerFast
tokenizer = BertTokenizerFast.from_pretrained("RecordedFuture/Swedish-Sentiment-Violence-Targets")
classifier_violence_targets = BertForTokenClassification.from_pretrained("RecordedFuture/Swedish-Sentiment-Violence-Targets")
When the model and tokenizer are initialized the model can be used for inference.
#### Verification metrics
During training the Violence target model had the following verification metrics when using "any overlap" as the evaluation metric.
| F-score | Precision | Recall |
|:-------------------------:|:-------:|:---------:|:------:|
| 0.7831| 0.9155| 0.8442 | | [
"## Swedish BERT models for sentiment analysis, Sentiment targets. \nRecorded Future together with AI Sweden releases two language models for target/role assignment in Swedish. The two models are based on the KB/bert-base-swedish-cased, the models as has been fine tuned to solve a Named Entety Recognition(NER) token classification task.\n\nThis is a downstream model to be used in conjunction with the Swedish violence sentiment classifier or Swedish violence sentiment classifier. The models are trained to tag parts of sentences that has recieved a positive classification from the upstream sentiment classifier. The model will tag parts of sentences that contains the targets that the upstream model has activated on. \n\nThe NER sentiment target models do work as standalone models but their recommended application is downstreamfrom a sentence classification model. \n\nThe models are only trained on Swedish data and only supports inference of Swedish input texts. The models inference metrics for all non-Swedish inputs are not defined, these inputs are considered as out of domain data.\n\nThe current models are supported at Transformers version >= 4.3.3 and Torch version 1.8.0, compatibility with older versions are not verified.",
"### Fear targets\n\nThe model can be imported from the transformers library by running\n\n from transformers import BertForSequenceClassification, BertTokenizerFast\n \n tokenizer = BertTokenizerFast.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Fear-Targets\")\n classifier_fear_targets= BertForTokenClassification.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Fear-Targets\") \n\nWhen the model and tokenizer are initialized the model can be used for inference.",
"#### Verification metrics \n\nDuring training the Fear target model had the following verification metrics when using \"any overlap\" as the evaluation metric. \n\n\n| F-score | Precision | Recall |\n|:-------------------------:|:-------:|:---------:|:------:|\n| 0.8361 | 0.7903 | 0.8876 |",
"#### Swedish-Sentiment-Violence\nThe model be can imported from the transformers library by running\n\n from transformers import BertForSequenceClassification, BertTokenizerFast\n \n tokenizer = BertTokenizerFast.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Violence-Targets\")\n classifier_violence_targets = BertForTokenClassification.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Violence-Targets\") \n\nWhen the model and tokenizer are initialized the model can be used for inference.",
"#### Verification metrics \nDuring training the Violence target model had the following verification metrics when using \"any overlap\" as the evaluation metric. \n\n| F-score | Precision | Recall |\n|:-------------------------:|:-------:|:---------:|:------:|\n| 0.7831| 0.9155| 0.8442 |"
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #token-classification #sv #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"## Swedish BERT models for sentiment analysis, Sentiment targets. \nRecorded Future together with AI Sweden releases two language models for target/role assignment in Swedish. The two models are based on the KB/bert-base-swedish-cased, the models as has been fine tuned to solve a Named Entety Recognition(NER) token classification task.\n\nThis is a downstream model to be used in conjunction with the Swedish violence sentiment classifier or Swedish violence sentiment classifier. The models are trained to tag parts of sentences that has recieved a positive classification from the upstream sentiment classifier. The model will tag parts of sentences that contains the targets that the upstream model has activated on. \n\nThe NER sentiment target models do work as standalone models but their recommended application is downstreamfrom a sentence classification model. \n\nThe models are only trained on Swedish data and only supports inference of Swedish input texts. The models inference metrics for all non-Swedish inputs are not defined, these inputs are considered as out of domain data.\n\nThe current models are supported at Transformers version >= 4.3.3 and Torch version 1.8.0, compatibility with older versions are not verified.",
"### Fear targets\n\nThe model can be imported from the transformers library by running\n\n from transformers import BertForSequenceClassification, BertTokenizerFast\n \n tokenizer = BertTokenizerFast.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Fear-Targets\")\n classifier_fear_targets= BertForTokenClassification.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Fear-Targets\") \n\nWhen the model and tokenizer are initialized the model can be used for inference.",
"#### Verification metrics \n\nDuring training the Fear target model had the following verification metrics when using \"any overlap\" as the evaluation metric. \n\n\n| F-score | Precision | Recall |\n|:-------------------------:|:-------:|:---------:|:------:|\n| 0.8361 | 0.7903 | 0.8876 |",
"#### Swedish-Sentiment-Violence\nThe model be can imported from the transformers library by running\n\n from transformers import BertForSequenceClassification, BertTokenizerFast\n \n tokenizer = BertTokenizerFast.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Violence-Targets\")\n classifier_violence_targets = BertForTokenClassification.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Violence-Targets\") \n\nWhen the model and tokenizer are initialized the model can be used for inference.",
"#### Verification metrics \nDuring training the Violence target model had the following verification metrics when using \"any overlap\" as the evaluation metric. \n\n| F-score | Precision | Recall |\n|:-------------------------:|:-------:|:---------:|:------:|\n| 0.7831| 0.9155| 0.8442 |"
] | [
50,
274,
139,
89,
148,
87
] | [
"passage: TAGS\n#transformers #pytorch #tf #jax #bert #token-classification #sv #license-mit #autotrain_compatible #endpoints_compatible #region-us \n## Swedish BERT models for sentiment analysis, Sentiment targets. \nRecorded Future together with AI Sweden releases two language models for target/role assignment in Swedish. The two models are based on the KB/bert-base-swedish-cased, the models as has been fine tuned to solve a Named Entety Recognition(NER) token classification task.\n\nThis is a downstream model to be used in conjunction with the Swedish violence sentiment classifier or Swedish violence sentiment classifier. The models are trained to tag parts of sentences that has recieved a positive classification from the upstream sentiment classifier. The model will tag parts of sentences that contains the targets that the upstream model has activated on. \n\nThe NER sentiment target models do work as standalone models but their recommended application is downstreamfrom a sentence classification model. \n\nThe models are only trained on Swedish data and only supports inference of Swedish input texts. The models inference metrics for all non-Swedish inputs are not defined, these inputs are considered as out of domain data.\n\nThe current models are supported at Transformers version >= 4.3.3 and Torch version 1.8.0, compatibility with older versions are not verified.### Fear targets\n\nThe model can be imported from the transformers library by running\n\n from transformers import BertForSequenceClassification, BertTokenizerFast\n \n tokenizer = BertTokenizerFast.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Fear-Targets\")\n classifier_fear_targets= BertForTokenClassification.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Fear-Targets\") \n\nWhen the model and tokenizer are initialized the model can be used for inference."
] | [
-0.059673670679330826,
0.08596284687519073,
-0.00209223385900259,
0.0778675228357315,
0.05924049764871597,
-0.0388716384768486,
0.1532924473285675,
0.03270586580038071,
0.02304503321647644,
0.020519211888313293,
0.006586003117263317,
0.0005939679685980082,
0.08894176036119461,
0.06025172770023346,
-0.0100775808095932,
-0.24627633392810822,
0.05730045959353447,
-0.07883831858634949,
0.08878698945045471,
0.0709085613489151,
0.10897312313318253,
-0.023753046989440918,
0.0396699458360672,
0.05275839939713478,
-0.03411471098661423,
-0.039770592004060745,
0.0030922756996005774,
-0.02956930175423622,
0.06385844200849533,
0.07269132137298584,
0.06639902293682098,
-0.03585989028215408,
0.07964713126420975,
-0.05662848427891731,
0.01977580599486828,
0.019896451383829117,
0.002552290912717581,
0.01901964284479618,
0.08747578412294388,
-0.005640209652483463,
0.11217235773801804,
-0.009576697833836079,
0.06779322028160095,
0.052336860448122025,
-0.03361846134066582,
-0.10128448903560638,
-0.11201459169387817,
0.07291432470083237,
0.06875192373991013,
0.040479667484760284,
-0.021784741431474686,
0.10979088395833969,
-0.0077867647632956505,
0.059521324932575226,
0.0346815325319767,
-0.06306767463684082,
-0.06526695191860199,
0.05868867039680481,
-0.024202456697821617,
0.032708827406167984,
-0.0626462996006012,
0.008785933256149292,
-0.0013311264337971807,
0.05339976027607918,
0.11512584239244461,
-0.016041293740272522,
0.0403338260948658,
-0.07460742443799973,
-0.1213657483458519,
-0.0009949331870302558,
0.08646924048662186,
0.02610590308904648,
-0.12438248842954636,
-0.18348340690135956,
-0.004663245286792517,
0.05893785133957863,
0.006666236091405153,
-0.048591211438179016,
0.008644131012260914,
-0.020532837137579918,
0.1683090627193451,
-0.04912901297211647,
-0.09294474124908447,
0.005855902098119259,
-0.003409271826967597,
0.1038835421204567,
-0.019009895622730255,
0.02426827698945999,
0.007946087047457695,
0.029997292906045914,
-0.1813090294599533,
-0.09490258246660233,
-0.04586679860949516,
-0.1164180114865303,
-0.08079949021339417,
-0.01938432641327381,
-0.02498115971684456,
-0.22953857481479645,
-0.05269432067871094,
0.04670262709259987,
-0.024300619959831238,
-0.0006622251239605248,
-0.05568600073456764,
0.004786781035363674,
0.10880287736654282,
0.09145227819681168,
-0.006278451066464186,
-0.053233902901411057,
0.04701634868979454,
0.0069270506501197815,
0.031605008989572525,
-0.019877268001437187,
-0.05334625020623207,
0.010277093388140202,
0.015095315873622894,
0.01047387532889843,
0.0006479620933532715,
0.06590180099010468,
-0.051067106425762177,
-0.05643485113978386,
0.11708609014749527,
-0.11663752794265747,
-0.03441724553704262,
0.003827642882242799,
-0.048366717994213104,
0.10442781448364258,
0.06597279757261276,
-0.040321022272109985,
-0.03769959881901741,
0.023437611758708954,
-0.006662061437964439,
-0.02589350938796997,
-0.09202317148447037,
-0.18242396414279938,
0.0574122779071331,
-0.0909971296787262,
-0.019002746790647507,
-0.12952111661434174,
-0.19250687956809998,
-0.0920170322060585,
0.029510805383324623,
-0.058783501386642456,
0.006338759325444698,
-0.08156487345695496,
-0.07467605173587799,
-0.010275226086378098,
0.002357470104470849,
-0.07826078683137894,
-0.009567372500896454,
-0.02826855331659317,
-0.11795351654291153,
0.0845099613070488,
-0.0029270690865814686,
-0.014013170264661312,
-0.13245807588100433,
-0.02514076419174671,
-0.25935474038124084,
0.11234921962022781,
-0.10656608641147614,
0.048385292291641235,
-0.05116220563650131,
-0.01450625341385603,
0.07303828746080399,
0.04475438967347145,
-0.0021353308111429214,
0.18868671357631683,
-0.3021610379219055,
-0.008444187231361866,
0.09322700649499893,
-0.21327126026153564,
0.003273604204878211,
0.05391883850097656,
-0.044112108647823334,
0.14965859055519104,
0.14552219212055206,
0.1839563399553299,
0.01220748107880354,
-0.046162430197000504,
-0.039692752063274384,
-0.023958338424563408,
-0.07848058640956879,
0.07520545274019241,
0.03220019489526749,
0.0036263384390622377,
0.031455148011446,
0.045061685144901276,
-0.09046431630849838,
-0.0030028431210666895,
0.029112083837389946,
-0.015005839988589287,
0.030424002557992935,
-0.04302504286170006,
0.03523660823702812,
-0.018829122185707092,
-0.015204706229269505,
-0.04034482687711716,
-0.08734836429357529,
0.07390471547842026,
0.0342501699924469,
-0.03454088792204857,
0.011067998595535755,
-0.06110847368836403,
0.0165751650929451,
-0.042033396661281586,
0.015391523949801922,
-0.13565693795681,
-0.15814000368118286,
-0.012279419228434563,
-0.07515560835599899,
0.05636851117014885,
0.10104110091924667,
0.03518655523657799,
0.041107889264822006,
0.004661539103835821,
-0.007275039330124855,
-0.01134277880191803,
0.013865470886230469,
-0.026407357305288315,
-0.16594085097312927,
-0.05664315074682236,
-0.07496684044599533,
0.03649619594216347,
-0.16580213606357574,
0.024627307429909706,
0.05398152023553848,
0.07310318946838379,
0.04753879830241203,
-0.043669719249010086,
0.0825803130865097,
0.027730880305171013,
-0.00724848173558712,
-0.01841847412288189,
0.039978355169296265,
0.011094260960817337,
-0.002766688819974661,
0.15798678994178772,
-0.05326369032263756,
-0.10584399849176407,
0.04667442664504051,
0.10508165508508682,
-0.1166081577539444,
0.005056914873421192,
0.019935932010412216,
0.018832648172974586,
-0.036080412566661835,
-0.016513297334313393,
0.08099257200956345,
0.07459846138954163,
0.0823575109243393,
-0.07427752763032913,
-0.002716759918257594,
0.0446336604654789,
-0.10555226355791092,
-0.000999554293230176,
0.07204856723546982,
-0.06272652745246887,
-0.1571083515882492,
0.05455618351697922,
-0.051778990775346756,
0.046324051916599274,
0.24794062972068787,
0.04291890189051628,
-0.06704232096672058,
-0.016267770901322365,
-0.08774948120117188,
0.00475533539429307,
0.109293632209301,
0.04487759247422218,
0.030041072517633438,
0.041818585246801376,
0.023520605638623238,
-0.03514315187931061,
-0.025056658312678337,
0.02109670080244541,
0.019236033782362938,
0.005938397254794836,
-0.007738742511719465,
0.05482476204633713,
0.005062923301011324,
0.05812237411737442,
-0.008449693210422993,
0.030330872163176537,
0.013102793134748936,
-0.05563744530081749,
-0.13300703465938568,
0.1284288913011551,
-0.11135368794202805,
-0.22727032005786896,
-0.12314392626285553,
-0.011408472433686256,
-0.10107230395078659,
0.010973185300827026,
0.03945467248558998,
-0.10297996550798416,
-0.08952002227306366,
-0.11385698616504669,
0.07556690275669098,
0.0964256078004837,
-0.067908376455307,
-0.06837770342826843,
-0.004074214491993189,
0.0013471620623022318,
-0.10483426600694656,
-0.023169878870248795,
-0.05813378840684891,
-0.052386991679668427,
0.03657509759068489,
0.05369038134813309,
0.029598137363791466,
0.09235494583845139,
0.006270936690270901,
-0.01735272817313671,
0.012035781517624855,
0.11792877316474915,
0.023347480222582817,
0.12968756258487701,
0.09978312999010086,
-0.11016524583101273,
0.06443586945533752,
0.07840897142887115,
0.02803615853190422,
-0.03277471289038658,
-0.004123901017010212,
0.053125545382499695,
-0.09001294523477554,
-0.1723322570323944,
-0.11428181082010269,
0.02775096893310547,
-0.01091099064797163,
0.0289070513099432,
0.006550219375640154,
0.025378551334142685,
0.08975014090538025,
0.020162874832749367,
-0.11663006246089935,
-0.025493044406175613,
0.10001539438962936,
0.09141316264867783,
-0.040413998067379,
0.05194011703133583,
-0.048945315182209015,
0.09367889910936356,
0.08554640412330627,
-0.11265293508768082,
0.137332484126091,
-0.051686305552721024,
-0.03773472458124161,
0.08282303810119629,
0.03750402107834816,
0.10629719495773315,
0.11532699316740036,
-0.01066497527062893,
-0.013139700517058372,
-0.023056989535689354,
-0.08982136845588684,
-0.10962187498807907,
0.09288597851991653,
-0.14478452503681183,
0.04193796217441559,
-0.0636049211025238,
-0.03809152916073799,
0.04561074450612068,
0.16412493586540222,
-0.029435325413942337,
-0.19502969086170197,
-0.16222043335437775,
0.035073935985565186,
-0.06337232142686844,
-0.05730082094669342,
-0.05767010152339935,
0.05088595673441887,
-0.14202678203582764,
0.07362169027328491,
-0.03988506272435188,
0.0732421800494194,
-0.07787469029426575,
0.022586433216929436,
0.010978984646499157,
0.0647725984454155,
-0.017226440832018852,
-0.009746847674250603,
0.04224478080868721,
0.11547961086034775,
0.005410656798630953,
0.10491130501031876,
-0.09860555082559586,
-0.009412863291800022,
0.03108822926878929,
0.05490744113922119,
0.15604472160339355,
0.04106967896223068,
-0.030869396403431892,
-0.03812035918235779,
-0.0508275181055069,
-0.003961305133998394,
-0.01747150905430317,
-0.07020263373851776,
0.10705720633268356,
0.03191855177283287,
0.039898764342069626,
-0.05824624374508858,
-0.06514735519886017,
-0.06758873909711838,
-0.10854855924844742,
0.012611018493771553,
-0.05255364626646042,
0.08605360984802246,
-0.028377411887049675,
-0.07889313995838165,
-0.06333570182323456,
0.18659767508506775,
-0.07819584012031555,
-0.14901041984558105,
-0.0978814885020256,
-0.022262340411543846,
0.03237064182758331,
-0.044762663543224335,
0.036998450756073,
-0.018300848081707954,
0.13503144681453705,
-0.08342783898115158,
-0.04188111051917076,
0.015748875215649605,
-0.1033506765961647,
-0.08658409863710403,
0.0034391849767416716,
0.09630551934242249,
0.1877887099981308,
0.007256242912262678,
0.07699878513813019,
0.040830690413713455,
0.07564647495746613,
-0.10380438715219498,
-0.06552108377218246,
0.24816913902759552,
0.014567532576620579,
0.0029251733794808388,
-0.04090440273284912,
-0.12402789294719696,
-0.01299726590514183,
-0.032124489545822144,
0.07279264181852341,
0.1678987741470337,
-0.07295975089073181,
0.1836860328912735,
0.15636025369167328,
-0.09447377920150757,
-0.20265592634677887,
0.04144520312547684,
0.10277200490236282,
0.03179966285824776,
0.1534804254770279,
-0.11476254463195801,
0.060540515929460526,
-0.01089534256607294,
0.004627280868589878,
-0.059417758136987686,
-0.0868273377418518,
-0.06721312552690506,
0.040964994579553604,
0.0442812442779541,
0.11145055294036865,
0.009360863827168941,
-0.03554842993617058,
0.004588237032294273,
-0.026836108416318893,
0.17119742929935455,
-0.004231794737279415,
0.019723501056432724,
0.000936330936383456,
0.1129230484366417,
0.07788991183042526,
-0.040037013590335846,
0.12027859687805176,
0.04030778631567955,
0.048190321773290634,
-0.14078164100646973,
0.0733637809753418,
0.11326111853122711,
-0.045693203806877136,
0.16737444698810577,
0.12230836600065231,
-0.011306042782962322,
-0.12289877235889435,
-0.07310136407613754,
-0.07234174013137817,
0.07518994808197021,
-0.014327192679047585,
-0.061397355049848557,
-0.005031885579228401,
0.11901140213012695,
0.08343947678804398,
-0.03999630734324455,
-0.01758413016796112,
-0.1529228836297989,
0.007322647608816624,
0.17845670878887177,
0.13571301102638245,
0.021973758935928345,
-0.0669771060347557,
0.021872876212000847,
-0.03155605494976044,
0.06224506348371506,
-0.12325632572174072,
-0.009334376081824303,
0.036410506814718246,
-0.00020257258438505232,
0.053252626210451126,
-0.05388770252466202,
-0.11099892854690552,
0.048515863716602325,
0.04852727800607681,
-0.145992249250412,
-0.06634467840194702,
-0.005718972999602556,
0.004693122580647469,
-0.04237710312008858,
-0.00822546798735857,
0.20348875224590302,
-0.09398261457681656,
0.0008283946663141251,
-0.03830447793006897,
0.06730259209871292,
-0.014434886164963245,
0.0011653484543785453,
0.008487912826240063,
0.016161438077688217,
-0.06872697174549103,
0.11078277230262756,
-0.03950691968202591,
-0.1008647233247757,
0.10248979926109314,
0.10948459804058075,
-0.11633793264627457,
-0.05004282668232918,
-0.11598356068134308,
0.13512006402015686,
-0.1230413094162941,
-0.05135008692741394,
0.030769724398851395,
-0.05662601441144943,
-0.031217379495501518,
0.14208456873893738,
0.04846576601266861,
0.006168670486658812,
-0.08042287081480026,
-0.020419102162122726,
-0.06636594235897064,
0.027651000767946243,
0.09050669521093369,
-0.07284089177846909,
-0.015436021611094475,
0.004121784586459398,
0.004469543695449829,
-0.015674447640776634,
-0.031207723543047905,
-0.056642647832632065,
-0.04345046356320381,
-0.04272036999464035,
-0.10665099322795868,
-0.027562662959098816,
-0.08415722846984863,
0.012517075054347515,
-0.030260875821113586,
0.006758634466677904,
0.047421328723430634,
0.028799818828701973,
-0.025453943759202957,
0.045055970549583435,
-0.02448102831840515,
0.05562049150466919,
-0.08000368624925613,
-0.02094447985291481,
-0.019193964079022408,
-0.0008243576739914715,
0.05368666350841522,
0.04591246321797371,
-0.056522686034440994,
0.09702745079994202,
-0.1128423660993576,
0.10053697228431702,
-0.017812704667448997,
-0.06196648254990578,
0.04817495122551918,
-0.05414389446377754,
-0.01788572408258915,
0.014967675320804119,
-0.04126785323023796,
-0.0032557735685259104,
0.08235392719507217,
-0.045897070318460464,
0.0667387992143631,
0.15774965286254883,
0.0323699526488781,
-0.10423889756202698,
0.06338442862033844,
0.06591957062482834,
0.06687228381633759,
0.13368986546993256,
-0.020899660885334015,
0.05787298083305359,
-0.07728254050016403,
-0.022879377007484436,
0.000910650531295687,
0.0029401127249002457,
0.038256268948316574,
-0.03360673040151596,
-0.006547825876623392,
-0.04047456011176109,
0.09795345366001129,
0.10038680583238602,
0.020147953182458878,
0.04090351611375809,
-0.004431759472936392,
-0.04926074296236038,
-0.0037284644786268473,
0.01956021599471569,
-0.029889917001128197,
0.038033634424209595,
-0.037631358951330185,
0.019585508853197098,
-0.04244266822934151,
-0.17134211957454681,
0.07876183837652206,
-0.0035014816094189882,
0.07259871810674667,
0.056262291967868805,
0.037737488746643066,
0.06590979546308517,
-0.07118221372365952,
0.055131323635578156,
0.07090222090482712,
0.032711535692214966,
-0.062288518995046616,
0.09626927226781845,
0.10545645654201508,
-0.07056160271167755,
0.10312701761722565,
0.03159099817276001,
-0.05202993378043175,
-0.06484436988830566,
-0.3820571303367615,
-0.05406028404831886,
-0.039747387170791626,
-0.010707397013902664,
-0.11194007843732834,
0.09502334147691727,
0.056773651391267776,
0.017324324697256088,
0.0004926193505525589,
0.07734976708889008,
-0.1259402185678482,
-0.0830332487821579,
0.08658090978860855,
-0.004622482229024172,
-0.030537353828549385,
0.006858825217932463,
0.06567871570587158,
0.017005156725645065,
0.10301986336708069,
-0.020876221358776093,
0.09648822993040085,
0.08872923254966736,
-0.04668806120753288,
-0.04805995151400566,
-0.09608914703130722,
-0.03428945317864418,
0.02421320602297783,
0.0147716598585248,
0.17312714457511902,
0.07648062705993652,
-0.08186709880828857,
-0.02246381901204586,
0.16431637108325958,
-0.031408317387104034,
-0.08312546461820602,
-0.1256353259086609,
0.15305864810943604,
-0.03878199681639671,
0.04353714734315872,
0.0035494279582053423,
-0.06434664130210876,
0.09161444008350372,
0.09360344707965851,
0.16354280710220337,
0.05524974316358566,
0.00927527341991663,
-0.15328006446361542,
-0.01818690076470375,
-0.08272240310907364,
0.060834672302007675,
-0.03922543674707413,
0.177123561501503,
-0.05446629971265793,
0.1961311399936676,
-0.07586461305618286,
-0.014630992896854877,
0.001561639946885407,
0.03745308890938759,
-0.029449600726366043,
0.019776709377765656,
-0.06628412008285522,
0.12841327488422394,
-0.03949931636452675,
-0.22720147669315338,
0.033409249037504196,
0.013738083653151989,
-0.07271411269903183,
-0.004188218154013157,
-0.02482609637081623,
0.060186125338077545,
0.08436093479394913,
0.020569035783410072,
-0.021005090326070786,
0.21856670081615448,
-0.017751669511198997,
-0.016468068584799767,
-0.11678091436624527,
0.1058957502245903,
-0.007910321466624737,
0.12039998173713684,
0.03900282457470894,
0.051258113235235214,
0.07376465946435928,
-0.0062314411625266075,
-0.1310809999704361,
0.040175680071115494,
-0.020631486549973488,
-0.07695159316062927,
-0.030569715425372124,
0.1814209669828415,
-0.007241870276629925,
0.12941184639930725,
-0.011614727787673473,
-0.14503735303878784,
0.03545693680644035,
-0.012485931627452374,
-0.07516490668058395,
-0.0017480190144851804,
0.12737122178077698,
-0.06595703959465027,
0.1576453298330307,
0.1262418031692505,
-0.008265810087323189,
-0.010336732491850853,
-0.07599365711212158,
0.0847846269607544,
-0.003496026387438178,
-0.07194135338068008,
0.053837165236473083,
-0.12707264721393585,
0.005990501958876848,
0.13260406255722046,
0.031264662742614746,
-0.17197199165821075,
-0.05672461539506912,
-0.023252343758940697,
0.01889781467616558,
0.061587557196617126,
0.01116239558905363,
0.011377628892660141,
0.03408174216747284,
-0.005578321870416403,
0.01951097883284092,
0.030093234032392502,
0.08675657957792282,
-0.04388910531997681,
-0.0726616233587265
] |
null | null | transformers |
## Swedish BERT models for sentiment analysis
[Recorded Future](https://www.recordedfuture.com/) together with [AI Sweden](https://www.ai.se/en) releases two language models for sentiment analysis in Swedish. The two models are based on the [KB/bert-base-swedish-cased](https://huggingface.co/KB/bert-base-swedish-cased) model and have been fine-tuned to solve a multi-label sentiment analysis task.
The models have been fine-tuned for the sentiments fear and violence. The models output three floats corresponding to the labels "Negative", "Weak sentiment", and "Strong Sentiment" at the respective indexes.
The models have been trained on Swedish data with a conversational focus, collected from various internet sources and forums.
The models are trained only on Swedish data and support inference of Swedish input texts only. The models' inference metrics are not defined for non-Swedish inputs; such inputs are considered out-of-domain data.
The current models are supported at Transformers version >= 4.3.3 and Torch version 1.8.0; compatibility with older versions is not verified.
### Swedish-Sentiment-Fear
The model can be imported from the transformers library by running

```
from transformers import BertForSequenceClassification, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("RecordedFuture/Swedish-Sentiment-Fear")
classifier_fear = BertForSequenceClassification.from_pretrained("RecordedFuture/Swedish-Sentiment-Fear")
```

When the model and tokenizer are initialized, the model can be used for inference.
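A minimal inference sketch; the input sentence is illustrative, and the label order follows the index description above.

```
import torch

text = "Jag är rädd för att gå hem ensam."  # illustrative Swedish input
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = classifier_fear(**inputs).logits  # three floats per input

# Softmax gives scores for the labels at indexes 0, 1, and 2.
scores = torch.softmax(logits, dim=-1)[0]
for label, score in zip(["Negative", "Weak sentiment", "Strong Sentiment"], scores):
    print(f"{label}: {score.item():.4f}")
```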
#### Sentiment definitions
#### The strong sentiment includes but is not limited to
Texts that:
- Hold an expressive emphasis on fear and/or anxiety
#### The weak sentiment includes but is not limited to
Texts that:
- Express fear and/or anxiety in a neutral way
#### Verification metrics
During training, the model had maximized validation metrics at the following classification breakpoint.
| Classification Breakpoint | F-score | Precision | Recall |
|:-------------------------:|:-------:|:---------:|:------:|
| 0.45 | 0.8754 | 0.8618 | 0.8895 |
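The breakpoint can be read as a decision threshold on the model's output. A sketch continuing the example above, under the assumption (not stated on the card) that the breakpoint is compared against the combined non-negative score:

```
# Hypothetical use of the classification breakpoint as a decision threshold.
BREAKPOINT = 0.45

positive_score = (scores[1] + scores[2]).item()  # "Weak" + "Strong" sentiment
print("fear sentiment" if positive_score >= BREAKPOINT else "no fear sentiment")
```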
### Swedish-Sentiment-Violence
The model can be imported from the transformers library by running

```
from transformers import BertForSequenceClassification, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("RecordedFuture/Swedish-Sentiment-Violence")
classifier_violence = BertForSequenceClassification.from_pretrained("RecordedFuture/Swedish-Sentiment-Violence")
```

When the model and tokenizer are initialized, the model can be used for inference.
#### Sentiment definitions
#### The strong sentiment includes but is not limited to
Texts that:
- Reference highly violent acts
- Hold an aggressive tone
#### The weak sentiment includes but is not limited to
Texts that:
- Include general violent statements that do not fall under the strong sentiment
#### Verification metrics
During training, the model had maximized validation metrics at the following classification breakpoint.
| Classification Breakpoint | F-score | Precision | Recall |
|:-------------------------:|:-------:|:---------:|:------:|
| 0.35 | 0.7677 | 0.7456 | 0.791 |
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"text-classification",
"sv",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"sv"
] | TAGS
#transformers #pytorch #tf #jax #bert #text-classification #sv #license-mit #autotrain_compatible #endpoints_compatible #region-us
| Swedish BERT models for sentiment analysis
------------------------------------------
Recorded Future together with AI Sweden releases two language models for sentiment analysis in Swedish. The two models are based on the KB/bert-base-swedish-cased model and has been fine-tuned to solve a multi-label sentiment analysis task.
The models have been fine-tuned for the sentiments fear and violence. The models output three floats corresponding to the labels "Negative", "Weak sentiment", and "Strong Sentiment" at the respective indexes.
The models have been trained on Swedish data with a conversational focus, collected from various internet sources and forums.
The models are only trained on Swedish data and only supports inference of Swedish input texts. The models inference metrics for all non-Swedish inputs are not defined, these inputs are considered as out of domain data.
The current models are supported at Transformers version >= 4.3.3 and Torch version 1.8.0, compatibility with older versions are not verified.
### Swedish-Sentiment-Fear
The model can be imported from the transformers library by running
```
from transformers import BertForSequenceClassification, BertTokenizerFast
tokenizer = BertTokenizerFast.from_pretrained("RecordedFuture/Swedish-Sentiment-Fear")
classifier_fear= BertForSequenceClassification.from_pretrained("RecordedFuture/Swedish-Sentiment-Fear")
```
When the model and tokenizer are initialized the model can be used for inference.
#### Sentiment definitions
#### The strong sentiment includes but are not limited to
Texts that:
* Hold an expressive emphasis on fear and/ or anxiety
#### The weak sentiment includes but are not limited to
Texts that:
* Express fear and/ or anxiety in a neutral way
#### Verification metrics
During training, the model had maximized validation metrics at the following classification breakpoint.
#### Swedish-Sentiment-Violence
The model be can imported from the transformers library by running
```
from transformers import BertForSequenceClassification, BertTokenizerFast
tokenizer = BertTokenizerFast.from_pretrained("RecordedFuture/Swedish-Sentiment-Violence")
classifier_violence = BertForSequenceClassification.from_pretrained("RecordedFuture/Swedish-Sentiment-Violence")
```
When the model and tokenizer are initialized the model can be used for inference.
### Sentiment definitions
#### The strong sentiment includes but are not limited to
Texts that:
* Referencing highly violent acts
* Hold an aggressive tone
#### The weak sentiment includes but are not limited to
Texts that:
* Include general violent statements that do not fall under the strong sentiment
#### Verification metrics
During training, the model had maximized validation metrics at the following classification breakpoint.
| [
"### Swedish-Sentiment-Fear\n\n\nThe model can be imported from the transformers library by running\n\n\n\n```\nfrom transformers import BertForSequenceClassification, BertTokenizerFast\n\ntokenizer = BertTokenizerFast.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Fear\")\nclassifier_fear= BertForSequenceClassification.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Fear\") \n\n```\n\nWhen the model and tokenizer are initialized the model can be used for inference.",
"#### Sentiment definitions",
"#### The strong sentiment includes but are not limited to\n\n\nTexts that:\n\n\n* Hold an expressive emphasis on fear and/ or anxiety",
"#### The weak sentiment includes but are not limited to\n\n\nTexts that:\n\n\n* Express fear and/ or anxiety in a neutral way",
"#### Verification metrics\n\n\nDuring training, the model had maximized validation metrics at the following classification breakpoint.",
"#### Swedish-Sentiment-Violence\n\n\nThe model be can imported from the transformers library by running\n\n\n\n```\nfrom transformers import BertForSequenceClassification, BertTokenizerFast\n\ntokenizer = BertTokenizerFast.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Violence\")\nclassifier_violence = BertForSequenceClassification.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Violence\") \n\n```\n\nWhen the model and tokenizer are initialized the model can be used for inference.",
"### Sentiment definitions",
"#### The strong sentiment includes but are not limited to\n\n\nTexts that:\n\n\n* Referencing highly violent acts\n* Hold an aggressive tone",
"#### The weak sentiment includes but are not limited to\n\n\nTexts that:\n\n\n* Include general violent statements that do not fall under the strong sentiment",
"#### Verification metrics\n\n\nDuring training, the model had maximized validation metrics at the following classification breakpoint."
] | [
"TAGS\n#transformers #pytorch #tf #jax #bert #text-classification #sv #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"### Swedish-Sentiment-Fear\n\n\nThe model can be imported from the transformers library by running\n\n\n\n```\nfrom transformers import BertForSequenceClassification, BertTokenizerFast\n\ntokenizer = BertTokenizerFast.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Fear\")\nclassifier_fear= BertForSequenceClassification.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Fear\") \n\n```\n\nWhen the model and tokenizer are initialized the model can be used for inference.",
"#### Sentiment definitions",
"#### The strong sentiment includes but are not limited to\n\n\nTexts that:\n\n\n* Hold an expressive emphasis on fear and/ or anxiety",
"#### The weak sentiment includes but are not limited to\n\n\nTexts that:\n\n\n* Express fear and/ or anxiety in a neutral way",
"#### Verification metrics\n\n\nDuring training, the model had maximized validation metrics at the following classification breakpoint.",
"#### Swedish-Sentiment-Violence\n\n\nThe model be can imported from the transformers library by running\n\n\n\n```\nfrom transformers import BertForSequenceClassification, BertTokenizerFast\n\ntokenizer = BertTokenizerFast.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Violence\")\nclassifier_violence = BertForSequenceClassification.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Violence\") \n\n```\n\nWhen the model and tokenizer are initialized the model can be used for inference.",
"### Sentiment definitions",
"#### The strong sentiment includes but are not limited to\n\n\nTexts that:\n\n\n* Referencing highly violent acts\n* Hold an aggressive tone",
"#### The weak sentiment includes but are not limited to\n\n\nTexts that:\n\n\n* Include general violent statements that do not fall under the strong sentiment",
"#### Verification metrics\n\n\nDuring training, the model had maximized validation metrics at the following classification breakpoint."
] | [
49,
137,
6,
27,
27,
26,
141,
6,
29,
32,
26
] | [
"passage: TAGS\n#transformers #pytorch #tf #jax #bert #text-classification #sv #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Swedish-Sentiment-Fear\n\n\nThe model can be imported from the transformers library by running\n\n\n\n```\nfrom transformers import BertForSequenceClassification, BertTokenizerFast\n\ntokenizer = BertTokenizerFast.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Fear\")\nclassifier_fear= BertForSequenceClassification.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Fear\") \n\n```\n\nWhen the model and tokenizer are initialized the model can be used for inference.#### Sentiment definitions#### The strong sentiment includes but are not limited to\n\n\nTexts that:\n\n\n* Hold an expressive emphasis on fear and/ or anxiety#### The weak sentiment includes but are not limited to\n\n\nTexts that:\n\n\n* Express fear and/ or anxiety in a neutral way#### Verification metrics\n\n\nDuring training, the model had maximized validation metrics at the following classification breakpoint.#### Swedish-Sentiment-Violence\n\n\nThe model be can imported from the transformers library by running\n\n\n\n```\nfrom transformers import BertForSequenceClassification, BertTokenizerFast\n\ntokenizer = BertTokenizerFast.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Violence\")\nclassifier_violence = BertForSequenceClassification.from_pretrained(\"RecordedFuture/Swedish-Sentiment-Violence\") \n\n```\n\nWhen the model and tokenizer are initialized the model can be used for inference.### Sentiment definitions#### The strong sentiment includes but are not limited to\n\n\nTexts that:\n\n\n* Referencing highly violent acts\n* Hold an aggressive tone#### The weak sentiment includes but are not limited to\n\n\nTexts that:\n\n\n* Include general violent statements that do not fall under the strong sentiment#### Verification metrics\n\n\nDuring training, the model had maximized validation metrics at the following classification breakpoint."
] | [
0.003579544834792614,
0.12002968043088913,
-0.006823304109275341,
0.0754019096493721,
0.07986628264188766,
-0.02166338823735714,
0.05516795441508293,
0.09400958567857742,
0.015807343646883965,
0.10368963330984116,
-0.03039165958762169,
0.012332786805927753,
0.05437842383980751,
-0.11151539534330368,
-0.005911299027502537,
-0.21178044378757477,
0.03445553034543991,
-0.09815899282693863,
0.003398197004571557,
0.0546453632414341,
0.12064849585294724,
-0.05321710556745529,
0.06036005914211273,
-0.01380356028676033,
0.02246241457760334,
0.014222514815628529,
-0.023651333525776863,
0.03111841529607773,
0.05109642818570137,
0.033321794122457504,
0.06607842445373535,
-0.04587509110569954,
0.001015484449453652,
-0.22517572343349457,
-0.004892013035714626,
0.05039294436573982,
-0.001572724780999124,
0.009220224805176258,
0.10820479691028595,
-0.0622217170894146,
0.05740639939904213,
-0.2206767201423645,
0.060113873332738876,
0.05781051516532898,
-0.10403341054916382,
-0.1829993575811386,
-0.0889483243227005,
0.06562481075525284,
0.06318233907222748,
0.04546467214822769,
-0.0695858895778656,
0.10551885515451431,
-0.043026845902204514,
0.05329270660877228,
0.16278262436389923,
-0.1711733043193817,
-0.038306932896375656,
-0.06949246674776077,
0.006111201364547014,
0.08266136050224304,
-0.13536697626113892,
0.028097696602344513,
0.03454983979463577,
0.0069006686098873615,
0.06909332424402237,
-0.03738173842430115,
-0.05112524703145027,
-0.02287925034761429,
-0.1338661164045334,
-0.025335241109132767,
0.06109827384352684,
0.017353836447000504,
-0.026145756244659424,
-0.1143830344080925,
0.02387872524559498,
0.011195339262485504,
0.005002122838050127,
-0.07543176412582397,
0.06169794127345085,
0.01692412793636322,
0.11133978515863419,
-0.0705706849694252,
-0.11797616630792618,
0.07552918046712875,
0.0015560517786070704,
0.12275408208370209,
-0.013173343613743782,
-0.03363914415240288,
0.05154842510819435,
-0.033273257315158844,
-0.14434361457824707,
-0.11850935220718384,
-0.010116703808307648,
-0.03374972194433212,
-0.07455354183912277,
-0.041773319244384766,
-0.08591850847005844,
-0.15954966843128204,
0.030019570142030716,
0.05582444369792938,
-0.17158962786197662,
-0.003285098820924759,
-0.04460516571998596,
-0.005744329188019037,
0.08899872750043869,
0.10937904566526413,
-0.0642353966832161,
-0.07798267900943756,
-0.024565638974308968,
-0.024533644318580627,
0.0378190316259861,
0.011851267889142036,
-0.019572844728827477,
-0.01017001736909151,
0.10784450173377991,
0.04912963882088661,
-0.03534227982163429,
0.05539277195930481,
-0.10554444789886475,
-0.037866998463869095,
-0.09401190280914307,
-0.12446475774049759,
0.02363048680126667,
0.060258299112319946,
-0.018680214881896973,
0.01937897317111492,
-0.013750576414167881,
0.01532113179564476,
-0.045929450541734695,
0.026747846975922585,
-0.019141701981425285,
-0.0021294658072292805,
-0.0827716588973999,
-0.14410400390625,
0.046056196093559265,
-0.04909661412239075,
-0.054987527430057526,
-0.05363461375236511,
-0.09981343150138855,
-0.05571611598134041,
0.03695749118924141,
-0.06961524486541748,
0.021806592121720314,
-0.06243683397769928,
-0.07522425800561905,
0.02465745247900486,
0.040605202317237854,
-0.03585388883948326,
0.005881734658032656,
-0.0024200037587434053,
-0.04956759884953499,
0.05089515075087547,
0.07016096264123917,
-0.006222473457455635,
-0.14499621093273163,
0.010006938129663467,
-0.19180697202682495,
0.1611601859331131,
-0.15394985675811768,
0.050290144979953766,
-0.1264573335647583,
-0.036593981087207794,
0.020960671827197075,
0.02625446580350399,
0.008967775851488113,
0.15218816697597504,
-0.1903502196073532,
-0.05237370356917381,
0.10329161584377289,
-0.16020530462265015,
-0.04732321575284004,
0.1696529984474182,
-0.05722622573375702,
0.09804804623126984,
0.10775308310985565,
0.1983797401189804,
0.011185436509549618,
-0.07054292410612106,
-0.058452919125556946,
-0.003924774471670389,
-0.06706912070512772,
0.2329385131597519,
0.03886883333325386,
-0.08744823932647705,
0.06841137260198593,
0.0020402565132826567,
-0.018620405346155167,
0.003086124313995242,
0.028542032465338707,
-0.06966795027256012,
0.005668981000781059,
-0.022219283506274223,
0.0805378183722496,
0.0025305335875600576,
-0.03465196117758751,
-0.029434867203235626,
-0.1008414775133133,
0.1491796374320984,
0.017073385417461395,
-0.028801117092370987,
0.028413420543074608,
-0.06615591794252396,
0.00037007973878644407,
-0.023877400904893875,
-0.016633568331599236,
-0.12590931355953217,
-0.07206335663795471,
-0.008879268541932106,
-0.13208463788032532,
0.08391394466161728,
0.10940005630254745,
0.0793965682387352,
-0.014635623432695866,
-0.0604267455637455,
0.02295532636344433,
0.008865959011018276,
0.04624330997467041,
-0.05437023565173149,
-0.22173458337783813,
0.043789640069007874,
-0.049660973250865936,
0.10514418035745621,
-0.10238134860992432,
0.006326351314783096,
0.20700259506702423,
0.10496754944324493,
0.05438490957021713,
-0.02264847233891487,
0.06467258185148239,
0.019487762823700905,
0.03244026377797127,
-0.04577802121639252,
0.041169147938489914,
-0.02250506915152073,
-0.07543765008449554,
0.09322052448987961,
-0.13793794810771942,
-0.0949832946062088,
0.06704594194889069,
0.024440547451376915,
-0.1118011325597763,
-0.07359348982572556,
-0.01749231293797493,
-0.012306657619774342,
0.023138530552387238,
-0.06116446480154991,
0.09113939851522446,
0.05858786776661873,
0.05070142075419426,
-0.06516887992620468,
-0.027869559824466705,
-0.005222893785685301,
-0.11041343957185745,
-0.055153988301754,
0.11925428360700607,
-0.11221297830343246,
-0.2647801339626312,
0.10845067352056503,
0.084854356944561,
-0.05184822529554367,
0.19314256310462952,
0.016808930784463882,
-0.05650182068347931,
-0.08707308024168015,
0.022555731236934662,
0.06641390919685364,
0.06453827023506165,
-0.07789343595504761,
0.02058856561779976,
0.029859155416488647,
-0.024957293644547462,
-0.005565729457885027,
-0.0319758839905262,
0.04293437302112579,
0.055290598422288895,
0.01829317770898342,
0.11664412915706635,
0.007116029970347881,
0.03638870269060135,
0.04870283976197243,
0.012099351733922958,
0.044944554567337036,
0.009340795688331127,
-0.053747374564409256,
-0.12087251245975494,
0.13353778421878815,
-0.1469469666481018,
-0.22097398340702057,
-0.06383245438337326,
0.02740107849240303,
-0.06726723164319992,
0.009487505070865154,
0.04376408830285072,
-0.1492796689271927,
-0.07719853520393372,
-0.07347973436117172,
0.08618965744972229,
0.06386502087116241,
-0.09851622581481934,
-0.06553006917238235,
0.03781796619296074,
0.062377236783504486,
-0.07833235710859299,
0.013951288536190987,
-0.010066404938697815,
-0.04868520423769951,
-0.009118152782320976,
0.03916822373867035,
0.03248174861073494,
0.1017971783876419,
0.018134260550141335,
-0.04260428994894028,
-0.030878189951181412,
0.18215718865394592,
-0.062376827001571655,
0.03514249622821808,
0.02404114603996277,
-0.08806119114160538,
0.09520301222801208,
0.14947237074375153,
0.025063171982765198,
-0.035777732729911804,
0.004230157937854528,
0.11664365231990814,
0.020515060052275658,
-0.19317567348480225,
-0.08734719455242157,
-0.022085582837462425,
-0.02858036383986473,
0.020678507164120674,
0.0021345457062125206,
0.05248800665140152,
0.06632480025291443,
-0.02864672802388668,
-0.10955878347158432,
0.020922360941767693,
0.118246890604496,
0.1660619080066681,
-0.01559637300670147,
0.006761745549738407,
-0.021654654294252396,
-0.010524723678827286,
0.08337539434432983,
-0.047928161919116974,
0.06412722170352936,
-0.01594465598464012,
0.17633497714996338,
0.04716718941926956,
0.06476910412311554,
-0.0004948742571286857,
-0.013612261973321438,
-0.03275202214717865,
0.011662177741527557,
-0.03488341346383095,
-0.1038520559668541,
-0.0798109844326973,
0.09912761300802231,
0.01834878884255886,
0.052770473062992096,
-0.024969538673758507,
0.00793472584336996,
0.1771516650915146,
0.220070943236351,
0.0045166886411607265,
-0.06913778185844421,
-0.07731466740369797,
0.05103645846247673,
-0.013910187408328056,
-0.03344077244400978,
-0.055313631892204285,
0.06651468575000763,
-0.08468339592218399,
0.03548211231827736,
-0.06477966159582138,
0.03746895492076874,
-0.04660004749894142,
0.06746387481689453,
-0.035220589488744736,
0.13771522045135498,
0.005135141313076019,
0.056363556534051895,
-0.1014121025800705,
0.12766486406326294,
0.03833458945155144,
0.08364848792552948,
-0.055353619158267975,
0.009995623491704464,
0.04887222498655319,
-0.031097985804080963,
0.19779162108898163,
0.03899058327078819,
0.04068595543503761,
-0.012961804866790771,
-0.013205609284341335,
-0.021570589393377304,
0.1217234805226326,
-0.06443963199853897,
0.07549524307250977,
-0.02278311923146248,
-0.004002271685749292,
-0.021099135279655457,
0.050446245819330215,
-0.10020462423563004,
-0.1382584571838379,
0.0828075110912323,
-0.10359025746583939,
0.027459651231765747,
-0.021268822252750397,
-0.04259154945611954,
-0.18045219779014587,
0.26688089966773987,
-0.16850876808166504,
-0.11931832134723663,
-0.08566023409366608,
-0.03153497353196144,
0.10301583260297775,
-0.05207788944244385,
-0.0017899019876495004,
0.010676607489585876,
0.1378968060016632,
-0.05993299186229706,
0.04803404584527016,
0.05870485678315163,
-0.03390815481543541,
-0.13956928253173828,
-0.03140140697360039,
0.061725422739982605,
0.05855611711740494,
0.021054638549685478,
0.041494809091091156,
0.05506148189306259,
0.03151581436395645,
-0.08931384980678558,
-0.008727090433239937,
0.0867963507771492,
-0.025946887210011482,
0.04312947019934654,
-0.05857110768556595,
-0.11131981015205383,
-0.13919751346111298,
-0.02668338268995285,
0.124398373067379,
0.22609136998653412,
-0.07120246440172195,
0.13653279840946198,
0.2003237009048462,
-0.13103777170181274,
-0.1868254542350769,
... (768-dimensional embedding vector, values elided) ] |
null | null | transformers |
## Swedish BERT models for sentiment analysis: sentiment targets
[Recorded Future](https://www.recordedfuture.com/) together with [AI Sweden](https://www.ai.se/en) releases two language models for target/role assignment in Swedish. The two models are based on [KB/bert-base-swedish-cased](https://huggingface.co/KB/bert-base-swedish-cased) and have been fine-tuned to solve a Named Entity Recognition (NER) token classification task.
These are downstream models to be used in conjunction with the [Swedish violence sentiment classifier](https://huggingface.co/RecordedFuture/Swedish-Sentiment-Violence) or the [Swedish fear sentiment classifier](https://huggingface.co/RecordedFuture/Swedish-Sentiment-Fear). The models are trained to tag the parts of sentences that have received a positive classification from the upstream sentiment classifier, i.e. the spans containing the targets the upstream model activated on.
The NER sentiment target models do work as standalone models, but their recommended application is downstream from a sentence classification model.
The models are trained only on Swedish data and only support inference on Swedish input texts. The models' inference metrics for non-Swedish inputs are not defined; such inputs are considered out-of-domain data.
The current models are supported at Transformers version >= 4.3.3 and Torch version 1.8.0; compatibility with older versions is not verified.
### Fear targets
The model can be imported from the transformers library by running
```
from transformers import BertForTokenClassification, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("RecordedFuture/Swedish-Sentiment-Fear-Targets")
classifier_fear_targets = BertForTokenClassification.from_pretrained("RecordedFuture/Swedish-Sentiment-Fear-Targets")
```
When the model and tokenizer are initialized the model can be used for inference.
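For a concrete picture, a minimal inference sketch could look like the following; the Swedish example sentence is invented, and the label names are read from the model's own config rather than assumed:

```
import torch

# Hypothetical Swedish input; in practice this would be a sentence that the
# upstream sentiment classifier has already flagged.
text = "Jag är rädd för honom."

inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = classifier_fear_targets(**inputs).logits

# One predicted label id per token; the id-to-label mapping comes from the
# model config instead of being hard-coded here.
predicted_ids = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, label_id in zip(tokens, predicted_ids):
    print(token, classifier_fear_targets.config.id2label[int(label_id)])
```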
#### Verification metrics
During training the Fear target model had the following verification metrics when using "any overlap" as the evaluation metric.
| F-score | Precision | Recall |
|:-------:|:---------:|:------:|
| 0.8361  | 0.7903    | 0.8876 |
### Violence targets
The model can be imported from the transformers library by running

```
from transformers import BertForTokenClassification, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("RecordedFuture/Swedish-Sentiment-Violence-Targets")
classifier_violence_targets = BertForTokenClassification.from_pretrained("RecordedFuture/Swedish-Sentiment-Violence-Targets")
```
When the model and tokenizer are initialized the model can be used for inference.
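Because the recommended use is downstream of a sentence classifier, a sketch of the two-stage pipeline may help; the gating threshold and the positive-label indexes below are illustrative assumptions, not values documented for these models:

```
import torch
from transformers import BertForSequenceClassification

# Assumed upstream classifier (both models share the same Swedish BERT base,
# so reusing the targets tokenizer for it is an assumption).
upstream = BertForSequenceClassification.from_pretrained(
    "RecordedFuture/Swedish-Sentiment-Violence"
)
THRESHOLD = 0.5           # hypothetical gate, not an official breakpoint
POSITIVE_LABELS = (1, 2)  # assumed "weak"/"strong" sentiment indexes

def tag_violence_targets(sentence: str):
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        probs = torch.softmax(upstream(**inputs).logits, dim=-1)[0]
    # Only run the target tagger when the upstream classifier fires.
    if sum(float(probs[i]) for i in POSITIVE_LABELS) < THRESHOLD:
        return []
    with torch.no_grad():
        token_logits = classifier_violence_targets(**inputs).logits
    ids = token_logits.argmax(dim=-1)[0]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    id2label = classifier_violence_targets.config.id2label
    return [(tok, id2label[int(i)]) for tok, i in zip(tokens, ids)]
```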
#### Verification metrics
During training the Violence target model had the following verification metrics when using "any overlap" as the evaluation metric.
| F-score | Precision | Recall |
|:-------:|:---------:|:------:|
| 0.7831  | 0.9155    | 0.8442 | | {"language": "sv", "license": "mit"} | token-classification | RecordedFuture/Swedish-Sentiment-Violence-Targets | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"token-classification",
"sv",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"sv"
] | TAGS
#transformers #pytorch #tf #jax #bert #token-classification #sv #license-mit #autotrain_compatible #endpoints_compatible #region-us
| [
(768-dimensional embedding vector, values elided) ] |
null | null | transformers |
## Swedish BERT models for sentiment analysis
[Recorded Future](https://www.recordedfuture.com/) together with [AI Sweden](https://www.ai.se/en) releases two language models for sentiment analysis in Swedish. The two models are based on the [KB/bert-base-swedish-cased](https://huggingface.co/KB/bert-base-swedish-cased) model and have been fine-tuned to solve a multi-label sentiment analysis task.
The models have been fine-tuned for the sentiments fear and violence. Each model outputs three floats corresponding to the labels "Negative", "Weak sentiment", and "Strong sentiment" at the respective indexes.
The models have been trained on Swedish data with a conversational focus, collected from various internet sources and forums.
The models are trained only on Swedish data and only support inference on Swedish input texts. The models' inference metrics for non-Swedish inputs are not defined; such inputs are considered out-of-domain data.
The current models are supported at Transformers version >= 4.3.3 and Torch version 1.8.0; compatibility with older versions is not verified.
### Swedish-Sentiment-Fear
The model can be imported from the transformers library by running
```
from transformers import BertForSequenceClassification, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("RecordedFuture/Swedish-Sentiment-Fear")
classifier_fear = BertForSequenceClassification.from_pretrained("RecordedFuture/Swedish-Sentiment-Fear")
```
When the model and tokenizer are initialized the model can be used for inference.
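As a minimal sketch of reading the three output floats (the example sentence is invented, and applying a softmax to interpret the logits as per-label probabilities is an assumption about how the outputs are consumed):

```
import torch

labels = ["Negative", "Weak sentiment", "Strong sentiment"]  # per the card
text = "Jag är orolig för framtiden."  # hypothetical Swedish input

inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = classifier_fear(**inputs).logits[0]

# A softmax turns the three raw floats into one probability per label.
probs = torch.softmax(logits, dim=-1)
for label, p in zip(labels, probs):
    print(f"{label}: {p.item():.3f}")
```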
#### Sentiment definitions
#### The strong sentiment includes, but is not limited to
Texts that:
- Hold an expressive emphasis on fear and/or anxiety
#### The weak sentiment includes, but is not limited to
Texts that:
- Express fear and/or anxiety in a neutral way
#### Verification metrics
During training, the model had maximized validation metrics at the following classification breakpoint.
| Classification Breakpoint | F-score | Precision | Recall |
|:-------------------------:|:-------:|:---------:|:------:|
| 0.45 | 0.8754 | 0.8618 | 0.8895 |
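One plausible way to apply such a breakpoint in code, under the assumption (not stated on the card) that it thresholds the combined probability of the two positive labels:

```
import torch

BREAKPOINT = 0.45  # the fear model's reported breakpoint

def is_fearful(text: str) -> bool:
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        probs = torch.softmax(classifier_fear(**inputs).logits, dim=-1)[0]
    # Assumption: indexes 1 and 2 hold the "weak" and "strong" sentiment
    # probabilities, and the breakpoint gates their sum.
    return (probs[1] + probs[2]).item() >= BREAKPOINT
```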
### Swedish-Sentiment-Violence
The model can be imported from the transformers library by running

```
from transformers import BertForSequenceClassification, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("RecordedFuture/Swedish-Sentiment-Violence")
classifier_violence = BertForSequenceClassification.from_pretrained("RecordedFuture/Swedish-Sentiment-Violence")
```
When the model and tokenizer are initialized the model can be used for inference.
#### Sentiment definitions
#### The strong sentiment includes, but is not limited to
Texts that:
- Reference highly violent acts
- Hold an aggressive tone
#### The weak sentiment includes, but is not limited to
Texts that:
- Include general violent statements that do not fall under the strong sentiment
#### Verification metrics
During training, the model had maximized validation metrics at the following classification breakpoint.
| Classification Breakpoint | F-score | Precision | Recall |
|:-------------------------:|:-------:|:---------:|:------:|
| 0.35 | 0.7677 | 0.7456 | 0.791 | | {"language": "sv", "license": "mit"} | text-classification | RecordedFuture/Swedish-Sentiment-Violence | [
"transformers",
"pytorch",
"tf",
"jax",
"bert",
"text-classification",
"sv",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"sv"
] | TAGS
#transformers #pytorch #tf #jax #bert #text-classification #sv #license-mit #autotrain_compatible #endpoints_compatible #region-us
| [
(768-dimensional embedding vector, values elided) ...
-0.1362277716398239,
-0.08570694178342819,
0.12040212750434875,
0.00053560477681458,
-0.008572323247790337,
0.01722438633441925,
0.023229314014315605,
0.03076924756169319,
0.09238889068365097,
0.010932746343314648,
0.09925646334886551,
0.07832404226064682,
-0.0779142826795578,
-0.048904597759246826,
-0.11763583868741989,
0.02736835740506649,
0.013674763962626457,
-0.02095148339867592,
0.19354186952114105,
0.03877713903784752,
-0.05131639540195465,
-0.03586212173104286,
0.15608644485473633,
-0.06293099373579025,
-0.12157345563173294,
-0.1207229420542717,
0.23197804391384125,
-0.03669228404760361,
0.09282075613737106,
-0.02465939149260521,
-0.10863830894231796,
0.09956704080104828,
0.08963561058044434,
0.07846649736166,
-0.060569263994693756,
0.03944259509444237,
-0.04910940304398537,
0.010562952607870102,
-0.05772321671247482,
-0.04056142270565033,
-0.019431494176387787,
0.1321268081665039,
-0.120963454246521,
0.18962445855140686,
-0.020542025566101074,
-0.03723292425274849,
-0.07558226585388184,
0.05630948767066002,
-0.03187999874353409,
0.0467338003218174,
-0.05735380947589874,
0.10804229974746704,
-0.06590738147497177,
-0.22058618068695068,
0.051720865070819855,
-0.08094368129968643,
-0.11314086616039276,
-0.005170207004994154,
0.07381048053503036,
0.0889228880405426,
0.08052270114421844,
0.0677647590637207,
-0.03502865135669708,
0.06297580152750015,
0.005240375641733408,
-0.014113635756075382,
-0.012283362448215485,
0.1041211187839508,
-0.02867208607494831,
0.1513093262910843,
0.026567477732896805,
0.04312334582209587,
0.11884915083646774,
-0.050456687808036804,
-0.10266437381505966,
0.017498472705483437,
0.028958376497030258,
-0.12837587296962738,
0.008610640652477741,
0.19835180044174194,
0.0068070595152676105,
0.13273955881595612,
0.05964033305644989,
-0.10477068275213242,
0.053493984043598175,
0.12604790925979614,
-0.06248566880822182,
-0.021123021841049194,
0.09404585510492325,
-0.09613364934921265,
0.1104588657617569,
0.19986505806446075,
0.007801195606589317,
-0.0101876650005579,
-0.08563048392534256,
0.02184704877436161,
0.016971511766314507,
0.002913209842517972,
-0.020088372752070427,
-0.06824415177106857,
0.027839791029691696,
0.15948575735092163,
0.04954449087381363,
-0.13221468031406403,
-0.09639368206262589,
0.04255543649196625,
0.05882527679204941,
0.02233639732003212,
0.0440823920071125,
0.08842361718416214,
0.042528778314590454,
-0.002072684932500124,
-0.14146482944488525,
0.05435483530163765,
0.09941036999225616,
-0.021604981273412704,
0.0009133583516813815
] |
null | null | transformers | # Rick DialoGPT Model
> Following the https://github.com/RuolinZheng08/twewy-discord-chatbot tutorial. | {"tags": ["conversational"]} | text-generation | Redolid/DialoGPT-small-Rick | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| # Rick DialoGPT Model
> Following the URL tutorial. | [] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
51
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
-0.009697278961539268,
0.03208012506365776,
-0.007204889785498381,
0.004809224978089333,
0.16726240515708923,
0.014898733235895634,
0.09765533357858658,
0.13672804832458496,
-0.007841327227652073,
-0.031050153076648712,
0.14490588009357452,
0.20411323010921478,
-0.006439372431486845,
0.0661218985915184,
-0.07572533935308456,
-0.2683109939098358,
0.05759621039032936,
0.046649303287267685,
0.016515716910362244,
0.1200079694390297,
0.08573378622531891,
-0.05473608896136284,
0.08714032918214798,
-0.014583407901227474,
-0.150366872549057,
0.017733458429574966,
0.043394338339567184,
-0.12260226160287857,
0.11910516023635864,
0.05462685227394104,
0.07063519209623337,
0.014929565601050854,
-0.07541623711585999,
-0.1631229966878891,
0.03031250834465027,
0.01425902172923088,
-0.0594632662832737,
0.04757995903491974,
0.059961482882499695,
-0.10165371745824814,
0.10819483548402786,
0.09530027210712433,
-0.013078106567263603,
0.06798283755779266,
-0.16849711537361145,
-0.020869607105851173,
-0.01446688175201416,
0.009899779222905636,
0.05550243332982063,
0.09964893013238907,
-0.03413357585668564,
0.10497362166643143,
-0.09214533120393753,
0.11017382889986038,
0.10932035744190216,
-0.32057443261146545,
-0.005767723545432091,
0.09167823940515518,
0.039358653128147125,
0.07352814823389053,
-0.04467793554067612,
0.06258884817361832,
0.018015462905168533,
0.017986174672842026,
-0.014015024527907372,
-0.07283061742782593,
-0.11612214148044586,
0.04717336222529411,
-0.08668071031570435,
-0.059868961572647095,
0.2244078367948532,
-0.05464440956711769,
0.06881742179393768,
-0.05281897634267807,
-0.10522868484258652,
-0.04308144748210907,
-0.029833965003490448,
0.00475557055324316,
-0.07660607248544693,
0.08692064881324768,
0.00869679357856512,
-0.09547875821590424,
-0.1376667022705078,
-0.02496783249080181,
-0.1776352822780609,
0.16140350699424744,
0.02465328387916088,
0.05232657864689827,
-0.2027255892753601,
0.09623090922832489,
0.017906051129102707,
-0.08045592904090881,
0.022091427817940712,
-0.10046248883008957,
0.029131146147847176,
0.013760408386588097,
-0.04754498973488808,
-0.061387211084365845,
0.0843690037727356,
0.11199145019054413,
-0.01731434464454651,
0.025486016646027565,
-0.039331406354904175,
0.08100687712430954,
0.03553595021367073,
0.09077847748994827,
0.007288969587534666,
-0.028338588774204254,
0.025842782109975815,
-0.13719046115875244,
-0.003647835226729512,
-0.07116208970546722,
-0.16572439670562744,
-0.021088803187012672,
0.02994808368384838,
0.08289173990488052,
0.015449047088623047,
0.11682453751564026,
-0.03272046521306038,
-0.025152435526251793,
0.03602350503206253,
-0.047656361013650894,
-0.012649794109165668,
0.016648368909955025,
0.013163427822291851,
0.12399329990148544,
-0.0022096503525972366,
0.03235051408410072,
-0.13653022050857544,
0.031423524022102356,
-0.06793295592069626,
-0.003740974934771657,
-0.03486552834510803,
-0.040637075901031494,
0.009043924510478973,
-0.06862333416938782,
0.003486064961180091,
-0.15030112862586975,
-0.15063877403736115,
0.007587034720927477,
-0.007836631499230862,
-0.04107699543237686,
-0.06370922178030014,
-0.06952770054340363,
-0.013550350442528725,
0.04251532256603241,
-0.07093454152345657,
-0.011352915316820145,
-0.06403283774852753,
0.11004766076803207,
-0.03197755664587021,
0.07921615242958069,
-0.11953279376029968,
0.08390819281339645,
-0.11260783672332764,
-0.02386913076043129,
-0.060801517218351364,
0.09317506104707718,
-0.0006014376995153725,
0.09549830108880997,
-0.006563255097717047,
-0.017931854352355003,
-0.07981178909540176,
0.06445012241601944,
-0.042872510850429535,
0.21701598167419434,
-0.0615808479487896,
-0.11181682348251343,
0.28781595826148987,
-0.052628401666879654,
-0.1370542049407959,
0.11647392809391022,
0.008682746440172195,
0.05777018144726753,
0.10703510791063309,
0.19733482599258423,
-0.015276194550096989,
0.004040541127324104,
0.09471915662288666,
0.11263324320316315,
-0.11276852339506149,
-0.033160366117954254,
0.013019153848290443,
-0.04081077128648758,
-0.10867965966463089,
0.04689536616206169,
0.09810488671064377,
0.07090286910533905,
-0.04786505550146103,
-0.03377414867281914,
-0.01366397924721241,
0.0052589005790650845,
0.08885077387094498,
-0.007157256826758385,
0.10962837189435959,
-0.05819983780384064,
-0.03796621412038803,
-0.029282379895448685,
-0.012126247398555279,
-0.03951939567923546,
0.03137664496898651,
-0.043376367539167404,
0.10821941494941711,
-0.011204327456653118,
0.06364280730485916,
-0.16185984015464783,
-0.07691477984189987,
-0.017002692446112633,
0.1581239402294159,
0.024538565427064896,
0.09859629720449448,
0.0552486926317215,
-0.040398042649030685,
-0.0012767292791977525,
0.012792680412530899,
0.15581141412258148,
-0.022091681137681007,
-0.065607450902462,
-0.052166227251291275,
0.08642971515655518,
-0.05641226842999458,
0.04504093527793884,
-0.05937713757157326,
0.012367865070700645,
0.05064384639263153,
0.10342344641685486,
-0.00018274025933351368,
0.03323284164071083,
-0.008164864964783192,
0.002145637758076191,
-0.058205123990774155,
0.007405933458358049,
0.10799351334571838,
0.00036868182360194623,
-0.07365862280130386,
0.22074243426322937,
-0.17796069383621216,
0.1765957772731781,
0.1893044263124466,
-0.299345999956131,
0.017949223518371582,
-0.10759581625461578,
-0.04561871662735939,
0.014407722279429436,
0.05567655712366104,
-0.0454222597181797,
0.1703362911939621,
-0.009871348738670349,
0.18874616920948029,
-0.04946064203977585,
-0.04464937001466751,
-0.0200483538210392,
-0.05118836089968681,
-0.0024189651012420654,
0.07781197130680084,
0.10685696452856064,
-0.13992026448249817,
0.1964332014322281,
0.1621224284172058,
0.048237916082143784,
0.19945049285888672,
0.015346456319093704,
-0.011589210480451584,
0.0909530371427536,
0.005220826715230942,
-0.058739423751831055,
-0.07409929484128952,
-0.2594851851463318,
-0.030033592134714127,
0.07992640137672424,
0.0422382652759552,
0.1212305948138237,
-0.11349532753229141,
-0.038956157863140106,
-0.01763172075152397,
-0.023146281018853188,
0.021672505885362625,
0.0914369598031044,
0.06075398623943329,
0.13201528787612915,
-0.001710098935291171,
-0.007300339173525572,
0.10524573177099228,
0.01783694699406624,
-0.09354141354560852,
0.18308524787425995,
-0.13652534782886505,
-0.37097251415252686,
-0.13911493122577667,
-0.18057456612586975,
-0.05449081212282181,
0.05712554603815079,
0.11679314076900482,
-0.12011238187551498,
-0.018752124160528183,
0.01578843593597412,
0.10931742936372757,
-0.08449502289295197,
0.0021454424131661654,
-0.06880278885364532,
0.0321490578353405,
-0.10310184955596924,
-0.09194442629814148,
-0.055416494607925415,
-0.031392451375722885,
-0.08001253753900528,
0.1423761546611786,
-0.10777941346168518,
0.04476889222860336,
0.20262959599494934,
0.04653622955083847,
0.05625178664922714,
-0.044105201959609985,
0.19377262890338898,
-0.11264272034168243,
-0.01661740615963936,
0.19215328991413116,
-0.048360925167798996,
0.07476246356964111,
0.1232115849852562,
-0.006348740309476852,
-0.08765771239995956,
0.03011748194694519,
-0.02085109055042267,
-0.07988511025905609,
-0.23219464719295502,
-0.13938382267951965,
-0.12429051846265793,
0.09477275609970093,
0.028005298227071762,
0.056365787982940674,
0.17219258844852448,
0.06577219814062119,
-0.038416244089603424,
0.006410336587578058,
0.02959546446800232,
0.08237514644861221,
0.23417828977108002,
-0.06035616248846054,
0.1364797055721283,
-0.03420931473374367,
-0.14982740581035614,
0.08169995993375778,
0.0713929831981659,
0.10213395953178406,
0.06678459793329239,
0.0804823637008667,
0.0149586396291852,
0.06188136339187622,
0.1311223804950714,
0.08191446959972382,
0.019586285576224327,
-0.02480296604335308,
-0.03388110175728798,
-0.025523077696561813,
-0.05937909707427025,
0.040128443390131,
0.06589099019765854,
-0.16763372719287872,
-0.039227183908224106,
-0.09338314831256866,
0.09657008945941925,
0.0873042419552803,
0.06609832495450974,
-0.1842060089111328,
-0.008006223477423191,
0.08488986641168594,
-0.03854905813932419,
-0.13727426528930664,
0.09535189718008041,
0.01523482333868742,
-0.15144726634025574,
0.03139317408204079,
-0.04061909019947052,
0.12188644707202911,
-0.07804752141237259,
0.09809603542089462,
-0.08108244836330414,
-0.07448557764291763,
0.02123199962079525,
0.1261177361011505,
-0.30527687072753906,
0.20240111649036407,
-0.0024993624538183212,
-0.06486981362104416,
-0.1243603527545929,
-0.0032166161108762026,
0.002410882618278265,
0.07357452809810638,
0.10519039630889893,
-0.007196315098553896,
0.001897757756523788,
-0.06300821900367737,
-0.01829923689365387,
0.032471053302288055,
0.13080233335494995,
-0.0401318334043026,
-0.021158374845981598,
-0.050194524228572845,
-0.001653497340157628,
-0.03173094615340233,
-0.06934895366430283,
0.02002747356891632,
-0.19509181380271912,
0.08751901984214783,
0.04166261479258537,
0.09648149460554123,
0.029994789510965347,
0.004265148192644119,
-0.09651939570903778,
0.24698667228221893,
-0.07148019969463348,
-0.10072879493236542,
-0.10919588059186935,
-0.046813901513814926,
0.03569883480668068,
-0.05628936365246773,
0.04309194162487984,
-0.0788632407784462,
0.028997479006648064,
-0.06352769583463669,
-0.19235502183437347,
0.12410202622413635,
-0.09027006477117538,
-0.04412810131907463,
-0.02371402643620968,
0.2110891044139862,
-0.05598580464720726,
0.010335659608244896,
0.02930437959730625,
0.01208863127976656,
-0.11645778268575668,
-0.09678568691015244,
0.031018631532788277,
-0.007351789623498917,
0.050603240728378296,
0.041841957718133926,
-0.05915454775094986,
-0.017138581722974777,
-0.052199993282556534,
-0.022926922887563705,
0.3496883809566498,
0.14231905341148376,
-0.043836336582899094,
0.19347235560417175,
0.12347975373268127,
-0.07452994585037231,
-0.3159443140029907,
-0.1066238060593605,
-0.10937739163637161,
-0.04680149629712105,
-0.07012093812227249,
-0.2002030611038208,
0.06474938243627548,
0.00662544509395957,
-0.013415241613984108,
0.12749312818050385,
-0.2561831772327423,
-0.07571036368608475,
0.15906259417533875,
-0.017980827018618584,
0.3745945692062378,
-0.1168576180934906,
-0.10926306992769241,
-0.03950892388820648,
-0.14175476133823395,
0.16968177258968353,
-0.01989765651524067,
0.11221715062856674,
-0.009765521623194218,
0.14388824999332428,
0.05548359826207161,
-0.023479344323277473,
0.08544106781482697,
0.004999885335564613,
-0.03290518373250961,
-0.10304180532693863,
-0.05676887184381485,
0.007092386484146118,
0.02477436140179634,
0.018026655539870262,
-0.041834570467472076,
0.02227151393890381,
-0.11731979995965958,
-0.04657655209302902,
-0.08982590585947037,
0.04431166127324104,
0.03899754583835602,
-0.07325074821710587,
-0.002380647463724017,
-0.07165111601352692,
-0.012272949330508709,
0.022334342822432518,
0.20356793701648712,
-0.08029330521821976,
0.16448934376239777,
0.09239562600851059,
0.12419285625219345,
-0.14376309514045715,
-0.00019283240544609725,
-0.0762530043721199,
-0.05611240118741989,
0.07737895101308823,
-0.09433035552501678,
0.058893077075481415,
0.10901971161365509,
-0.04567738622426987,
0.08828683942556381,
0.10377411544322968,
0.008936077356338501,
0.003213887568563223,
0.10916902124881744,
-0.2667325437068939,
-0.0296600554138422,
-0.07532413303852081,
0.000883326749317348,
0.09092561900615692,
0.08562852442264557,
0.18840822577476501,
0.025361526757478714,
-0.04293036088347435,
-0.002770674182102084,
0.028597986325621605,
-0.039021048694849014,
0.051667019724845886,
0.001123449532315135,
0.01947369985282421,
-0.1530752182006836,
0.072522833943367,
0.01490565575659275,
-0.15215420722961426,
0.021316176280379295,
0.16572684049606323,
-0.11656328290700912,
-0.1283872276544571,
-0.06520111113786697,
0.08313824236392975,
-0.11755692958831787,
-0.01578943058848381,
-0.03279297426342964,
-0.13145680725574493,
0.07992171496152878,
0.12629036605358124,
0.05557859688997269,
0.0972496047616005,
-0.06061713397502899,
-0.020469192415475845,
-0.018721895292401314,
-0.014099318534135818,
-0.012384648434817791,
-0.007667020428925753,
-0.055978111922740936,
0.0590752474963665,
-0.026677248999476433,
0.1425808072090149,
-0.09221141785383224,
-0.1037059873342514,
-0.16142144799232483,
0.0374140702188015,
-0.11013076454401016,
-0.08825794607400894,
-0.08821134269237518,
-0.050188567489385605,
0.002360827289521694,
-0.019856395199894905,
-0.04037635400891304,
-0.05829505994915962,
-0.12300454825162888,
0.0338277705013752,
-0.040771447122097015,
0.024727050215005875,
-0.07512269169092178,
0.015856385231018066,
0.08507686108350754,
-0.03285100311040878,
0.15655414760112762,
0.1450488418340683,
-0.1006515845656395,
0.10741901397705078,
-0.14806775748729706,
-0.09138492494821548,
0.11116421222686768,
0.015329592861235142,
0.0449691042304039,
0.09723787009716034,
0.013362943194806576,
0.0635865181684494,
0.032776717096567154,
0.05308786407113075,
0.027619892731308937,
-0.11959987878799438,
0.06483134627342224,
-0.03626115620136261,
-0.14700546860694885,
-0.049338050186634064,
-0.05282869189977646,
0.01647452637553215,
0.013054544106125832,
0.09622690081596375,
-0.05301849544048309,
0.10698331147432327,
-0.04055701196193695,
0.0346808135509491,
0.017554637044668198,
-0.1730053424835205,
-0.03816922754049301,
-0.08538098633289337,
0.03681723028421402,
0.014741539023816586,
0.25266793370246887,
0.030072299763560295,
0.012416383251547813,
0.032671261578798294,
0.08285367488861084,
0.03899408504366875,
0.010228337720036507,
0.17482228577136993,
0.1162426546216011,
-0.06621865928173065,
-0.10445023328065872,
0.0729617029428482,
0.016332454979419708,
0.01286179106682539,
0.13617953658103943,
0.008365051820874214,
0.005795429926365614,
0.08649782836437225,
-0.016865963116288185,
0.009968153201043606,
-0.10052056610584259,
-0.13426925241947174,
-0.022176474332809448,
0.05151832848787308,
-0.04655967652797699,
0.11727844923734665,
0.1406494379043579,
-0.01806013658642769,
0.03222079202532768,
-0.021771740168333054,
-0.05699979141354561,
-0.1683429479598999,
-0.1429590880870819,
-0.06883849948644638,
-0.13416796922683716,
0.00897989235818386,
-0.11180389672517776,
0.05395037308335304,
0.06001098081469536,
0.06750501692295074,
-0.06899319589138031,
0.10220931470394135,
0.04626858979463577,
-0.11440542340278625,
0.06264589726924896,
-0.0296088308095932,
0.09430401772260666,
-0.02759445086121559,
-0.019505485892295837,
-0.09039592742919922,
0.014574515633285046,
0.011419114656746387,
0.06245238706469536,
-0.04707273095846176,
0.007463190704584122,
-0.14696238934993744,
-0.08972041308879852,
-0.0523175448179245,
0.0718572810292244,
-0.050409089773893356,
0.14282815158367157,
0.00775480642914772,
-0.0170906875282526,
0.039554283022880554,
0.22787313163280487,
-0.07476283609867096,
-0.04778539761900902,
-0.05269690603017807,
0.20717895030975342,
0.02975541539490223,
0.1171872541308403,
-0.022938819602131844,
-0.006106364540755749,
-0.0919521227478981,
0.3764844834804535,
0.30030161142349243,
-0.09031439572572708,
0.011794124729931355,
0.02137952297925949,
0.04502861574292183,
0.1316293478012085,
0.1216534823179245,
0.10318691283464432,
0.3006802201271057,
-0.07452366501092911,
-0.04653361067175865,
-0.012629742734134197,
-0.023858042433857918,
-0.09059546142816544,
0.1021224707365036,
0.04839762672781944,
-0.06382183730602264,
-0.03313443064689636,
0.0954432487487793,
-0.25862133502960205,
0.1277991235256195,
-0.12311873584985733,
-0.17578600347042084,
-0.06654827296733856,
0.009760108776390553,
0.10465722531080246,
0.015642458572983742,
0.0946015790104866,
0.007128213066607714,
-0.11252258718013763,
0.06305865943431854,
0.03397420793771744,
-0.22762253880500793,
0.0006893770187161863,
0.06642123311758041,
-0.07006710022687912,
-0.0024247700348496437,
-0.026499588042497635,
0.05657242611050606,
0.0656052976846695,
0.054629553109407425,
-0.00971333310008049,
0.03816632181406021,
0.0034184439573436975,
-0.0585215799510479,
0.016623929142951965,
0.05121519789099693,
0.02472509816288948,
-0.09763528406620026,
0.06927435845136642,
-0.1574270874261856,
0.04766253009438515,
-0.0030655991286039352,
-0.04124255105853081,
0.006064958870410919,
0.008823691867291927,
-0.06491616368293762,
0.05165379121899605,
0.07916834205389023,
-0.0016257909592241049,
-0.0062433634884655476,
-0.057178743183612823,
-0.02632102556526661,
-0.027755750343203545,
-0.09291748702526093,
-0.10495562851428986,
-0.14682936668395996,
-0.11640441417694092,
0.09368976950645447,
-0.01011267676949501,
-0.1848134547472,
0.022154374048113823,
-0.08606051653623581,
0.08319322764873505,
-0.1670055389404297,
0.08040720224380493,
0.07041648775339127,
0.013038921169936657,
-0.0031511052511632442,
-0.02002427540719509,
0.054132770746946335,
0.086809903383255,
-0.10407156497240067,
-0.07400695979595184
] |
null | null | transformers |
# Steins Gate DialoGPT Model | {"tags": ["conversational"]} | text-generation | Rei/DialoGPT-medium-kurisu | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Steins Gate DialoGPT Model | [
"# Steins Gate DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Steins Gate DialoGPT Model"
] | [
51,
9
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Steins Gate DialoGPT Model"
] | [
-0.046598706394433975,
0.06981657445430756,
-0.0057489341124892235,
0.03504592925310135,
0.08828812092542648,
-0.02516520395874977,
0.18319901823997498,
0.12216830253601074,
0.06301577389240265,
-0.04044743627309799,
0.13840942084789276,
0.15021193027496338,
-0.012106616981327534,
0.13563913106918335,
-0.07207920402288437,
-0.26969242095947266,
0.06512366235256195,
0.0505230650305748,
-0.0036240890622138977,
0.10412831604480743,
0.10032135248184204,
-0.04963393881917,
0.07635006308555603,
0.010438334196805954,
-0.11065434664487839,
0.03158833831548691,
0.01116763986647129,
-0.1111251562833786,
0.1358223408460617,
0.05726832151412964,
0.005046091508120298,
0.02873755432665348,
-0.060036107897758484,
-0.16042394936084747,
0.03296975418925285,
-0.015088582411408424,
-0.0426013208925724,
0.05471273511648178,
0.02142990753054619,
-0.10424590110778809,
0.06972432881593704,
0.07768756151199341,
0.007540381513535976,
0.03148684650659561,
-0.1376871019601822,
-0.09781760722398758,
0.02656593546271324,
0.15231946110725403,
0.1018720492720604,
0.061649806797504425,
-0.02027689293026924,
0.07774277776479721,
-0.0605953186750412,
0.12760497629642487,
0.12223295867443085,
-0.322817862033844,
0.0015639602206647396,
0.08216249197721481,
0.023275237530469894,
0.05518084019422531,
-0.05855183303356171,
0.0882180705666542,
0.013953437097370625,
-0.02471739798784256,
0.01456486713141203,
-0.10227209329605103,
-0.068756103515625,
0.04402991384267807,
-0.09924975782632828,
0.0074457935988903046,
0.19801035523414612,
-0.01350457314401865,
0.054563477635383606,
-0.10280314832925797,
-0.11239580065011978,
0.032354749739170074,
-0.03680616617202759,
-0.048799894750118256,
-0.08369548618793488,
0.06846136599779129,
-0.0018268153071403503,
-0.11900603771209717,
-0.11185361444950104,
-0.03135591745376587,
-0.1258203089237213,
0.1860152930021286,
0.04644016921520233,
0.04772649705410004,
-0.1871321201324463,
0.1122131496667862,
0.017892710864543915,
-0.10198433697223663,
0.003528713248670101,
-0.07760002464056015,
0.004063812084496021,
0.012881811708211899,
-0.019192136824131012,
0.016063055023550987,
0.04193074256181717,
0.21498671174049377,
0.008207451552152634,
0.030334535986185074,
0.013615417294204235,
0.057046301662921906,
0.04679345339536667,
0.09028291702270508,
0.0021172077395021915,
-0.061156779527664185,
0.03013274073600769,
-0.10046187043190002,
0.0036060968413949013,
-0.06969059258699417,
-0.17598679661750793,
-0.027711277827620506,
0.05757623910903931,
0.05630027502775192,
-0.01192915067076683,
0.13097049295902252,
0.000516669824719429,
-0.043277595192193985,
0.11209198832511902,
-0.030344609171152115,
0.007889765314757824,
0.014958680607378483,
0.003942565061151981,
0.12642142176628113,
0.014152979478240013,
0.043240394443273544,
-0.12699688971042633,
0.039772529155015945,
-0.05566873401403427,
-0.0015961260069161654,
0.004677793476730585,
-0.04166585952043533,
-0.007252980954945087,
-0.05934169143438339,
0.020119089633226395,
-0.15007227659225464,
-0.1971360743045807,
0.014328686520457268,
-0.002923211082816124,
-0.04548906907439232,
-0.09354248642921448,
-0.07148538529872894,
0.0008680391474626958,
0.039141543209552765,
-0.07677352428436279,
-0.07969751954078674,
-0.044029396027326584,
0.05878230184316635,
-0.029408086091279984,
0.09139595925807953,
-0.11790908128023148,
0.06674680858850479,
-0.10550209134817123,
-0.01545386016368866,
-0.09462938457727432,
0.13206687569618225,
-0.053457047790288925,
0.07130305469036102,
-0.032903362065553665,
-0.007968103513121605,
-0.01926996558904648,
0.07108775526285172,
-0.03001455031335354,
0.23359906673431396,
-0.10372474789619446,
-0.11849360167980194,
0.23709064722061157,
-0.07280358672142029,
-0.1329924464225769,
0.13725338876247406,
-0.012492773123085499,
0.04536980390548706,
0.14128148555755615,
0.2998073995113373,
-0.005181821063160896,
-0.03402094915509224,
0.09297218918800354,
0.12144653499126434,
-0.060894813388586044,
-0.007912656292319298,
0.023655444383621216,
-0.01604151353240013,
-0.020282134413719177,
0.04267849773168564,
0.046876873821020126,
0.060100849717855453,
-0.026580434292554855,
-0.019047243520617485,
0.0031996089965105057,
-0.01370062306523323,
0.08326950669288635,
-0.03745201975107193,
0.12784115970134735,
-0.04159214347600937,
-0.04061127454042435,
0.05448601767420769,
0.010435031726956367,
-0.05368191748857498,
0.052568819373846054,
-0.06699897348880768,
0.07913656532764435,
0.010386945679783821,
0.059589795768260956,
-0.11661924421787262,
-0.008665019646286964,
-0.0263441801071167,
0.13720203936100006,
0.05675635486841202,
0.1366521567106247,
0.05048690736293793,
-0.028110263869166374,
-0.03255198150873184,
0.014934428036212921,
0.1736277937889099,
-0.003305416088551283,
-0.09268897771835327,
-0.11544273048639297,
0.06594375520944595,
-0.05144505947828293,
0.0225076824426651,
-0.043457768857479095,
0.01978028193116188,
0.081072136759758,
0.10658146440982819,
0.004462909884750843,
0.023692648857831955,
0.0047832694835960865,
-0.02585975080728531,
-0.04429472237825394,
-0.007512346841394901,
0.101143017411232,
-0.002690977416932583,
-0.07070744037628174,
0.23579531908035278,
-0.17758986353874207,
0.1659088432788849,
0.15085746347904205,
-0.1923084706068039,
-0.015834901481866837,
-0.1354939192533493,
-0.04931916296482086,
-0.00342572177760303,
0.06958503276109695,
-0.02788088656961918,
0.24319809675216675,
-0.025444310158491135,
0.171758770942688,
-0.04897511750459671,
-0.06826622039079666,
-0.05439818650484085,
-0.0740937888622284,
0.027788124978542328,
0.10180999338626862,
0.028586216270923615,
-0.15409094095230103,
0.14507345855236053,
0.09137473255395889,
0.01292908564209938,
0.19796669483184814,
0.02913358062505722,
0.027045467868447304,
0.05960281938314438,
-0.0009804433211684227,
-0.052155911922454834,
-0.025612473487854004,
-0.2458745539188385,
-0.05422390252351761,
0.0799112319946289,
0.0407511368393898,
0.09014205634593964,
-0.09112099558115005,
-0.004764287266880274,
0.005290882661938667,
-0.02161773480474949,
0.0709642618894577,
0.12620320916175842,
0.0026740317698568106,
0.10971065610647202,
-0.02907227724790573,
-0.068565234541893,
0.05856456607580185,
0.015595171600580215,
-0.09435352683067322,
0.19722527265548706,
-0.11243151128292084,
-0.30506497621536255,
-0.12481129169464111,
-0.20991921424865723,
-0.060099828988313675,
0.05433497205376625,
0.10658648610115051,
-0.07508914172649384,
-0.02746070921421051,
-0.009716231375932693,
0.11668717861175537,
-0.05976248160004616,
0.005249978043138981,
-0.004482807591557503,
-0.023130793124437332,
-0.08778657019138336,
-0.10158425569534302,
-0.05635248124599457,
-0.03618007153272629,
-0.032019682228565216,
0.07758122682571411,
-0.09055046737194061,
0.007759491913020611,
0.2000955045223236,
0.043658822774887085,
0.05173507332801819,
-0.056108273565769196,
0.23232799768447876,
-0.08172077685594559,
0.05164581164717674,
0.1580069214105606,
-0.09399118274450302,
0.05722566321492195,
0.13509762287139893,
-0.004474987741559744,
-0.07333601266145706,
0.01946757175028324,
-0.018503818660974503,
-0.07670656591653824,
-0.1970399171113968,
-0.12479052692651749,
-0.10847757756710052,
0.1489066630601883,
0.035916171967983246,
0.05541619285941124,
0.20623216032981873,
0.07840440422296524,
-0.04838038608431816,
0.01880517229437828,
0.050188854336738586,
0.09675160050392151,
0.2866089940071106,
-0.07767003774642944,
0.15457135438919067,
-0.013288180343806744,
-0.15090502798557281,
0.07195451855659485,
0.08921563625335693,
0.022140365093946457,
0.02361208014190197,
0.0726398155093193,
0.01916365697979927,
0.048382099717855453,
0.12652572989463806,
0.05897308886051178,
-0.0048539647832512856,
-0.0527523048222065,
-0.043204884976148605,
-0.04651717096567154,
-0.0009319307282567024,
0.06836098432540894,
0.04224471002817154,
-0.12931755185127258,
-0.04097342491149902,
-0.047364845871925354,
0.05920358747243881,
0.06214221566915512,
0.10710568726062775,
-0.1296156793832779,
-0.03965986520051956,
0.08079732209444046,
-0.04320766031742096,
-0.09776944667100906,
0.09718343615531921,
0.023990657180547714,
-0.1382407397031784,
0.05260085314512253,
-0.016217000782489777,
0.10768620669841766,
-0.0886879414319992,
0.07993527501821518,
-0.11068997532129288,
-0.07608196884393692,
-0.0009820660343393683,
0.09898699820041656,
-0.2715371251106262,
0.18832191824913025,
-0.004024268127977848,
-0.05200548470020294,
-0.0919288843870163,
-0.0016334947431460023,
-0.004739466123282909,
0.10215157270431519,
0.10402119904756546,
-0.011388386599719524,
0.05336039513349533,
-0.004962367936968803,
-0.04848048463463783,
0.01915483921766281,
0.07523163408041,
-0.08100779354572296,
-0.041179027408361435,
-0.037509288638830185,
-0.002592606469988823,
-0.015730012208223343,
0.007073884829878807,
0.01807575486600399,
-0.18695127964019775,
0.05898234620690346,
0.10758278518915176,
0.06798943132162094,
0.017471127212047577,
-0.022928189486265182,
-0.11158934235572815,
0.29046693444252014,
-0.006254233419895172,
-0.10860468447208405,
-0.10744956135749817,
-0.06976477801799774,
0.053695984184741974,
-0.08337201178073883,
0.02388930879533291,
-0.05823995918035507,
0.02334514446556568,
-0.0784216821193695,
-0.18852145969867706,
0.12347642332315445,
-0.09844201803207397,
-0.056130263954401016,
-0.02804999239742756,
0.2294798344373703,
-0.03576195240020752,
0.009335181675851345,
0.04247008636593819,
-0.002691719215363264,
-0.10885076224803925,
-0.10757223516702652,
0.010865572839975357,
-0.00018754322081804276,
-0.03845115751028061,
-0.0031660976819694042,
-0.008914941921830177,
-0.0055207968689501286,
-0.036562785506248474,
-0.014554306864738464,
0.2829746901988983,
0.17351317405700684,
-0.054810479283332825,
0.2345106154680252,
0.18791350722312927,
-0.04678291454911232,
-0.2976668179035187,
-0.12653738260269165,
-0.07839353382587433,
-0.029698282480239868,
-0.040478914976119995,
-0.14283813536167145,
0.09481780230998993,
-0.06803202629089355,
-0.0388994924724102,
0.040060847997665405,
-0.21121063828468323,
-0.1246679350733757,
0.21897336840629578,
-0.03823665902018547,
0.4065539538860321,
-0.09350507706403732,
-0.08236077427864075,
-0.029415694996714592,
-0.1498069018125534,
0.14302775263786316,
0.02644202671945095,
0.09951168298721313,
-0.022011738270521164,
0.147931307554245,
0.044572316110134125,
-0.0028553660959005356,
0.07288117706775665,
0.06383214890956879,
-0.0206585880368948,
-0.10325178503990173,
-0.050697118043899536,
-0.040629662573337555,
-0.002538289874792099,
0.050646111369132996,
-0.08977115154266357,
0.06362573057413101,
-0.11725851148366928,
-0.05942695960402489,
-0.062176041305065155,
0.05272817984223366,
0.0359526090323925,
-0.1024603471159935,
-0.017466790974140167,
-0.064405158162117,
-0.01918080821633339,
0.014006764627993107,
0.21132120490074158,
-0.06574604660272598,
0.1615557074546814,
0.1351803094148636,
0.10239917039871216,
-0.18299999833106995,
-0.04334418475627899,
-0.059612467885017395,
-0.07158257067203522,
0.07828277349472046,
-0.08355364203453064,
0.030622513964772224,
0.10834184288978577,
-0.01702827960252762,
0.09074075520038605,
0.07836609333753586,
0.0008483233395963907,
-0.007648753467947245,
0.059760186821222305,
-0.2868064045906067,
-0.05683661624789238,
-0.08407548815011978,
0.005290706641972065,
0.08084384351968765,
0.13791589438915253,
0.22389863431453705,
-0.026086868718266487,
-0.033537860959768295,
0.02220454253256321,
0.001972425729036331,
-0.019481578841805458,
0.085786372423172,
0.02114672213792801,
0.044068142771720886,
-0.15617354214191437,
0.048060543835163116,
-0.027142098173499107,
-0.04994203895330429,
0.04475783184170723,
0.13918299973011017,
-0.12202887237071991,
-0.10666484385728836,
-0.08546862751245499,
0.08883128315210342,
-0.1183411180973053,
-0.007717607542872429,
-0.031039610505104065,
-0.14881020784378052,
0.059934355318546295,
0.09852636605501175,
0.07265400141477585,
0.08621101081371307,
-0.13244442641735077,
-0.04034816473722458,
0.0020838878117501736,
0.03330971300601959,
0.026368357241153717,
-0.010687458328902721,
-0.06924602389335632,
0.08514399081468582,
-0.06027238070964813,
0.11530248820781708,
-0.09349054098129272,
-0.09118600189685822,
-0.18594993650913239,
0.027702007442712784,
-0.16216278076171875,
-0.09328876435756683,
-0.08677522093057632,
-0.032120805233716965,
-0.01725774072110653,
-0.059367671608924866,
-0.019394127652049065,
-0.027578819543123245,
-0.1173388659954071,
0.055724501609802246,
-0.022422313690185547,
0.0181010439991951,
-0.09651336073875427,
0.02526826038956642,
0.0514720156788826,
-0.01757265441119671,
0.1707572191953659,
0.11630728840827942,
-0.12354230880737305,
0.06746366620063782,
-0.1575603187084198,
-0.04858118295669556,
0.10735996812582016,
0.0095059834420681,
0.05814504995942116,
0.08969864994287491,
-0.014305196702480316,
0.05633037909865379,
0.054996080696582794,
0.05417979136109352,
-0.019355252385139465,
-0.0745636448264122,
0.05003093183040619,
-0.07130414992570877,
-0.12836886942386627,
-0.02858235314488411,
-0.026133067905902863,
0.028261274099349976,
0.057308170944452286,
0.05912643298506737,
-0.0663670152425766,
0.08202774077653885,
-0.0669601783156395,
0.030775204300880432,
0.03989894688129425,
-0.16807197034358978,
-0.09296000003814697,
-0.09145189821720123,
0.029314391314983368,
-0.00005248934030532837,
0.2246173471212387,
-0.0013066334649920464,
-0.040546875447034836,
0.030032381415367126,
0.01877763494849205,
0.08430242538452148,
0.008811046369373798,
0.182980477809906,
0.061477355659008026,
-0.04910566285252571,
-0.10021016001701355,
0.06226914003491402,
0.027784736827015877,
0.037837304174900055,
0.09047938883304596,
-0.046827346086502075,
-0.10104817897081375,
0.0938122570514679,
-0.04119253158569336,
0.023396212607622147,
-0.11037348955869675,
-0.14556774497032166,
-0.04602836072444916,
0.051510922610759735,
-0.07177233695983887,
0.13824006915092468,
0.14529205858707428,
-0.014807616360485554,
0.03655741736292839,
0.04722966253757477,
-0.06994116306304932,
-0.19970005750656128,
-0.16781127452850342,
-0.059774406254291534,
-0.14827464520931244,
0.003213870106264949,
-0.10930901765823364,
0.029413167387247086,
-0.03461005538702011,
0.10024480521678925,
-0.043972644954919815,
0.09070691466331482,
0.025036390870809555,
-0.08270744979381561,
0.07451728731393814,
-0.034084953367710114,
0.05226357653737068,
-0.019624125212430954,
0.011308095417916775,
-0.07485170662403107,
0.03681477531790733,
-0.009711520746350288,
0.02332025021314621,
-0.03692600876092911,
-0.00491815572604537,
-0.10470134764909744,
-0.06824959814548492,
-0.05084993690252304,
0.0584503710269928,
0.013280805200338364,
0.1527446210384369,
0.01276435423642397,
-0.03896848112344742,
0.011118091642856598,
0.238887220621109,
-0.07017582654953003,
-0.08560709655284882,
-0.05531579628586769,
0.22212451696395874,
0.007922407239675522,
0.12428724020719528,
-0.027680199593305588,
0.023625239729881287,
-0.09435266256332397,
0.318786084651947,
0.2931014895439148,
-0.08479544520378113,
0.03273991122841835,
-0.0037178490310907364,
0.040323272347450256,
0.10947754979133606,
0.09381633996963501,
0.05205611139535904,
0.3234284520149231,
-0.04617438092827797,
-0.03125159069895744,
0.013324566185474396,
-0.04826600104570389,
-0.08193173259496689,
0.0749262273311615,
0.05386239290237427,
-0.0706227719783783,
-0.03987143933773041,
0.07671469449996948,
-0.25862282514572144,
0.06899514049291611,
-0.18088680505752563,
-0.18882417678833008,
-0.05996790528297424,
0.022927552461624146,
0.10159265249967575,
0.028411036357283592,
0.07288585603237152,
0.0076932646334171295,
-0.04750339314341545,
0.08272640407085419,
0.013288098387420177,
-0.18460869789123535,
0.023000653833150864,
0.0793231800198555,
-0.06759481877088547,
-0.06898482143878937,
-0.04520699381828308,
0.07229208946228027,
0.07622881978750229,
0.03886587172746658,
-0.018585283309221268,
-0.0001781350001692772,
-0.004430991131812334,
-0.08311604708433151,
0.005367580335587263,
0.03945891931653023,
0.029168184846639633,
-0.08631081879138947,
0.08091114461421967,
-0.135515034198761,
0.027301272377371788,
-0.03950171917676926,
-0.03903757780790329,
-0.026475880295038223,
0.024590017274022102,
-0.04871881753206253,
0.06568586081266403,
0.057493627071380615,
-0.031530797481536865,
-0.036502715200185776,
-0.01647614687681198,
-0.020211882889270782,
-0.00883416272699833,
-0.1206522136926651,
-0.05290324240922928,
-0.16437634825706482,
-0.08891735970973969,
0.036510709673166275,
-0.007684602867811918,
-0.17343610525131226,
-0.014174922369420528,
-0.13843868672847748,
0.06554131954908371,
-0.14800183475017548,
0.09221930801868439,
0.07309120893478394,
0.02684895507991314,
0.01898835599422455,
-0.06515534967184067,
0.04280847683548927,
0.05360083654522896,
-0.11921896040439606,
-0.07670287787914276
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# t5-small-finetuned-xsum-original
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the xsum dataset.
It achieves the following results on the evaluation set:
- Loss: 2.4436
- Rouge1: 28.8838
- Rouge2: 8.1114
- Rougel: 22.8318
- Rougelsum: 22.8318
- Gen Len: 18.8141
## Model description
More information needed
## Intended uses & limitations
More information needed
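
In the absence of documented usage, a minimal inference sketch (not part of the original card; it assumes the standard `transformers` summarization pipeline and the model id `RenZHU/t5-small-finetuned-xsum-original` listed with this card, and the length limits are illustrative choices):

    from transformers import pipeline

    # Load the fine-tuned checkpoint from the Hub; the pipeline also loads the tokenizer.
    summarizer = pipeline("summarization", model="RenZHU/t5-small-finetuned-xsum-original")

    article = "Put the full text of a news article here ..."
    # max_length/min_length are illustrative values, not taken from this card.
    summary = summarizer(article, max_length=60, min_length=10, do_sample=False)
    print(summary[0]["summary_text"])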
## Training and evaluation data
More information needed
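
For reference, a hedged sketch of loading the xsum data this model was fine-tuned on (assumes the `datasets` library; depending on the `datasets` version, the script-based xsum loader may require extra flags):

    from datasets import load_dataset

    # Each xsum row carries a "document" (article) and a one-sentence "summary".
    raw = load_dataset("xsum")
    sample = raw["train"][0]
    print(sample["document"][:200])
    print(sample["summary"])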
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
- mixed_precision_training: Native AMP
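
For illustration, a hedged sketch of how the hyperparameters above could map onto a `Seq2SeqTrainingArguments` object (names follow the `transformers` API; `output_dir` is a placeholder, and the Adam betas/epsilon listed above are the optimizer defaults):

    from transformers import Seq2SeqTrainingArguments

    # Mirrors the hyperparameters listed above.
    args = Seq2SeqTrainingArguments(
        output_dir="t5-small-finetuned-xsum-original",  # placeholder path
        learning_rate=2e-5,
        per_device_train_batch_size=4,
        per_device_eval_batch_size=4,
        seed=42,
        lr_scheduler_type="linear",
        num_train_epochs=1,
        fp16=True,  # mixed_precision_training: Native AMP
    )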
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:------:|:-------:|:---------:|:-------:|
| 2.6754 | 1.0 | 51012 | 2.4436 | 28.8838 | 8.1114 | 22.8318 | 22.8318 | 18.8141 |
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.0+cu111
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["xsum"], "metrics": ["rouge"], "model-index": [{"name": "t5-small-finetuned-xsum-original", "results": [{"task": {"type": "text2text-generation", "name": "Sequence-to-sequence Language Modeling"}, "dataset": {"name": "xsum", "type": "xsum", "args": "default"}, "metrics": [{"type": "rouge", "value": 28.8838, "name": "Rouge1"}]}]}]} | text2text-generation | RenZHU/t5-small-finetuned-xsum-original | [
"transformers",
"pytorch",
"tensorboard",
"t5",
"text2text-generation",
"generated_from_trainer",
"dataset:xsum",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #dataset-xsum #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| t5-small-finetuned-xsum-original
================================
This model is a fine-tuned version of t5-small on the xsum dataset.
It achieves the following results on the evaluation set:
* Loss: 2.4436
* Rouge1: 28.8838
* Rouge2: 8.1114
* Rougel: 22.8318
* Rougelsum: 22.8318
* Gen Len: 18.8141
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 4
* eval\_batch\_size: 4
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 1
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.16.0.dev0
* Pytorch 1.10.0+cu111
* Datasets 1.17.1.dev0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 4\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #dataset-xsum #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 4\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
77,
113,
4,
41
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #dataset-xsum #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 4\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
-0.12855082750320435,
0.14359499514102936,
-0.002093957969918847,
0.1032930240035057,
0.137231707572937,
0.02583795040845871,
0.13073061406612396,
0.1549605429172516,
-0.08253911882638931,
0.07369140535593033,
0.13631588220596313,
0.10746893286705017,
0.05847449228167534,
0.1651376187801361,
-0.058124665170907974,
-0.23350223898887634,
0.024086518213152885,
0.030402852222323418,
-0.05052417516708374,
0.12056826055049896,
0.09121150523424149,
-0.12040674686431885,
0.09383583813905716,
0.005217083729803562,
-0.16408254206180573,
-0.03450510650873184,
-0.0019489420810714364,
-0.06501557677984238,
0.11654304713010788,
0.03604286164045334,
0.10672387480735779,
0.01307840645313263,
0.07774554938077927,
-0.16650879383087158,
0.004346966277807951,
0.06636430323123932,
0.013995845802128315,
0.10532660782337189,
0.0671272799372673,
0.010955444537103176,
0.08114849030971527,
-0.09712643921375275,
0.05183568224310875,
0.01690565049648285,
-0.12092578411102295,
-0.22662606835365295,
-0.10879701375961304,
0.0342680923640728,
0.07669243961572647,
0.09471913427114487,
-0.00594151858240366,
0.11524996161460876,
-0.01755695417523384,
0.08867532014846802,
0.1694747358560562,
-0.2667820453643799,
-0.05868379771709442,
0.0020051689352840185,
0.03840567544102669,
0.07612074911594391,
-0.07890679687261581,
-0.03514309227466583,
0.029951894655823708,
0.05281528830528259,
0.1295340210199356,
-0.008699133060872555,
-0.047196242958307266,
-0.0042531793005764484,
-0.13428767025470734,
-0.06908249855041504,
0.17317050695419312,
0.045258376747369766,
-0.034143537282943726,
-0.06412524729967117,
-0.07835832238197327,
-0.17517778277397156,
-0.032822538167238235,
0.012985264882445335,
0.02695559710264206,
-0.022556526586413383,
-0.06445273756980896,
-0.019855579361319542,
-0.08983398973941803,
-0.04936237633228302,
-0.04530248045921326,
0.06291847676038742,
0.027530787512660027,
0.024166379123926163,
-0.04033869504928589,
0.08680202066898346,
0.007550929207354784,
-0.16265644133090973,
0.002904196036979556,
0.011861534789204597,
-0.017263125628232956,
-0.027126586064696312,
-0.042127855122089386,
-0.08493075519800186,
0.0216479804366827,
0.1253824234008789,
-0.0509074367582798,
0.06000664085149765,
-0.021047640591859818,
0.019110774621367455,
-0.07128412276506424,
0.17538443207740784,
-0.055843453854322433,
-0.04742295295000076,
0.027111181989312172,
0.08720265328884125,
0.04428873211145401,
-0.026185758411884308,
-0.10934533178806305,
0.0018059717258438468,
0.12038261443376541,
0.020571626722812653,
-0.0020291749387979507,
0.046790361404418945,
-0.04977532476186752,
-0.035032887011766434,
0.07276744395494461,
-0.10105515271425247,
0.02603333257138729,
-0.00529641006141901,
-0.070697121322155,
-0.010679999366402626,
0.0260955560952425,
0.004574056249111891,
-0.034480419009923935,
0.07670684158802032,
-0.09838404506444931,
-0.011173604056239128,
-0.08745401352643967,
-0.12931358814239502,
0.042794857174158096,
-0.07133711129426956,
-0.0034684808924794197,
-0.08997885882854462,
-0.19983313977718353,
-0.024227023124694824,
0.05310230329632759,
-0.050588980317115784,
-0.059896621853113174,
-0.051178786903619766,
-0.08628185093402863,
0.052259959280490875,
-0.022999079897999763,
0.12279044836759567,
-0.06498920172452927,
0.0976918637752533,
0.029368393123149872,
0.05545033514499664,
-0.04399576038122177,
0.0536070317029953,
-0.07734230905771255,
0.04697512835264206,
-0.1507842242717743,
0.07686475664377213,
-0.046729329973459244,
0.054069310426712036,
-0.11698126047849655,
-0.0920901820063591,
0.022293109446763992,
-0.031631868332624435,
0.11066503077745438,
0.10714215785264969,
-0.17185837030410767,
-0.04774893447756767,
0.166145920753479,
-0.06871260702610016,
-0.14581814408302307,
0.12369848042726517,
-0.04462836682796478,
-0.010945800691843033,
0.05157000198960304,
0.15979409217834473,
0.08301864564418793,
-0.08852669596672058,
-0.034609027206897736,
-0.02417195402085781,
0.08660147339105606,
-0.08301546424627304,
0.09448087960481644,
0.002468216000124812,
0.032762378454208374,
0.009612074121832848,
-0.024149904027581215,
0.06147873029112816,
-0.08529701828956604,
-0.08415635675191879,
-0.05181727930903435,
-0.08112858235836029,
0.01685776747763157,
0.044413939118385315,
0.04911334067583084,
-0.08972756564617157,
-0.10538776963949203,
0.04934627562761307,
0.0929436907172203,
-0.08576586842536926,
0.0255582295358181,
-0.07080322504043579,
0.10349808633327484,
-0.10732109099626541,
-0.011824944987893105,
-0.18290825188159943,
-0.06967411935329437,
0.02798205427825451,
-0.004535319283604622,
-0.0047174543142318726,
-0.02390974387526512,
0.06241922825574875,
0.07103587687015533,
-0.03538540005683899,
-0.032114092260599136,
-0.03443104028701782,
-0.0028885849751532078,
-0.10743020474910736,
-0.18487602472305298,
-0.04785657674074173,
-0.02996768243610859,
0.145985409617424,
-0.19227789342403412,
0.02047073096036911,
0.004248459357768297,
0.10856243968009949,
0.026576990261673927,
-0.02633705735206604,
-0.012906915508210659,
0.06446487456560135,
-0.05421554669737816,
-0.07726878672838211,
0.05646751821041107,
0.02535979636013508,
-0.0873323604464531,
0.0022217261139303446,
-0.13493360579013824,
0.10558726638555527,
0.1279812604188919,
-0.01751440204679966,
-0.045738138258457184,
-0.009543907828629017,
-0.05504164844751358,
-0.03921695798635483,
-0.029346132650971413,
-0.009140035137534142,
0.11401344835758209,
0.025920437648892403,
0.15065963566303253,
-0.07851691544055939,
-0.036799926310777664,
0.046881552785634995,
-0.0020689067896455526,
-0.005355847999453545,
0.10489410161972046,
0.07764611393213272,
-0.08632801473140717,
0.1461627036333084,
0.1334167867898941,
-0.043167807161808014,
0.11981875449419022,
-0.05773209035396576,
-0.09156375378370285,
-0.037657272070646286,
-0.01327651459723711,
0.018872244283556938,
0.11005197465419769,
-0.10630905628204346,
-0.00004344751141616143,
0.051896415650844574,
0.03948124125599861,
0.015052473172545433,
-0.1850365400314331,
-0.004460600670427084,
0.03079376555979252,
-0.05710525065660477,
-0.03781480714678764,
-0.005849514156579971,
0.007954469881951809,
0.09886358678340912,
0.019528286531567574,
-0.047316137701272964,
0.03395741432905197,
-0.002383145038038492,
-0.07991090416908264,
0.19009022414684296,
-0.11150608211755753,
-0.17429591715335846,
-0.13971079885959625,
-0.07231397926807404,
-0.0493488535284996,
0.00721804378554225,
0.0460873618721962,
-0.05567147210240364,
-0.05190708488225937,
-0.08393797278404236,
-0.003253852017223835,
-0.010840363800525665,
0.024173500016331673,
0.023711582645773888,
-0.020583948120474815,
0.06521528214216232,
-0.10456911474466324,
-0.011848118156194687,
-0.0027420413680374622,
-0.0096891475841403,
0.040269870311021805,
0.022476844489574432,
0.11314656585454941,
0.13265487551689148,
-0.008822564966976643,
0.018871940672397614,
-0.028844522312283516,
0.24543194472789764,
-0.06760261952877045,
-0.0015471421647816896,
0.14603577554225922,
-0.009675071574747562,
0.06880275905132294,
0.13242241740226746,
0.040121667087078094,
-0.08542881160974503,
0.007577625568956137,
0.006183616351336241,
-0.035058945417404175,
-0.23095743358135223,
-0.04648596793413162,
-0.06629081070423126,
0.009291431866586208,
0.10540901869535446,
0.02326681837439537,
0.020683933049440384,
0.06091102957725525,
0.007647537626326084,
0.09198908507823944,
-0.0403897725045681,
0.08214680850505829,
0.15049029886722565,
0.05420871078968048,
0.1318647563457489,
-0.04183568060398102,
-0.01901024952530861,
0.059895530343055725,
0.015340575017035007,
0.22934342920780182,
-0.022674763575196266,
0.20356082916259766,
0.030309874564409256,
0.16192544996738434,
0.0018442549044266343,
0.07747141271829605,
-0.005263723898679018,
0.013455994427204132,
-0.015316477976739407,
-0.04788294807076454,
-0.054954644292593,
0.01782534457743168,
-0.05017460137605667,
0.036898158490657806,
-0.10350203514099121,
0.020821435377001762,
0.030830485746264458,
0.26797041296958923,
0.07644157856702805,
-0.3777969479560852,
-0.11021343618631363,
0.011007935740053654,
-0.020015569403767586,
-0.06137251481413841,
-0.0038663216400891542,
0.11224660277366638,
-0.07504776120185852,
0.050928037613630295,
-0.08814117312431335,
0.10224974900484085,
-0.0629628598690033,
0.01835957169532776,
0.07570953667163849,
0.09732433408498764,
0.0017210260266438127,
0.06800895929336548,
-0.279289186000824,
0.25145623087882996,
0.01750708371400833,
0.06640797108411789,
-0.08220823854207993,
0.02274356596171856,
0.010405871085822582,
0.015133370645344257,
0.06267531216144562,
-0.005255906842648983,
-0.07792079448699951,
-0.15591804683208466,
-0.12297965586185455,
0.030128398910164833,
0.06402337551116943,
-0.00730576366186142,
0.11889508366584778,
-0.03598665073513985,
-0.007533727213740349,
0.05520620942115784,
-0.003352005500346422,
-0.05741298571228981,
-0.10919307917356491,
0.028576238080859184,
0.045240361243486404,
-0.01215952169150114,
-0.06013697758316994,
-0.09835968911647797,
-0.051486071199178696,
0.12789492309093475,
-0.02063710242509842,
-0.06120114400982857,
-0.12766508758068085,
0.0925893485546112,
0.1257048398256302,
-0.08944812417030334,
0.04541243612766266,
-0.008308163844048977,
0.11091892421245575,
0.03004244528710842,
-0.07868824154138565,
0.09427087008953094,
-0.06598597764968872,
-0.17772944271564484,
-0.0645282194018364,
0.10394368320703506,
0.013901924714446068,
0.054358284920454025,
-0.013568396680057049,
0.044470228254795074,
-0.04910361394286156,
-0.0744008794426918,
0.01963149383664131,
0.028497567400336266,
0.09620386362075806,
0.0007340006995946169,
-0.0269705131649971,
0.010078968480229378,
-0.03836112841963768,
-0.03794432431459427,
0.1600715070962906,
0.22157683968544006,
-0.08468769490718842,
0.028943873941898346,
0.0392034649848938,
-0.0641404539346695,
-0.1725151091814041,
0.01584906317293644,
0.07686874270439148,
0.031172845512628555,
0.0008161953301168978,
-0.1798384189605713,
0.06899942457675934,
0.09621778875589371,
-0.020240822806954384,
0.12087360769510269,
-0.3137260377407074,
-0.12782655656337738,
0.07511484622955322,
0.11381389200687408,
0.07514350861310959,
-0.1590150147676468,
-0.052125509828329086,
-0.029633359983563423,
-0.1601318120956421,
0.15503130853176117,
-0.07417675107717514,
0.10801450163125992,
-0.025290224701166153,
0.09754212200641632,
0.011155414395034313,
-0.06682455539703369,
0.1316935122013092,
0.006470251828432083,
0.05495133250951767,
-0.05660714954137802,
0.02196745201945305,
0.08521655946969986,
-0.07361429929733276,
0.03815876320004463,
-0.07489477097988129,
0.051167089492082596,
-0.13619641959667206,
-0.01878492161631584,
-0.07571107894182205,
0.010082026943564415,
-0.03379400819540024,
-0.04483908414840698,
-0.027663134038448334,
0.027744974941015244,
0.0751425176858902,
-0.021358715370297432,
0.1441139578819275,
0.03139284625649452,
0.10662934184074402,
0.12628446519374847,
0.07061678916215897,
-0.04579616338014603,
-0.04685225337743759,
-0.009526344016194344,
-0.03201332315802574,
0.04070022702217102,
-0.1644754260778427,
0.027250146493315697,
0.14858482778072357,
0.01076390128582716,
0.13600513339042664,
0.06400265544652939,
-0.047785621136426926,
0.018495090305805206,
0.059895310550928116,
-0.14986535906791687,
-0.11302120983600616,
-0.003295386442914605,
-0.0050177061930298805,
-0.14525717496871948,
0.0042053875513374805,
0.128646120429039,
-0.06312929838895798,
-0.011819439940154552,
-0.001032650819979608,
0.025444334372878075,
-0.024417400360107422,
0.19821299612522125,
0.04356350377202034,
0.05304121598601341,
-0.10127726942300797,
0.0820607990026474,
0.07570589333772659,
-0.11279870569705963,
0.03907458856701851,
0.07964888960123062,
-0.09298606216907501,
-0.028337974101305008,
0.03654145449399948,
0.15748412907123566,
-0.06896362453699112,
-0.05095576122403145,
-0.12232258170843124,
-0.10038965195417404,
0.0938996970653534,
0.12293795496225357,
0.05467764288187027,
0.035550832748413086,
-0.03626834601163864,
-0.008826646022498608,
-0.12445467710494995,
0.11927959322929382,
0.05975256487727165,
0.07892268896102905,
-0.13117344677448273,
0.1171383410692215,
-0.023649919778108597,
0.040156297385692596,
-0.013292825780808926,
0.03432411327958107,
-0.08543256670236588,
-0.011789800599217415,
-0.167084738612175,
0.019154004752635956,
-0.0452018603682518,
-0.012375487945973873,
-0.017570283263921738,
-0.05536496639251709,
-0.05768168345093727,
0.01516213919967413,
-0.09414760768413544,
-0.05515385791659355,
-0.0008352526929229498,
0.03977679833769798,
-0.12858515977859497,
-0.03883796185255051,
0.008094976656138897,
-0.0859074518084526,
0.08441154658794403,
0.04230138286948204,
0.018930090591311455,
0.028544384986162186,
-0.04801307991147041,
0.006310418713837862,
0.03531333804130554,
0.02361808903515339,
0.07480840384960175,
-0.11577563732862473,
-0.003760731313377619,
0.0068241627886891365,
0.013247410766780376,
0.010994619689881802,
0.09176280349493027,
-0.12225577235221863,
-0.010017297230660915,
-0.019121069461107254,
-0.02943767048418522,
-0.06432167440652847,
0.05748726800084114,
0.08627918362617493,
0.04266677051782608,
0.17691409587860107,
-0.06994211673736572,
0.04042286425828934,
-0.22514179348945618,
-0.002973441733047366,
-0.011437945999205112,
-0.10684023052453995,
-0.08373206853866577,
-0.036377761512994766,
0.07948504388332367,
-0.060804106295108795,
0.1019274964928627,
-0.0044997516088187695,
0.06494756788015366,
0.03159261494874954,
-0.00897690188139677,
-0.01988947205245495,
0.019525455310940742,
0.20621205866336823,
0.03362715616822243,
-0.03697163984179497,
0.06548698991537094,
0.031124761328101158,
0.08879540860652924,
0.14031726121902466,
0.1792246252298355,
0.1260775774717331,
0.054995838552713394,
0.09325036406517029,
0.0451417937874794,
-0.06637416779994965,
-0.16751977801322937,
0.048844825476408005,
-0.04697953537106514,
0.14641547203063965,
-0.009348848834633827,
0.2027227133512497,
0.06430789083242416,
-0.1744040995836258,
0.03525731712579727,
-0.05423968285322189,
-0.09150434285402298,
-0.09511325508356094,
-0.07869375497102737,
-0.08786686509847641,
-0.11985138058662415,
-0.014456161297857761,
-0.13264302909374237,
0.0423024483025074,
0.09214930981397629,
0.021069848909974098,
-0.01805940270423889,
0.11172975599765778,
0.04480675235390663,
0.007298635318875313,
0.0562535859644413,
0.020086709409952164,
-0.0025864155031740665,
-0.04522966593503952,
-0.0832650288939476,
0.010751243680715561,
0.016237016767263412,
0.06383660435676575,
-0.02451329305768013,
-0.01856330595910549,
0.059347402304410934,
-0.008633002638816833,
-0.11202488839626312,
0.004289740230888128,
0.029941463842988014,
0.055227961391210556,
0.07512085139751434,
0.019909357652068138,
0.02017132006585598,
-0.009991462342441082,
0.21195422112941742,
-0.07171200960874557,
-0.056493327021598816,
-0.12684491276741028,
0.18753814697265625,
0.002266466850414872,
-0.04619578644633293,
0.04890391230583191,
-0.08173219114542007,
-0.012905887328088284,
0.19119144976139069,
0.1719585359096527,
-0.07637633383274078,
-0.031708646565675735,
0.009362826123833656,
-0.010910967364907265,
-0.028801746666431427,
0.11550635099411011,
0.11574555933475494,
0.0647699162364006,
-0.08034469187259674,
-0.03697717934846878,
-0.05738862603902817,
-0.01085064560174942,
-0.03173771873116493,
0.057490427047014236,
0.0015125162899494171,
-0.005988489370793104,
-0.016342395916581154,
0.05681632459163666,
-0.0546620674431324,
-0.09722711890935898,
0.01734371855854988,
-0.2056589424610138,
-0.18155808746814728,
-0.017915504053235054,
0.06847668439149857,
0.0019695216324180365,
0.048421744257211685,
-0.01753035932779312,
0.019988933578133583,
0.12384744733572006,
-0.03342029079794884,
-0.06873136758804321,
-0.06188717484474182,
0.08649452775716782,
-0.08579149842262268,
0.1926296353340149,
-0.026131024584174156,
0.044535331428050995,
0.13547277450561523,
0.06266453862190247,
-0.13282501697540283,
0.03773436322808266,
0.06137578561902046,
-0.02536636032164097,
0.04481996223330498,
0.10921554267406464,
-0.0411825068295002,
0.0670425295829773,
0.043722547590732574,
-0.10697236657142639,
-0.01686847023665905,
-0.04299560561776161,
-0.03163353353738785,
-0.043797556310892105,
-0.05066459998488426,
-0.04907612502574921,
0.15158697962760925,
0.19308046996593475,
-0.05506562814116478,
-0.007058561313897371,
-0.06193239986896515,
0.009910875000059605,
0.04184070974588394,
0.05820310860872269,
-0.05003019794821739,
-0.2312677800655365,
0.000980774057097733,
0.05976249650120735,
0.008966661058366299,
-0.25282737612724304,
-0.0714346319437027,
0.003005142556503415,
-0.06015384569764137,
-0.09504278004169464,
0.11173238605260849,
0.05672108381986618,
0.038019102066755295,
-0.06561197340488434,
0.01989220455288887,
-0.07514628767967224,
0.15407615900039673,
-0.14215995371341705,
-0.08322953432798386
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# t5-small-finetuned-xsum
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
It achieves the following results on the evaluation set (a ROUGE-computation sketch follows the list):
- Loss: 2.5310
- Rouge1: 27.9232
- Rouge2: 7.5324
- Rougel: 22.035
- Rougelsum: 22.0304
- Gen Len: 18.8116
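How these numbers were computed is not documented in the card; as a minimal sketch, the same ROUGE variants can be reproduced with the `evaluate` library (this assumes the `evaluate` and `rouge_score` packages are installed, and the example strings are purely illustrative):

```python
import evaluate

# Load the ROUGE metric backend (assumes `evaluate` and `rouge_score` are installed).
rouge = evaluate.load("rouge")

predictions = ["the cat sat on the mat"]       # illustrative model summaries
references = ["a cat was sitting on the mat"]  # illustrative gold summaries

# Returns rouge1, rouge2, rougeL and rougeLsum, the same variants reported above.
scores = rouge.compute(predictions=predictions, references=references)
print(scores)
```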
## Model description
More information needed
## Intended uses & limitations
More information needed
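The card leaves usage undocumented. A hedged inference sketch, where the repository id is taken from this card and the generation lengths are illustrative choices rather than recommended values:

```python
from transformers import pipeline

# The repository id comes from this model card; everything else is illustrative.
summarizer = pipeline("summarization", model="RenZHU/t5-small-finetuned-xsum")

article = "Put a long news article or other document here."
result = summarizer(article, max_length=60, min_length=10)
print(result[0]["summary_text"])
```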
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `Seq2SeqTrainingArguments` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
- mixed_precision_training: Native AMP
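As a sketch, the hyperparameters above map onto the Transformers API roughly as follows; the actual training script is not part of the card, so `output_dir` and the choice of `Seq2SeqTrainingArguments` are assumptions:

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative mapping of the hyperparameters listed above.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-finetuned-xsum",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    fp16=True,  # "mixed_precision_training: Native AMP"
)
```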
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:------:|:------:|:---------:|:-------:|
| 2.7564 | 1.0 | 51012 | 2.5310 | 27.9232 | 7.5324 | 22.035 | 22.0304 | 18.8116 |
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.0+cu111
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["rouge"], "model-index": [{"name": "t5-small-finetuned-xsum", "results": []}]} | text2text-generation | RenZHU/t5-small-finetuned-xsum | [
"transformers",
"pytorch",
"tensorboard",
"t5",
"text2text-generation",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| t5-small-finetuned-xsum
=======================
This model is a fine-tuned version of t5-small on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 2.5310
* Rouge1: 27.9232
* Rouge2: 7.5324
* Rougel: 22.035
* Rougelsum: 22.0304
* Gen Len: 18.8116
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 4
* eval\_batch\_size: 4
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 1
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.16.0.dev0
* Pytorch 1.10.0+cu111
* Datasets 1.17.1.dev0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 4\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 4\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
67,
113,
4,
41
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 4\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
-0.08675417304039001,
0.07696607708930969,
-0.004343593493103981,
0.07244137674570084,
0.1214820072054863,
-0.007830770686268806,
0.1393778920173645,
0.1534535139799118,
-0.15704520046710968,
0.06170257553458214,
0.13841429352760315,
0.15276110172271729,
0.038827113807201385,
0.15872885286808014,
-0.06418923288583755,
-0.2587791681289673,
0.04094953462481499,
0.05173317715525627,
-0.0227544903755188,
0.12034598737955093,
0.09099873900413513,
-0.12709255516529083,
0.07926789671182632,
0.03650294244289398,
-0.1806674301624298,
0.0018368352903053164,
0.004301621112972498,
-0.08960816264152527,
0.10978442430496216,
0.027672110125422478,
0.10698559135198593,
0.038841117173433304,
0.047977034002542496,
-0.1671227514743805,
0.009372047148644924,
0.07654310762882233,
0.004311722237616777,
0.0940370187163353,
0.060720477253198624,
-0.01149066537618637,
0.14086318016052246,
-0.08706733584403992,
0.07430004328489304,
0.025985607877373695,
-0.11303902417421341,
-0.26837000250816345,
-0.11078105866909027,
0.05091305449604988,
0.06799323856830597,
0.08382554352283478,
-0.003592651803046465,
0.15988501906394958,
-0.02086569182574749,
0.11439259350299835,
0.2434510886669159,
-0.29835379123687744,
-0.054475087672472,
-0.023604271933436394,
0.04646042734384537,
0.06943587958812714,
-0.06151801720261574,
-0.027487503364682198,
0.03024648129940033,
0.05992741137742996,
0.1493329405784607,
-0.014267617836594582,
-0.044443417340517044,
-0.02281787432730198,
-0.1428338885307312,
-0.06583321839570999,
0.1306263953447342,
0.008311419747769833,
-0.042861562222242355,
-0.06556826829910278,
-0.08814520388841629,
-0.19177137315273285,
-0.046636760234832764,
-0.025236649438738823,
0.04188603535294533,
-0.0330355130136013,
-0.060317330062389374,
-0.03365261107683182,
-0.07725828886032104,
-0.03733966499567032,
-0.06259218603372574,
0.13903436064720154,
0.05884707719087601,
0.018058404326438904,
-0.06335651874542236,
0.06760264188051224,
-0.03092392161488533,
-0.14696955680847168,
-0.016890360042452812,
0.010377926751971245,
0.03310105577111244,
-0.03486756980419159,
-0.04696545749902725,
-0.1456785798072815,
0.013510462827980518,
0.15390434861183167,
-0.1314464956521988,
0.08431519567966461,
-0.04342631250619888,
0.035452231764793396,
-0.08615637570619583,
0.1682029366493225,
-0.013527097180485725,
0.04031447321176529,
0.03308691829442978,
0.04811865836381912,
0.059359729290008545,
-0.02820274978876114,
-0.10347273200750351,
0.024197349324822426,
0.11281147599220276,
0.0257548950612545,
-0.03831547126173973,
0.07236769050359726,
-0.025345873087644577,
-0.015987195074558258,
0.056670673191547394,
-0.11659713834524155,
0.033107101917266846,
-0.009220930747687817,
-0.056352399289608,
-0.001461466890759766,
0.03714953362941742,
-0.010813518427312374,
-0.072422094643116,
0.10029297322034836,
-0.06840700656175613,
0.01580444723367691,
-0.09036846458911896,
-0.14594034850597382,
0.035714734345674515,
-0.08685319870710373,
-0.007834239862859249,
-0.10635562986135483,
-0.13718301057815552,
-0.013741726987063885,
0.05070200562477112,
-0.03413740172982216,
-0.03943810611963272,
-0.04376170411705971,
-0.09713152050971985,
0.04524414241313934,
-0.02812320739030838,
0.08380398899316788,
-0.061732977628707886,
0.0788363367319107,
0.03928026929497719,
0.07028160244226456,
-0.04177626967430115,
0.05071496590971947,
-0.06970332562923431,
0.03754284605383873,
-0.23538847267627716,
0.06369134038686752,
-0.055839575827121735,
0.06277966499328613,
-0.10440640151500702,
-0.11164478212594986,
0.022648867219686508,
-0.018363192677497864,
0.11627074331045151,
0.08391551673412323,
-0.14008218050003052,
-0.07305551320314407,
0.20531924068927765,
-0.100537970662117,
-0.12140963971614838,
0.11622815579175949,
-0.046007752418518066,
-0.007396901957690716,
0.05385757237672806,
0.18889962136745453,
0.06644609570503235,
-0.10048197209835052,
-0.014021092094480991,
-0.0444532185792923,
0.030025558546185493,
-0.05823132395744324,
0.04641076177358627,
0.00614727009087801,
0.07230151444673538,
0.009622069075703621,
0.02779960446059704,
0.03080936335027218,
-0.08297547698020935,
-0.06433244794607162,
-0.06670055538415909,
-0.051856424659490585,
-0.0013837431324645877,
0.03332312032580376,
0.07335661351680756,
-0.12911009788513184,
-0.11481118202209473,
0.07426092773675919,
0.06626780331134796,
-0.08649346232414246,
0.06825263798236847,
-0.10704616457223892,
0.10677747428417206,
-0.05435673147439957,
0.009465880692005157,
-0.19405129551887512,
-0.01359461061656475,
0.027302570641040802,
-0.018014293164014816,
0.00716406898573041,
-0.05035846680402756,
0.06493411213159561,
0.06322377920150757,
-0.03098997287452221,
-0.01917407289147377,
-0.02740015834569931,
-0.0016450050752609968,
-0.10659584403038025,
-0.19356413185596466,
-0.027725543826818466,
-0.03902444988489151,
0.04549039155244827,
-0.15111951529979706,
0.0518009252846241,
0.06815137714147568,
0.10963217169046402,
0.02719617821276188,
-0.011828921735286713,
-0.023735057562589645,
0.077784463763237,
-0.05269154533743858,
-0.06423074007034302,
0.06259515136480331,
0.03323768451809883,
-0.07520833611488342,
0.027803299948573112,
-0.16357702016830444,
0.1198253408074379,
0.14420361816883087,
-0.05369177833199501,
-0.061026427894830704,
-0.004175485111773014,
-0.05880487337708473,
-0.02532786689698696,
-0.0074407183565199375,
0.029060088098049164,
0.18061555922031403,
0.015867752954363823,
0.17135106027126312,
-0.0875665694475174,
-0.05143117532134056,
0.0545831099152565,
-0.01808059588074684,
0.0023743738420307636,
0.09748239815235138,
0.05012361705303192,
-0.07287769764661789,
0.12747223675251007,
0.124407097697258,
-0.06172936037182808,
0.11754123866558075,
-0.06302075833082199,
-0.06919362396001816,
-0.017063813284039497,
-0.005348567385226488,
0.026171719655394554,
0.06347323209047318,
-0.13580848276615143,
-0.019731303676962852,
0.03511219099164009,
0.037594735622406006,
0.020827248692512512,
-0.19282947480678558,
0.02476399391889572,
0.04283954203128815,
-0.05471137911081314,
-0.0333542600274086,
-0.0018462638836354017,
0.020910559222102165,
0.09812850505113602,
0.01716560870409012,
-0.048593517392873764,
0.03271044045686722,
0.011242127045989037,
-0.07155560702085495,
0.17998704314231873,
-0.11797653138637543,
-0.1658553183078766,
-0.11181606352329254,
-0.09847614914178848,
-0.04796883836388588,
0.002634562086313963,
0.09048078209161758,
-0.07821651548147202,
-0.05697724223136902,
-0.08957779407501221,
0.0028949188999831676,
-0.01161043532192707,
0.031368546187877655,
0.04544343426823616,
-0.01590990461409092,
0.07224256545305252,
-0.11936359107494354,
-0.030468855053186417,
-0.017498932778835297,
-0.008700812235474586,
0.06386874616146088,
0.03509646654129028,
0.09515303373336792,
0.12950512766838074,
-0.05042028799653053,
0.0365305170416832,
-0.03819408640265465,
0.2073286771774292,
-0.06154658645391464,
-0.01693941466510296,
0.1502576321363449,
-0.011475445702672005,
0.08220665901899338,
0.09824689477682114,
0.03736897557973862,
-0.08938422054052353,
0.002999254735186696,
0.010232768021523952,
-0.04544903337955475,
-0.23942101001739502,
-0.02615204080939293,
-0.048817794770002365,
0.005885637830942869,
0.09883861243724823,
0.03678499907255173,
0.03857516497373581,
0.04912467300891876,
0.0144326938316226,
0.0716148242354393,
-0.009289199486374855,
0.11871825158596039,
0.18321062624454498,
0.0448056161403656,
0.14378270506858826,
-0.059524524956941605,
-0.02340526320040226,
0.05040498822927475,
-0.013262634165585041,
0.2123815417289734,
-0.008352075703442097,
0.18115127086639404,
0.059755098074674606,
0.13677039742469788,
0.02461598813533783,
0.0662170797586441,
-0.019449688494205475,
-0.0044251917861402035,
-0.0033414869103580713,
-0.046479951590299606,
-0.051366597414016724,
0.012689908966422081,
-0.09127546101808548,
0.023515189066529274,
-0.11614109575748444,
0.042406998574733734,
0.04765000939369202,
0.31185463070869446,
0.01960689015686512,
-0.3573263883590698,
-0.1094939187169075,
-0.005767262075096369,
-0.055605221539735794,
-0.049799010157585144,
0.021197447553277016,
0.08481504768133163,
-0.059301234781742096,
0.08072293549776077,
-0.0860380008816719,
0.11045879125595093,
-0.04328244552016258,
0.04116157442331314,
0.05132557824254036,
0.10957639664411545,
0.0016023538773879409,
0.046802449971437454,
-0.3060656785964966,
0.23940126597881317,
0.019731367006897926,
0.07805677503347397,
-0.07164954394102097,
0.030671264976263046,
0.008341638371348381,
0.06803840398788452,
0.04973723739385605,
-0.022033849731087685,
-0.1308838278055191,
-0.11631964892148972,
-0.0916084423661232,
0.014770757406949997,
0.09811419993638992,
0.05048360303044319,
0.10988224297761917,
-0.025068894028663635,
0.0053110504522919655,
0.05029554292559624,
-0.04426846280694008,
-0.057096756994724274,
-0.09766853600740433,
0.01916559785604477,
0.043379154056310654,
-0.020667729899287224,
-0.06914586573839188,
-0.09971191734075546,
-0.06268306821584702,
0.19192256033420563,
-0.010845291428267956,
-0.07086344808340073,
-0.12183606624603271,
0.05783815309405327,
0.059528566896915436,
-0.07894428074359894,
0.05548854544758797,
-0.013259482569992542,
0.11323890089988708,
0.003723819274455309,
-0.08534679561853409,
0.11226025223731995,
-0.056050945073366165,
-0.1744966357946396,
-0.04071463271975517,
0.10243170708417892,
0.008860882371664047,
0.04591507837176323,
-0.014551169238984585,
0.03491285815834999,
-0.03195948526263237,
-0.07781369239091873,
0.018586842343211174,
0.015608140267431736,
0.10997274518013,
-0.043258942663669586,
-0.02759253792464733,
0.021163152530789375,
-0.052539221942424774,
-0.018954643979668617,
0.1817680448293686,
0.2394220232963562,
-0.08966623991727829,
0.06408137083053589,
0.03176316246390343,
-0.06605081260204315,
-0.17021839320659637,
0.010406816378235817,
0.06255056709051132,
0.000234587729210034,
-0.030259424820542336,
-0.18098227679729462,
0.041478801518678665,
0.09099813550710678,
-0.010103769600391388,
0.09759760648012161,
-0.3030959665775299,
-0.13868901133537292,
0.09773457050323486,
0.12842108309268951,
0.09109430760145187,
-0.15525472164154053,
-0.03719695284962654,
-0.03186705708503723,
-0.13212734460830688,
0.12052811682224274,
-0.10717567056417465,
0.1067802906036377,
-0.03152884170413017,
0.10213171690702438,
0.007092860992997885,
-0.06254325807094574,
0.09772378206253052,
-0.0237447340041399,
0.07497464120388031,
-0.07930165529251099,
0.04931512475013733,
0.10293702036142349,
-0.07868640869855881,
0.047992780804634094,
-0.07989053428173065,
0.039991799741983414,
-0.09152217954397202,
-0.02427482232451439,
-0.06430923938751221,
0.01401340402662754,
-0.03038019873201847,
-0.03270235285162926,
-0.04775894433259964,
-0.018824957311153412,
0.07876000553369522,
-0.02623586915433407,
0.19734647870063782,
0.010004251264035702,
0.13971097767353058,
0.1694399118423462,
0.09053465723991394,
-0.11529062688350677,
-0.03601625934243202,
0.01908377930521965,
-0.02828974463045597,
0.057392966002225876,
-0.17229320108890533,
0.045301295816898346,
0.1392596960067749,
-0.00426930096000433,
0.11823950707912445,
0.07439739257097244,
-0.056635063141584396,
0.021281281486153603,
0.05225224047899246,
-0.16427291929721832,
-0.11033691465854645,
0.016504686325788498,
0.03522846847772598,
-0.09069354832172394,
0.0588274709880352,
0.12639759480953217,
-0.07144474238157272,
-0.017592964693903923,
0.006439234595745802,
0.013803982175886631,
-0.0261211134493351,
0.16815565526485443,
0.02466086857020855,
0.05271265655755997,
-0.09921663999557495,
0.08419663459062576,
0.05391130968928337,
-0.1380699723958969,
0.06145055964589119,
0.11871349066495895,
-0.09731384366750717,
-0.029890043660998344,
0.061521947383880615,
0.15293365716934204,
-0.04869475215673447,
-0.06388437747955322,
-0.15100163221359253,
-0.12732023000717163,
0.1191154196858406,
0.19134403765201569,
0.05604027211666107,
0.010087908245623112,
-0.04132787510752678,
0.013210480101406574,
-0.1348961591720581,
0.09205752611160278,
0.025340449064970016,
0.07858822494745255,
-0.11926977336406708,
0.1527814269065857,
-0.0012212274596095085,
0.04319809749722481,
-0.01899656467139721,
0.020695194602012634,
-0.09164439886808395,
0.005466360133141279,
-0.17406347393989563,
0.012929613701999187,
-0.0442219153046608,
-0.00849214568734169,
-0.0067207468673586845,
-0.02198525331914425,
-0.06573212891817093,
0.032292090356349945,
-0.1066536232829094,
-0.035932499915361404,
0.0011016956996172667,
0.022201307117938995,
-0.12116031348705292,
-0.014914380386471748,
-0.009480167180299759,
-0.08460067212581635,
0.07849117368459702,
0.050959426909685135,
-0.016195684671401978,
0.02607978880405426,
-0.03258988633751869,
0.0006125279469415545,
0.07652365416288376,
0.008623857982456684,
0.06575316190719604,
-0.10615676641464233,
-0.010670374147593975,
0.022536391392350197,
0.006945321336388588,
0.016364259645342827,
0.10313791781663895,
-0.11892437934875488,
0.008229236118495464,
-0.00629051961004734,
-0.05257255956530571,
-0.06818877905607224,
0.055007901042699814,
0.07940377295017242,
0.023066425696015358,
0.1749328374862671,
-0.0897415280342102,
0.02773284912109375,
-0.18837963044643402,
-0.0016209863824769855,
0.010329700075089931,
-0.1291842758655548,
-0.042301248759031296,
-0.037489213049411774,
0.06923069804906845,
-0.07082577049732208,
0.1261657029390335,
0.008173485286533833,
0.021088948473334312,
0.05996398627758026,
-0.06361071020364761,
-0.0630548819899559,
0.022517312318086624,
0.1700015813112259,
0.022509265691041946,
-0.0474415197968483,
0.06143003702163696,
0.02206750400364399,
0.10029825568199158,
0.10834559798240662,
0.19964449107646942,
0.13756439089775085,
0.04883885383605957,
0.11144844442605972,
0.028976455330848694,
-0.042000751942396164,
-0.18091429769992828,
0.05462988466024399,
-0.05006655678153038,
0.15863700211048126,
-0.016293924301862717,
0.18107330799102783,
0.12419024109840393,
-0.13535365462303162,
0.056352607905864716,
-0.03691123053431511,
-0.08213356882333755,
-0.12314532697200775,
-0.06886487454175949,
-0.08850997686386108,
-0.1566389799118042,
-0.004809251520782709,
-0.12064151465892792,
0.056875210255384445,
0.03334139660000801,
0.029730023816227913,
-0.009886288084089756,
0.1241295263171196,
0.0298661720007658,
0.0022101758513599634,
0.06149084493517876,
-0.0002406728599453345,
-0.0252850241959095,
-0.058040544390678406,
-0.08413490653038025,
0.020023077726364136,
0.004402462858706713,
0.057884395122528076,
0.00899402517825365,
-0.008497108705341816,
0.047340769320726395,
-0.027202732861042023,
-0.09793537110090256,
0.022788643836975098,
0.04236476868391037,
0.06179070100188255,
0.05921878293156624,
0.008051506243646145,
-0.004861596971750259,
-0.014620843343436718,
0.1830170899629593,
-0.07283011078834534,
-0.06987274438142776,
-0.11163285374641418,
0.2454269379377365,
0.04852140694856644,
-0.026025356724858284,
0.0313597172498703,
-0.06309022009372711,
-0.036059677600860596,
0.1882013976573944,
0.18067003786563873,
-0.009395202621817589,
-0.012288951314985752,
-0.013819831423461437,
-0.00852072425186634,
-0.024837763980031013,
0.10924660414457321,
0.1476481705904007,
0.02850615233182907,
-0.05676792934536934,
-0.041489627212285995,
-0.05560237169265747,
0.0018051295774057508,
-0.062317609786987305,
0.06843806058168411,
0.011846687644720078,
-0.0063233282417058945,
-0.018464934080839157,
0.030821477994322777,
-0.0486554354429245,
-0.08041249215602875,
0.021156493574380875,
-0.18931880593299866,
-0.1473974585533142,
0.011028901673853397,
0.07717998325824738,
-0.016607418656349182,
0.05351048707962036,
0.006534457206726074,
-0.008846702054142952,
0.11953645944595337,
-0.027214504778385162,
-0.0684061199426651,
-0.05803617835044861,
0.0922936424612999,
-0.17511379718780518,
0.1801784187555313,
-0.026212509721517563,
0.03210141137242317,
0.13785865902900696,
0.06050584837794304,
-0.10306715965270996,
0.057563792914152145,
0.03342093154788017,
-0.05499912053346634,
0.013098626397550106,
0.1216096505522728,
-0.03368125855922699,
0.04529717564582825,
0.0384979322552681,
-0.12831063568592072,
-0.01662163995206356,
-0.09169570356607437,
-0.04553050920367241,
-0.021113483235239983,
-0.045900531113147736,
-0.05076678842306137,
0.11073366552591324,
0.20592930912971497,
-0.030841251835227013,
0.011828931979835033,
-0.07410740852355957,
0.006457115523517132,
0.05173541232943535,
0.013748438097536564,
-0.04577936977148056,
-0.25388747453689575,
0.0157159510999918,
0.09856761246919632,
-0.002971239620819688,
-0.2616673707962036,
-0.08288370072841644,
0.005623320583254099,
-0.04822764918208122,
-0.11337704956531525,
0.08565639704465866,
0.06443727016448975,
0.05115223675966263,
-0.06923575699329376,
-0.027615061029791832,
-0.0665406882762909,
0.16576619446277618,
-0.1357303261756897,
-0.06266892701387405
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rubert-base-srl-seqlabeling
This model is a fine-tuned version of `./ruBert-base/` on an unknown dataset.
It achieves the following results on the evaluation set (a `seqeval`-style sketch follows the list):
- Loss: 0.1723
- Causator Precision: 0.8539
- Causator Recall: 0.8352
- Causator F1: 0.8444
- Causator Number: 91
- Expiriencer Precision: 0.9259
- Expiriencer Recall: 0.9740
- Expiriencer F1: 0.9494
- Expiriencer Number: 77
- Instrument Precision: 0.375
- Instrument Recall: 1.0
- Instrument F1: 0.5455
- Instrument Number: 3
- Other Precision: 0.0
- Other Recall: 0.0
- Other F1: 0.0
- Other Number: 1
- Predicate Precision: 0.9352
- Predicate Recall: 0.9902
- Predicate F1: 0.9619
- Predicate Number: 102
- Overall Precision: 0.8916
- Overall Recall: 0.9307
- Overall F1: 0.9107
- Overall Accuracy: 0.9667
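The per-label breakdown above matches the output format of the `seqeval` metric; a minimal sketch, assuming BIO-tagged sequences over this card's label set (requires the `evaluate` and `seqeval` packages; the tag sequences are illustrative, not taken from the evaluation data):

```python
import evaluate

# Assumption: metrics of this shape come from the seqeval backend.
seqeval = evaluate.load("seqeval")

predictions = [["B-Predicate", "O", "B-Causator", "I-Causator", "O"]]
references = [["B-Predicate", "O", "B-Causator", "I-Causator", "O"]]

# Returns overall precision/recall/F1/accuracy plus one entry per entity
# type (Causator, Expiriencer, Instrument, ...), as reported above.
results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_f1"])
```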
## Model description
More information needed
## Intended uses & limitations
More information needed
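Usage is not documented in the card. A hedged span-tagging sketch, where the repository id is taken from this card and the Russian input sentence is purely illustrative:

```python
from transformers import pipeline

srl_tagger = pipeline(
    "token-classification",
    model="Rexhaif/rubert-base-srl-seqlabeling",
    aggregation_strategy="simple",  # merge word pieces into whole-word spans
)

# Illustrative input: "The boy broke the window with a stone."
for span in srl_tagger("Мальчик разбил окно камнем."):
    print(span["entity_group"], span["word"], round(span["score"], 3))
```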
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-06
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.06
- num_epochs: 10.0
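A sketch of the same hyperparameters expressed through the Transformers API; the training script itself is not included in the card, so `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

# Illustrative mapping of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="rubert-base-srl-seqlabeling",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-6,
    lr_scheduler_type="cosine",
    warmup_ratio=0.06,
    num_train_epochs=10.0,
)
```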
### Training results
| Training Loss | Epoch | Step | Validation Loss | Causator Precision | Causator Recall | Causator F1 | Causator Number | Expiriencer Precision | Expiriencer Recall | Expiriencer F1 | Expiriencer Number | Instrument Precision | Instrument Recall | Instrument F1 | Instrument Number | Other Precision | Other Recall | Other F1 | Other Number | Predicate Precision | Predicate Recall | Predicate F1 | Predicate Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:--------------------:|:-----------------:|:-------------:|:-----------------:|:---------------:|:------------:|:--------:|:------------:|:-------------------:|:----------------:|:------------:|:----------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.2552 | 1.0 | 56 | 0.3471 | 0.8841 | 0.6703 | 0.7625 | 91 | 0.8421 | 0.8312 | 0.8366 | 77 | 0.0 | 0.0 | 0.0 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9259 | 0.9804 | 0.9524 | 102 | 0.8893 | 0.8212 | 0.8539 | 0.9203 |
| 0.2385 | 2.0 | 112 | 0.1608 | 0.9103 | 0.7802 | 0.8402 | 91 | 0.9375 | 0.9740 | 0.9554 | 77 | 0.2857 | 0.6667 | 0.4 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9519 | 0.9706 | 0.9612 | 102 | 0.9182 | 0.9015 | 0.9098 | 0.9554 |
| 0.0367 | 3.0 | 168 | 0.1311 | 0.8902 | 0.8022 | 0.8439 | 91 | 0.9375 | 0.9740 | 0.9554 | 77 | 0.4286 | 1.0 | 0.6 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9709 | 0.9804 | 0.9756 | 102 | 0.9228 | 0.9161 | 0.9194 | 0.9673 |
| 0.0494 | 4.0 | 224 | 0.1507 | 0.7812 | 0.8242 | 0.8021 | 91 | 0.9241 | 0.9481 | 0.9359 | 77 | 0.4286 | 1.0 | 0.6 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9524 | 0.9804 | 0.9662 | 102 | 0.8746 | 0.9161 | 0.8948 | 0.9637 |
| 0.0699 | 5.0 | 280 | 0.1830 | 0.8276 | 0.7912 | 0.8090 | 91 | 0.8941 | 0.9870 | 0.9383 | 77 | 0.375 | 1.0 | 0.5455 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9352 | 0.9902 | 0.9619 | 102 | 0.875 | 0.9197 | 0.8968 | 0.9560 |
| 0.0352 | 6.0 | 336 | 0.1994 | 0.7857 | 0.8462 | 0.8148 | 91 | 0.9048 | 0.9870 | 0.9441 | 77 | 0.375 | 1.0 | 0.5455 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9266 | 0.9902 | 0.9573 | 102 | 0.8595 | 0.9380 | 0.8970 | 0.9572 |
| 0.0186 | 7.0 | 392 | 0.1657 | 0.8652 | 0.8462 | 0.8556 | 91 | 0.9146 | 0.9740 | 0.9434 | 77 | 0.375 | 1.0 | 0.5455 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9352 | 0.9902 | 0.9619 | 102 | 0.8920 | 0.9343 | 0.9127 | 0.9673 |
| 0.0052 | 8.0 | 448 | 0.1716 | 0.8556 | 0.8462 | 0.8508 | 91 | 0.9259 | 0.9740 | 0.9494 | 77 | 0.375 | 1.0 | 0.5455 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9352 | 0.9902 | 0.9619 | 102 | 0.8920 | 0.9343 | 0.9127 | 0.9673 |
| 0.0094 | 9.0 | 504 | 0.1715 | 0.8444 | 0.8352 | 0.8398 | 91 | 0.9259 | 0.9740 | 0.9494 | 77 | 0.4286 | 1.0 | 0.6 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9352 | 0.9902 | 0.9619 | 102 | 0.8916 | 0.9307 | 0.9107 | 0.9667 |
| 0.0078 | 10.0 | 560 | 0.1723 | 0.8539 | 0.8352 | 0.8444 | 91 | 0.9259 | 0.9740 | 0.9494 | 77 | 0.375 | 1.0 | 0.5455 | 3 | 0.0 | 0.0 | 0.0 | 1 | 0.9352 | 0.9902 | 0.9619 | 102 | 0.8916 | 0.9307 | 0.9107 | 0.9667 |
### Framework versions
- Transformers 4.13.0.dev0
- Pytorch 1.10.0+cu102
- Datasets 1.15.1
- Tokenizers 0.10.3
| {"tags": ["generated_from_trainer"], "model-index": [{"name": "rubert-base-srl-seqlabeling", "results": []}]} | token-classification | Rexhaif/rubert-base-srl-seqlabeling | [
"transformers",
"pytorch",
"safetensors",
"bert",
"token-classification",
"generated_from_trainer",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #safetensors #bert #token-classification #generated_from_trainer #autotrain_compatible #endpoints_compatible #has_space #region-us
| rubert-base-srl-seqlabeling
===========================
This model is a fine-tuned version of ./ruBert-base/ on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1723
* Causator Precision: 0.8539
* Causator Recall: 0.8352
* Causator F1: 0.8444
* Causator Number: 91
* Expiriencer Precision: 0.9259
* Expiriencer Recall: 0.9740
* Expiriencer F1: 0.9494
* Expiriencer Number: 77
* Instrument Precision: 0.375
* Instrument Recall: 1.0
* Instrument F1: 0.5455
* Instrument Number: 3
* Other Precision: 0.0
* Other Recall: 0.0
* Other F1: 0.0
* Other Number: 1
* Predicate Precision: 0.9352
* Predicate Recall: 0.9902
* Predicate F1: 0.9619
* Predicate Number: 102
* Overall Precision: 0.8916
* Overall Recall: 0.9307
* Overall F1: 0.9107
* Overall Accuracy: 0.9667
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-06
* lr\_scheduler\_type: cosine
* lr\_scheduler\_warmup\_ratio: 0.06
* num\_epochs: 10.0
### Training results
### Framework versions
* Transformers 4.13.0.dev0
* Pytorch 1.10.0+cu102
* Datasets 1.15.1
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-06\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.06\n* num\\_epochs: 10.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.13.0.dev0\n* Pytorch 1.10.0+cu102\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #safetensors #bert #token-classification #generated_from_trainer #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-06\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.06\n* num\\_epochs: 10.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.13.0.dev0\n* Pytorch 1.10.0+cu102\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] | [
53,
119,
4,
36
] | [
"passage: TAGS\n#transformers #pytorch #safetensors #bert #token-classification #generated_from_trainer #autotrain_compatible #endpoints_compatible #has_space #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-06\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.06\n* num\\_epochs: 10.0### Training results### Framework versions\n\n\n* Transformers 4.13.0.dev0\n* Pytorch 1.10.0+cu102\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] | [
-0.11606007069349289,
0.0680498331785202,
-0.0016738295089453459,
0.11576107144355774,
0.16699747741222382,
0.02746477723121643,
0.1123208999633789,
0.10236440598964691,
-0.10186848789453506,
0.054728809744119644,
0.13279472291469574,
0.1552075296640396,
-0.0006406697211787105,
0.14692607522010803,
-0.0713902935385704,
-0.2513320744037628,
0.011406137607991695,
0.016980638727545738,
-0.07105841487646103,
0.12145595997571945,
0.0779198408126831,
-0.16956911981105804,
0.0913754478096962,
-0.006221169605851173,
-0.19074924290180206,
-0.012987765483558178,
0.017411746084690094,
-0.06343091279268265,
0.15138444304466248,
-0.004266468342393637,
0.1631215363740921,
0.025923697277903557,
0.1059035062789917,
-0.17218464612960815,
0.008025272749364376,
0.05718543007969856,
0.018764954060316086,
0.08271230757236481,
0.06636471301317215,
-0.007978636771440506,
0.07590383291244507,
-0.11265386641025543,
0.07846447825431824,
-0.0005489648319780827,
-0.13450616598129272,
-0.19098211824893951,
-0.08858823031187057,
0.006769696716219187,
0.05354253202676773,
0.08887689560651779,
-0.00884800124913454,
0.16304048895835876,
-0.10771716386079788,
0.09851408749818802,
0.23825834691524506,
-0.2800716757774353,
-0.07753812521696091,
0.03208225592970848,
0.026543673127889633,
0.06371131539344788,
-0.13063815236091614,
-0.01892327331006527,
0.043270692229270935,
0.043809372931718826,
0.15424607694149017,
-0.035224586725234985,
-0.11881278455257416,
0.00900660827755928,
-0.14494913816452026,
-0.02766602672636509,
0.08318707346916199,
0.024028517305850983,
-0.03295687586069107,
-0.031529828906059265,
-0.06615001708269119,
-0.19208623468875885,
-0.04702712595462799,
-0.027143243700265884,
0.03810443729162216,
-0.058864250779151917,
-0.08672773092985153,
0.026827353984117508,
-0.11163435131311417,
-0.06587465852499008,
-0.04821320250630379,
0.17690373957157135,
0.031991418451070786,
0.0018450901843607426,
-0.019485821947455406,
0.1190226823091507,
0.010302224196493626,
-0.14845168590545654,
0.01462540216743946,
0.018496306613087654,
-0.028772251680493355,
-0.06736524403095245,
-0.0647372230887413,
-0.014332232996821404,
0.01388094387948513,
0.1320922076702118,
-0.08529384434223175,
0.04022420570254326,
0.04128771647810936,
0.013976073823869228,
-0.08713023364543915,
0.193619504570961,
-0.05239002779126167,
-0.030164040625095367,
0.00823003239929676,
0.07154068350791931,
0.003693810198456049,
-0.018420061096549034,
-0.08786386996507645,
-0.0011666315840557218,
0.15890158712863922,
0.01192446332424879,
-0.07676969468593597,
0.07575932890176773,
-0.0391179583966732,
-0.0003754867648240179,
0.036314819008111954,
-0.09060364216566086,
0.053097959607839584,
0.011034670285880566,
-0.06575102359056473,
-0.044354259967803955,
0.014110252261161804,
0.02566276304423809,
0.012148761190474033,
0.16025640070438385,
-0.1048060730099678,
0.02555340714752674,
-0.09115965664386749,
-0.12652568519115448,
0.007812017109245062,
-0.09942354261875153,
0.028919856995344162,
-0.11397287249565125,
-0.13599346578121185,
-0.022917751222848892,
0.030203448608517647,
-0.024771438911557198,
-0.02794281579554081,
-0.050997354090213776,
-0.09368342906236649,
0.021866584196686745,
-0.015866946429014206,
0.14175912737846375,
-0.06818993389606476,
0.10755442082881927,
0.02165387198328972,
0.06482687592506409,
-0.02914978563785553,
0.04421546682715416,
-0.10134083777666092,
0.01684148795902729,
-0.2031548023223877,
0.02766362950205803,
-0.05508238077163696,
0.05723796412348747,
-0.0654120072722435,
-0.11118512600660324,
0.026617038995027542,
-0.011406957171857357,
0.08402292430400848,
0.11135181784629822,
-0.1615537703037262,
-0.054786209017038345,
0.1550189107656479,
-0.0711490660905838,
-0.09631265699863434,
0.11299660056829453,
-0.07347607612609863,
0.008322333917021751,
0.05771355703473091,
0.14315389096736908,
0.07756099849939346,
-0.10767202079296112,
-0.013264259323477745,
-0.03095102682709694,
0.05603967607021332,
-0.02457694709300995,
0.04692512005567551,
0.03714471310377121,
0.01201095525175333,
0.01779792830348015,
-0.015323147177696228,
0.04695713892579079,
-0.10955788940191269,
-0.08265263587236404,
-0.03850187733769417,
-0.10286907106637955,
0.06949372589588165,
0.06399980187416077,
0.07423532754182816,
-0.10792310535907745,
-0.08110933005809784,
0.04746333137154579,
0.07138368487358093,
-0.0514194555580616,
0.019490528851747513,
-0.05073566362261772,
0.07556694000959396,
-0.04978293552994728,
-0.04114680364727974,
-0.19489961862564087,
-0.07528255879878998,
0.014129787683486938,
0.05035698041319847,
-0.0015557081205770373,
0.021486474201083183,
0.08535313606262207,
0.09633345901966095,
-0.048676732927560806,
-0.03743380308151245,
-0.02004135213792324,
0.015292149037122726,
-0.14314933121204376,
-0.19999167323112488,
-0.036698900163173676,
-0.035539910197257996,
0.14388330280780792,
-0.2067551463842392,
0.01693776063621044,
0.002163694938644767,
0.09172581881284714,
0.021306108683347702,
-0.00994794350117445,
-0.043025653809309006,
0.0967816561460495,
-0.04089755192399025,
-0.055414531379938126,
0.051611270755529404,
-0.0036162107717245817,
-0.051370326429605484,
-0.04135114327073097,
-0.14876259863376617,
0.17614340782165527,
0.12866468727588654,
-0.09764730930328369,
-0.1100822165608406,
-0.005640292074531317,
-0.05698237195611,
-0.019552337005734444,
-0.059512704610824585,
0.044619377702474594,
0.1671740561723709,
-0.0022278965916484594,
0.1381435990333557,
-0.059632282704114914,
-0.044600289314985275,
0.028376445174217224,
-0.019326332956552505,
0.0211680606007576,
0.10801337659358978,
0.09834896773099899,
-0.08772831410169601,
0.14274194836616516,
0.15195193886756897,
-0.08986305445432663,
0.11182590574026108,
-0.02591518871486187,
-0.06160654500126839,
-0.01997235417366028,
-0.032933320850133896,
-0.0009573542047291994,
0.11071643978357315,
-0.08087284117937088,
-0.00763747189193964,
0.014197126030921936,
0.02777615748345852,
0.00014860357623547316,
-0.2242458313703537,
-0.03833350911736488,
0.03359019383788109,
-0.016491228714585304,
-0.0060355328023433685,
-0.011655110865831375,
0.034894343465566635,
0.11496766656637192,
0.006679229438304901,
-0.10263551026582718,
0.012373642064630985,
0.011127675883471966,
-0.061055880039930344,
0.21462561190128326,
-0.0883268266916275,
-0.12168953567743301,
-0.0793205127120018,
-0.057805079966783524,
-0.03446197882294655,
0.023234708234667778,
0.055071331560611725,
-0.07979787141084671,
-0.025521177798509598,
-0.0666157677769661,
-0.01486940123140812,
0.019245654344558716,
0.05454830452799797,
-0.022370964288711548,
-0.023127306252717972,
0.04363931342959404,
-0.09274336695671082,
-0.021531255915760994,
-0.05533050000667572,
-0.06552767008543015,
0.07781845331192017,
0.06679143756628036,
0.12164850533008575,
0.14607276022434235,
-0.03455605357885361,
-0.003443461377173662,
-0.036366138607263565,
0.24701674282550812,
-0.07416624575853348,
-0.01933426409959793,
0.10038214176893234,
-0.018960516899824142,
0.052334997802972794,
0.13562192022800446,
0.056083355098962784,
-0.08961620181798935,
0.015617039985954762,
0.027569767087697983,
-0.028569169342517853,
-0.17719517648220062,
-0.048956308513879776,
-0.03926992043852806,
-0.03292834758758545,
0.09859530627727509,
0.00596465403214097,
0.03703639656305313,
0.0649586096405983,
0.037482988089323044,
0.05568990856409073,
-0.04458478093147278,
0.058272574096918106,
0.08028516918420792,
0.05158322677016258,
0.1331801563501358,
-0.016549324616789818,
-0.08725357800722122,
0.025630170479416847,
-0.021661272272467613,
0.20328176021575928,
-0.011503946036100388,
0.06834860891103745,
0.03401912748813629,
0.18897764384746552,
0.0028477911837399006,
0.10519097000360489,
0.021500563248991966,
-0.04901338368654251,
-0.010045447386801243,
-0.04182736575603485,
-0.05071315914392471,
0.022100744768977165,
-0.0376405343413353,
0.04123339429497719,
-0.1413106769323349,
0.006745858583599329,
0.04960421472787857,
0.2613806426525116,
0.05065507814288139,
-0.33648407459259033,
-0.08148661255836487,
-0.00401692371815443,
-0.01922629401087761,
-0.02841889299452305,
0.007224731147289276,
0.12582628428936005,
-0.09667409956455231,
0.026319121941924095,
-0.06756792962551117,
0.06867952644824982,
-0.03281673043966293,
0.04413503408432007,
0.04669148847460747,
0.08757326006889343,
-0.014949237927794456,
0.0518709272146225,
-0.2676275074481964,
0.28793588280677795,
0.011500267311930656,
0.06529270112514496,
-0.0782359167933464,
-0.016143616288900375,
0.04384788125753403,
0.0803203210234642,
0.053973887115716934,
-0.02084566466510296,
-0.07853292673826218,
-0.2464495152235031,
-0.07095689326524734,
0.02831222116947174,
0.11317621916532516,
-0.03174511343240738,
0.12127888202667236,
-0.029944363981485367,
-0.011653262190520763,
0.0754535123705864,
-0.018260912969708443,
-0.06309501826763153,
-0.06926807761192322,
-0.010526396334171295,
0.03683633729815483,
-0.017940575256943703,
-0.06755923479795456,
-0.1251537799835205,
-0.12531313300132751,
0.12036331743001938,
-0.001937397406436503,
-0.011040483601391315,
-0.13575421273708344,
0.07657431811094284,
0.08932085335254669,
-0.0883784368634224,
0.046054381877183914,
0.005354274995625019,
0.08356230705976486,
0.02087762951850891,
-0.04799114167690277,
0.1132580041885376,
-0.07046986371278763,
-0.1924116313457489,
-0.062032174319028854,
0.08380699902772903,
0.0452461838722229,
0.06291518360376358,
-0.005632659886032343,
0.0467432476580143,
-0.03226089105010033,
-0.08348841220140457,
0.01823907345533371,
-0.027564367279410362,
0.09996717423200607,
0.019650982692837715,
-0.04642033949494362,
0.002777121728286147,
-0.0522177629172802,
-0.002398434095084667,
0.1797678917646408,
0.2676086723804474,
-0.10763891786336899,
-0.000696338654961437,
0.022783460095524788,
-0.04438941553235054,
-0.20938193798065186,
0.04959803447127342,
0.06275765597820282,
0.00753759266808629,
0.03615354374051094,
-0.13839416205883026,
0.12100096791982651,
0.08557525277137756,
-0.012509716674685478,
0.1072126254439354,
-0.2515482008457184,
-0.13477841019630432,
0.12292628735303879,
0.15011602640151978,
0.11190781742334366,
-0.13276852667331696,
0.00255755172111094,
0.0027616426814347506,
-0.0930551216006279,
0.10618668049573898,
-0.07910515367984772,
0.11892722547054291,
-0.016804780811071396,
0.09118232876062393,
0.018961386755108833,
-0.0666801854968071,
0.10645875334739685,
0.0056179482489824295,
0.11930479109287262,
-0.05556946247816086,
-0.0691133588552475,
0.017927370965480804,
-0.05928775295615196,
-0.006677570752799511,
-0.057139597833156586,
0.02787451073527336,
-0.05551435798406601,
-0.012638553977012634,
-0.09503326565027237,
0.04408006742596626,
-0.03569047152996063,
-0.06626657396554947,
-0.0305118877440691,
0.039443906396627426,
0.04682362079620361,
-0.020761828869581223,
0.1258329153060913,
0.012692667543888092,
0.16798636317253113,
0.10579287260770798,
0.0724303349852562,
-0.061754800379276276,
-0.01004916150122881,
0.010215893387794495,
-0.01931500807404518,
0.05205769091844559,
-0.09347638487815857,
0.03279583528637886,
0.14465942978858948,
0.017719995230436325,
0.11880335956811905,
0.08828871697187424,
-0.013512060977518559,
0.0044272481463849545,
0.06178557500243187,
-0.18772083520889282,
-0.03981049731373787,
0.009379545226693153,
-0.04536566883325577,
-0.10999851673841476,
0.04146365076303482,
0.1127508357167244,
-0.07224036008119583,
-0.008440222591161728,
-0.028421808034181595,
0.0007033493020571768,
-0.036146242171525955,
0.21292611956596375,
0.06672649830579758,
0.0561223104596138,
-0.09614894539117813,
0.04505180940032005,
0.05697997286915779,
-0.08029960840940475,
0.008162748999893665,
0.06287539750337601,
-0.09063111990690231,
-0.043121252208948135,
0.10320133715867996,
0.181087926030159,
-0.05888034403324127,
-0.026138631626963615,
-0.12878979742527008,
-0.11792177706956863,
0.07287827879190445,
0.21684810519218445,
0.09627504646778107,
0.02289913222193718,
-0.036332786083221436,
0.018929967656731606,
-0.13349345326423645,
0.0975164845585823,
0.03577755019068718,
0.08840616047382355,
-0.14910930395126343,
0.20093530416488647,
-0.01762005127966404,
0.0447765588760376,
-0.028580328449606895,
0.035486090928316116,
-0.11502503603696823,
-0.0006502107135020196,
-0.1122313141822815,
-0.022774914279580116,
-0.02579212561249733,
0.001564531004987657,
-0.00007829421520000324,
-0.07487015426158905,
-0.056021664291620255,
0.002337973564863205,
-0.1116488054394722,
-0.01679166592657566,
0.018518676981329918,
0.03801674768328667,
-0.13114486634731293,
-0.06290905177593231,
0.027292195707559586,
-0.07294716686010361,
0.07117537409067154,
0.04803105443716049,
0.02374146319925785,
0.05235562473535538,
-0.13868913054466248,
-0.011738449335098267,
0.06585703790187836,
-0.0046761599369347095,
0.08690189570188522,
-0.09038712084293365,
-0.008289644494652748,
-0.020654942840337753,
0.07073502242565155,
0.024531833827495575,
0.09354639798402786,
-0.12221042066812515,
0.020061908289790154,
-0.004651325289160013,
-0.08719241619110107,
-0.06102559342980385,
0.011289851740002632,
0.09008467197418213,
-0.021771589294075966,
0.1790483444929123,
-0.0836767926812172,
0.0557316318154335,
-0.2065523862838745,
-0.00858814176172018,
-0.022491659969091415,
-0.12326259911060333,
-0.12352234125137329,
-0.05968961492180824,
0.09076613932847977,
-0.043556567281484604,
0.10606284439563751,
0.02860168181359768,
0.08413078635931015,
0.02283630147576332,
-0.03523700684309006,
0.008797800168395042,
0.04856608435511589,
0.1669464409351349,
0.04793809726834297,
-0.05878875404596329,
0.0694577619433403,
0.06504368036985397,
0.1140659749507904,
0.09016349166631699,
0.24051159620285034,
0.13034766912460327,
0.0016138346400111914,
0.09421570599079132,
0.02308947965502739,
-0.08706150203943253,
-0.15788652002811432,
-0.012718120589852333,
-0.07955878973007202,
0.07854634523391724,
-0.03965149074792862,
0.19247470796108246,
0.047258101403713226,
-0.17109034955501556,
0.027957674115896225,
-0.07383628189563751,
-0.08598794043064117,
-0.11377038061618805,
0.00813971646130085,
-0.09423556178808212,
-0.1198229268193245,
0.00003345185177749954,
-0.11098171025514603,
0.018011318519711494,
0.09968484938144684,
0.010420586913824081,
-0.0033138906583189964,
0.19136103987693787,
0.03138447180390358,
0.05797404795885086,
0.05133625864982605,
0.02637139894068241,
-0.005992485210299492,
-0.08411109447479248,
-0.087120920419693,
-0.030527621507644653,
-0.018564023077487946,
0.0320151150226593,
-0.0757744312286377,
-0.07275338470935822,
0.0467989556491375,
-0.0026760436594486237,
-0.1088508814573288,
0.02132054604589939,
0.013314498588442802,
0.06457879394292831,
0.032580163329839706,
0.0035970606841146946,
0.030034704133868217,
-0.013972392305731773,
0.2134067267179489,
-0.07891276478767395,
-0.06859435886144638,
-0.09548785537481308,
0.28428563475608826,
0.03976432979106903,
0.008995931595563889,
0.0305790938436985,
-0.06905853748321533,
0.004526675213128328,
0.2203732132911682,
0.18420249223709106,
-0.10755620896816254,
0.01638462394475937,
-0.007095617707818747,
-0.013077614828944206,
-0.03070102632045746,
0.10816393047571182,
0.12387067824602127,
0.03374536335468292,
-0.11162418127059937,
-0.044581443071365356,
-0.044152434915304184,
-0.015914350748062134,
-0.03767141327261925,
0.03558732569217682,
0.032502710819244385,
0.019698813557624817,
-0.05318981409072876,
0.039797380566596985,
-0.049597542732954025,
-0.10885283350944519,
0.08166389167308807,
-0.22713598608970642,
-0.16158291697502136,
-0.004885497502982616,
0.09014474600553513,
-0.0077140419743955135,
0.07486574351787567,
-0.023124022409319878,
-0.024742845445871353,
0.06259734183549881,
-0.026287557557225227,
-0.04125715047121048,
-0.07217373698949814,
0.08883293718099594,
-0.08552171289920807,
0.21069364249706268,
-0.052138909697532654,
0.050307951867580414,
0.13297581672668457,
0.06543905287981033,
-0.08170542865991592,
0.0553351491689682,
0.06383080035448074,
-0.112401582300663,
0.023431910201907158,
0.10408468544483185,
-0.04973756894469261,
0.06627627462148666,
0.04409031569957733,
-0.16239042580127716,
0.03938019275665283,
-0.07030294835567474,
-0.0429670549929142,
-0.027295585721731186,
-0.047015298157930374,
-0.03286590054631233,
0.13912267982959747,
0.19534853100776672,
-0.0227751974016428,
0.024497803300619125,
-0.0621202290058136,
0.006992453709244728,
0.04908609762787819,
0.045319464057683945,
-0.08584366738796234,
-0.25175178050994873,
0.032299358397722244,
0.07713661342859268,
-0.019422488287091255,
-0.22180292010307312,
-0.09100081771612167,
0.017766542732715607,
-0.05930081009864807,
-0.09734898805618286,
0.08781746029853821,
0.04011070728302002,
0.04786471277475357,
-0.051802847534418106,
-0.07547947764396667,
-0.07946070283651352,
0.1636458933353424,
-0.15533918142318726,
-0.08767848461866379
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# rubert-base-srl
This model is a fine-tuned version of `./ruBert-base/` on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2429
- F1: 0.9563
## Model description
More information needed
## Intended uses & limitations
More information needed
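Usage is not documented in the card. A hedged sentence-classification sketch, where the repository id is taken from this card and the Russian input sentence is purely illustrative:

```python
from transformers import pipeline

classifier = pipeline("text-classification", model="Rexhaif/rubert-base-srl")

# Illustrative input: "The boy broke the window with a stone."
print(classifier("Мальчик разбил окно камнем."))
```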
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-06
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.06
- num_epochs: 10.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.5816 | 1.0 | 57 | 0.3865 | 0.8371 |
| 0.3685 | 2.0 | 114 | 0.1707 | 0.9325 |
| 0.1057 | 3.0 | 171 | 0.0972 | 0.9563 |
| 0.0964 | 4.0 | 228 | 0.1429 | 0.9775 |
| 0.1789 | 5.0 | 285 | 0.2493 | 0.9457 |
| 0.0016 | 6.0 | 342 | 0.1900 | 0.6349 |
| 0.0013 | 7.0 | 399 | 0.2060 | 0.9563 |
| 0.0008 | 8.0 | 456 | 0.2321 | 0.9563 |
| 0.0006 | 9.0 | 513 | 0.2412 | 0.9563 |
| 0.0006 | 10.0 | 570 | 0.2429 | 0.9563 |
### Framework versions
- Transformers 4.13.0.dev0
- Pytorch 1.10.0+cu102
- Datasets 1.15.1
- Tokenizers 0.10.3
| {"tags": ["generated_from_trainer"], "metrics": ["f1"], "model-index": [{"name": "rubert-base-srl", "results": []}]} | text-classification | Rexhaif/rubert-base-srl | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #safetensors #bert #text-classification #generated_from_trainer #autotrain_compatible #endpoints_compatible #region-us
| rubert-base-srl
===============
This model is a fine-tuned version of ./ruBert-base/ on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2429
* F1: 0.9563
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-06
* lr\_scheduler\_type: cosine
* lr\_scheduler\_warmup\_ratio: 0.06
* num\_epochs: 10.0
### Training results
### Framework versions
* Transformers 4.13.0.dev0
* Pytorch 1.10.0+cu102
* Datasets 1.15.1
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-06\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.06\n* num\\_epochs: 10.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.13.0.dev0\n* Pytorch 1.10.0+cu102\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #safetensors #bert #text-classification #generated_from_trainer #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-06\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.06\n* num\\_epochs: 10.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.13.0.dev0\n* Pytorch 1.10.0+cu102\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] | [
52,
119,
4,
36
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #safetensors #bert #text-classification #generated_from_trainer #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-06\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.06\n* num\\_epochs: 10.0### Training results### Framework versions\n\n\n* Transformers 4.13.0.dev0\n* Pytorch 1.10.0+cu102\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] | [
-0.11290835589170456,
0.06043785437941551,
-0.002011731266975403,
0.11796876788139343,
0.16527695953845978,
0.02461603470146656,
0.14783713221549988,
0.10156147927045822,
-0.09732821583747864,
0.056491076946258545,
0.1314038336277008,
0.15308529138565063,
0.010100658982992172,
0.1387784332036972,
-0.07186699658632278,
-0.24547700583934784,
-0.00012191793939564377,
0.014410273171961308,
-0.058427728712558746,
0.12668316066265106,
0.07129081338644028,
-0.16455285251140594,
0.08763136714696884,
-0.013040773570537567,
-0.1909749060869217,
-0.012965705245733261,
0.02373564802110195,
-0.05843225121498108,
0.1497923731803894,
0.007018211763352156,
0.1603509932756424,
0.03823037073016167,
0.1011519581079483,
-0.1872616857290268,
0.014255760237574577,
0.05611297860741615,
0.01672348566353321,
0.08826291561126709,
0.06941858679056168,
-0.012623090296983719,
0.06748218089342117,
-0.10959406942129135,
0.08074770122766495,
0.001930004102177918,
-0.13266699016094208,
-0.16025100648403168,
-0.08306916803121567,
-0.001979142427444458,
0.04946981370449066,
0.09855891019105911,
-0.015668196603655815,
0.16298452019691467,
-0.08975814282894135,
0.10257259756326675,
0.22233989834785461,
-0.2665451765060425,
-0.08531912416219711,
0.03119621053338051,
0.01796010695397854,
0.09438080340623856,
-0.13011489808559418,
-0.006293464917689562,
0.04783375561237335,
0.038568682968616486,
0.13693439960479736,
-0.03257899358868599,
-0.12133722752332687,
-0.00139285356272012,
-0.13548041880130768,
-0.01714172214269638,
0.08293233066797256,
0.02394535206258297,
-0.03538236767053604,
-0.03620941936969757,
-0.061111558228731155,
-0.173915833234787,
-0.04962319880723953,
-0.02880907990038395,
0.03957420587539673,
-0.049171824008226395,
-0.10146528482437134,
0.007053047884255648,
-0.12024880200624466,
-0.06635342538356781,
-0.0478522852063179,
0.14895258843898773,
0.025112109258770943,
-0.004291722550988197,
-0.024624379351735115,
0.11525709182024002,
0.019882403314113617,
-0.15057210624217987,
0.02667735330760479,
0.026649516075849533,
-0.02283429354429245,
-0.07186384499073029,
-0.0609268844127655,
-0.06795529276132584,
0.019566867500543594,
0.11338522285223007,
-0.08204059302806854,
0.0527474582195282,
0.0041565378196537495,
0.019760310649871826,
-0.09800475090742111,
0.1904958337545395,
-0.041276171803474426,
-0.019163016229867935,
0.008216376416385174,
0.07999975979328156,
0.0115098487585783,
-0.03164268285036087,
-0.10100005567073822,
0.02143937163054943,
0.143266960978508,
0.011294614523649216,
-0.07596135884523392,
0.07536905258893967,
-0.04323430731892586,
-0.006752114277333021,
0.012380014173686504,
-0.08653366565704346,
0.04958578199148178,
0.021701643243432045,
-0.06251177936792374,
-0.03836263716220856,
0.0152854910120368,
0.030938291922211647,
0.005364327225834131,
0.1511319875717163,
-0.10596444457769394,
0.035878218710422516,
-0.08504165709018707,
-0.13058407604694366,
0.012665065936744213,
-0.10271541774272919,
0.01732594147324562,
-0.1070546880364418,
-0.146981343626976,
-0.03133242577314377,
0.028760617598891258,
-0.02868104912340641,
-0.015948450192809105,
-0.07503481209278107,
-0.08174590766429901,
0.027214035391807556,
-0.008355710655450821,
0.1382209211587906,
-0.06420495361089706,
0.10935403406620026,
0.015008176676928997,
0.054102275520563126,
-0.03397560864686966,
0.04025181382894516,
-0.09931141883134842,
0.00944356806576252,
-0.1950521320104599,
0.042729537934064865,
-0.05571387708187103,
0.06213541701436043,
-0.06939973682165146,
-0.09821640700101852,
0.024431923404335976,
-0.009725266136229038,
0.08250937610864639,
0.11523047089576721,
-0.18214310705661774,
-0.0701146349310875,
0.17218591272830963,
-0.07315579056739807,
-0.09408506006002426,
0.12315081804990768,
-0.08062395453453064,
0.02551323175430298,
0.07692054659128189,
0.15954570472240448,
0.07431964576244354,
-0.09639550000429153,
0.003012000350281596,
-0.03503992035984993,
0.07635107636451721,
-0.013012094423174858,
0.04088006541132927,
0.017395954579114914,
0.01283847913146019,
0.018902136012911797,
-0.011306867934763432,
0.043349213898181915,
-0.10768388956785202,
-0.08676106482744217,
-0.029135843738913536,
-0.09930995851755142,
0.06345118582248688,
0.06865409016609192,
0.07667513191699982,
-0.10530515015125275,
-0.07845496386289597,
0.0440945066511631,
0.06599416583776474,
-0.06313716620206833,
0.02321464940905571,
-0.05262051895260811,
0.06538271903991699,
-0.04301437363028526,
-0.03505519777536392,
-0.19084303081035614,
-0.0691070705652237,
0.02102397382259369,
0.05458155274391174,
0.01691305637359619,
0.01685774140059948,
0.0886327251791954,
0.0936073437333107,
-0.05260201171040535,
-0.02248033508658409,
-0.01570076309144497,
0.0043134018778800964,
-0.14487822353839874,
-0.22528250515460968,
-0.0030043928418308496,
-0.035507313907146454,
0.17706947028636932,
-0.22469523549079895,
0.02650628052651882,
0.015289926901459694,
0.07850228995084763,
0.028930163010954857,
-0.018851224333047867,
-0.0354006327688694,
0.09460529685020447,
-0.042927518486976624,
-0.05121057853102684,
0.060786716639995575,
-0.002148721367120743,
-0.07678834348917007,
-0.04598838463425636,
-0.16914600133895874,
0.14422817528247833,
0.12806624174118042,
-0.10083500295877457,
-0.10875245928764343,
-0.024164307862520218,
-0.04943319037556648,
-0.02036837302148342,
-0.054167233407497406,
0.04527846351265907,
0.18235531449317932,
0.005017479415982962,
0.14386409521102905,
-0.06277906149625778,
-0.039466556161642075,
0.030857395380735397,
-0.022776223719120026,
0.030525358393788338,
0.11416932195425034,
0.10040661692619324,
-0.08259156346321106,
0.13741400837898254,
0.1314065009355545,
-0.09294971078634262,
0.13985967636108398,
-0.02846488170325756,
-0.06309640407562256,
-0.013640901073813438,
-0.022592751309275627,
0.011760292574763298,
0.09601430594921112,
-0.08948441594839096,
-0.005795110948383808,
0.0075063519179821014,
0.01697199046611786,
-0.0019887560047209263,
-0.22728471457958221,
-0.037123363465070724,
0.03494376316666603,
-0.024305827915668488,
0.0036168356891721487,
-0.029973726719617844,
0.022445611655712128,
0.11668483912944794,
0.0007991465390659869,
-0.0943664014339447,
0.010341161862015724,
0.0041742222383618355,
-0.07589888572692871,
0.22808237373828888,
-0.08928186446428299,
-0.13466708362102509,
-0.09334558993577957,
-0.07055757939815521,
-0.041643835604190826,
0.024980440735816956,
0.05002470687031746,
-0.09442909806966782,
-0.019479550421237946,
-0.07355226576328278,
-0.01153052318841219,
0.031197426840662956,
0.04864368960261345,
-0.00874151848256588,
-0.021566404029726982,
0.0449008047580719,
-0.09487191587686539,
-0.013661789707839489,
-0.061204712837934494,
-0.07057420909404755,
0.07301017642021179,
0.05581071972846985,
0.12684670090675354,
0.16333161294460297,
-0.023740483447909355,
-0.0036610071547329426,
-0.039678510278463364,
0.22207239270210266,
-0.08178781718015671,
-0.01789923757314682,
0.08250518888235092,
-0.0382409393787384,
0.04954162985086441,
0.12765386700630188,
0.0636371374130249,
-0.09166767448186874,
0.01845358870923519,
0.03565407171845436,
-0.03276865556836128,
-0.20266975462436676,
-0.03905019909143448,
-0.032630205154418945,
-0.010341915301978588,
0.09809675812721252,
0.01025072205811739,
0.06034050136804581,
0.07239483296871185,
0.04301240295171738,
0.05871997028589249,
-0.02226082794368267,
0.05267021805047989,
0.07575364410877228,
0.051760319620370865,
0.13509230315685272,
-0.021677911281585693,
-0.08466321974992752,
0.030787287279963493,
-0.029960501939058304,
0.19918112456798553,
-0.0019558360800147057,
0.11392483115196228,
0.03987015038728714,
0.1571774035692215,
0.0003446746268309653,
0.09443173557519913,
0.019366523250937462,
-0.05332222208380699,
-0.007206397596746683,
-0.05042578652501106,
-0.04836476966738701,
0.018676385283470154,
-0.04924910143017769,
0.047384776175022125,
-0.13443420827388763,
0.023572612553834915,
0.05446271970868111,
0.24944208562374115,
0.047508832067251205,
-0.33447688817977905,
-0.08020426332950592,
-0.002272813580930233,
-0.01402383204549551,
-0.0207331795245409,
0.005322393029928207,
0.1289222240447998,
-0.09306219965219498,
0.03957531228661537,
-0.07554186135530472,
0.07354070991277695,
-0.04979291558265686,
0.04607574641704559,
0.03944958373904228,
0.09416677057743073,
-0.020465252920985222,
0.051408324390649796,
-0.2839391827583313,
0.28571927547454834,
0.0166635662317276,
0.073077492415905,
-0.07674773037433624,
-0.017641114071011543,
0.0413433313369751,
0.07444009929895401,
0.053441133350133896,
-0.028240518644452095,
-0.04011737182736397,
-0.23451559245586395,
-0.0874873474240303,
0.03013692796230316,
0.119858019053936,
-0.03949980065226555,
0.11868423223495483,
-0.022931694984436035,
-0.006271706894040108,
0.07641114294528961,
-0.039737071841955185,
-0.06839726120233536,
-0.08639679849147797,
-0.008571191690862179,
0.032598525285720825,
-0.032105088233947754,
-0.06262772530317307,
-0.13002370297908783,
-0.1264776885509491,
0.12957048416137695,
-0.02073434181511402,
-0.009944923222064972,
-0.12843014299869537,
0.09407982975244522,
0.06934100389480591,
-0.08449207991361618,
0.03158260136842728,
0.003205382265150547,
0.08592573553323746,
0.013741941191256046,
-0.040580712258815765,
0.11311328411102295,
-0.06266016513109207,
-0.19945847988128662,
-0.07066042721271515,
0.09343241900205612,
0.05874083191156387,
0.06574980914592743,
-0.0042350320145487785,
0.03470518812537193,
-0.017197825014591217,
-0.08618739992380142,
0.02446415089070797,
-0.017130594700574875,
0.07882583886384964,
0.013030018657445908,
-0.04121502488851547,
-0.0003655592736322433,
-0.07129763066768646,
-0.011708271689713001,
0.1866297870874405,
0.2698093354701996,
-0.09292075037956238,
0.010882350616157055,
0.04788700118660927,
-0.06038042902946472,
-0.20754916965961456,
0.06646974384784698,
0.06334438920021057,
0.00002469926948833745,
0.03035012260079384,
-0.15656371414661407,
0.1187271922826767,
0.08644041419029236,
-0.006579020991921425,
0.11033324897289276,
-0.26571428775787354,
-0.14364629983901978,
0.12182105332612991,
0.15772058069705963,
0.10385361313819885,
-0.14519888162612915,
-0.014337706379592419,
-0.005772336386144161,
-0.07984098792076111,
0.10958225280046463,
-0.109589122235775,
0.11838305741548538,
-0.010320626199245453,
0.08363350480794907,
0.0195161160081625,
-0.06583363562822342,
0.10721364617347717,
-0.0032517416402697563,
0.1139565110206604,
-0.06233293190598488,
-0.056898340582847595,
0.031187687069177628,
-0.049840398132801056,
-0.014015678316354752,
-0.06998118758201599,
0.021056603640317917,
-0.0441816970705986,
-0.021525530144572258,
-0.08529480546712875,
0.03526465222239494,
-0.04071392863988876,
-0.06736142933368683,
-0.025677163153886795,
0.02837212383747101,
0.042246729135513306,
-0.017441773787140846,
0.1269795000553131,
-0.0012947681825608015,
0.17779290676116943,
0.1116212010383606,
0.09767679125070572,
-0.052386634051799774,
0.00013814959675073624,
0.004000517074018717,
-0.02660677768290043,
0.04514522850513458,
-0.09700632095336914,
0.026988716796040535,
0.15063080191612244,
0.016655754297971725,
0.13259059190750122,
0.08295371383428574,
-0.022410539910197258,
0.012490388005971909,
0.06201564893126488,
-0.18574295938014984,
-0.05956849083304405,
0.006132874637842178,
-0.04467112571001053,
-0.11718571931123734,
0.033980824053287506,
0.12607090175151825,
-0.06312669813632965,
-0.005805507302284241,
-0.018292207270860672,
0.0014555774396285415,
-0.02374640479683876,
0.21906404197216034,
0.061408672481775284,
0.052431609481573105,
-0.09878963232040405,
0.05515865236520767,
0.05775322765111923,
-0.08736339211463928,
0.01664377562701702,
0.07628344744443893,
-0.09716254472732544,
-0.03940100595355034,
0.10840993374586105,
0.21297433972358704,
-0.055382754653692245,
-0.029137974604964256,
-0.14180371165275574,
-0.11973567306995392,
0.0667438879609108,
0.22016535699367523,
0.08946888148784637,
0.019518934190273285,
-0.05016003176569939,
0.022508734837174416,
-0.13633939623832703,
0.10632789880037308,
0.03611554577946663,
0.08258553594350815,
-0.13455642759799957,
0.20243588089942932,
-0.007845431566238403,
0.02542760595679283,
-0.026682155206799507,
0.03133835271000862,
-0.11077632009983063,
0.0026060291565954685,
-0.12860074639320374,
-0.00900049414485693,
-0.014743338339030743,
-0.0025562841910868883,
-0.000552027951925993,
-0.0669904351234436,
-0.05045340210199356,
-0.004965153057128191,
-0.10949844121932983,
-0.014315648004412651,
0.02113800309598446,
0.03679501265287399,
-0.1334936022758484,
-0.055168088525533676,
0.02260904759168625,
-0.06672436743974686,
0.07507101446390152,
0.04466487467288971,
0.01892845891416073,
0.06275074183940887,
-0.17426389455795288,
0.012261617928743362,
0.0674578920006752,
-0.0038535732310265303,
0.0700729638338089,
-0.07595673948526382,
-0.013412609696388245,
-0.02130233310163021,
0.06992052495479584,
0.03140011429786682,
0.1001506969332695,
-0.11758895963430405,
0.03237459808588028,
-0.0011446548160165548,
-0.08949720114469528,
-0.06579912453889847,
0.03280877321958542,
0.07435586303472519,
-0.013027292676270008,
0.1766088902950287,
-0.09787915647029877,
0.057439543306827545,
-0.21219117939472198,
-0.003661394352093339,
-0.010352726094424725,
-0.11983006447553635,
-0.12993036210536957,
-0.0681842491030693,
0.09061473608016968,
-0.05689379200339317,
0.09206279367208481,
0.020140958949923515,
0.07909641414880753,
0.01982494443655014,
-0.031587984412908554,
0.00773576321080327,
0.04889674484729767,
0.1697605848312378,
0.05014199763536453,
-0.057671185582876205,
0.05932623893022537,
0.05764595791697502,
0.12310940027236938,
0.10037268698215485,
0.24937376379966736,
0.12085458636283875,
-0.005526158958673477,
0.09301235526800156,
0.03097428008913994,
-0.0687796026468277,
-0.1412368267774582,
0.015022342093288898,
-0.09003116190433502,
0.08321337401866913,
-0.036391731351614,
0.20005859434604645,
0.05386555939912796,
-0.15874454379081726,
0.01934167742729187,
-0.06625595688819885,
-0.09792467206716537,
-0.12135221809148788,
-0.002590500982478261,
-0.10051581263542175,
-0.11412230134010315,
-0.00011194270337000489,
-0.11714468896389008,
0.015552252531051636,
0.08526812493801117,
0.019120030105113983,
-0.007312646135687828,
0.1997033953666687,
0.03601985052227974,
0.050062425434589386,
0.06403207033872604,
0.02076500840485096,
-0.015437363646924496,
-0.09336818754673004,
-0.09425429999828339,
-0.02186906896531582,
-0.011634619906544685,
0.03232269734144211,
-0.0706726461648941,
-0.05055667832493782,
0.04519442096352577,
-0.007254885509610176,
-0.11709575355052948,
0.029241887852549553,
0.0155279990285635,
0.06708037853240967,
0.033383216708898544,
0.003737445455044508,
0.02336433157324791,
-0.007698650471866131,
0.2083374559879303,
-0.06806642562150955,
-0.06806332617998123,
-0.09664776176214218,
0.2725503742694855,
0.024647915735840797,
0.010992275550961494,
0.02579233981668949,
-0.07358485460281372,
0.01888756826519966,
0.23065564036369324,
0.19546331465244293,
-0.1108940914273262,
0.012082653120160103,
-0.002382773207500577,
-0.006875287741422653,
-0.02678603120148182,
0.11338077485561371,
0.10708826780319214,
0.024371137842535973,
-0.11170177906751633,
-0.040845826268196106,
-0.031027469784021378,
-0.0170981977134943,
-0.02767135202884674,
0.04037128761410713,
0.03973216563463211,
0.029070422053337097,
-0.05826129391789436,
0.0524015799164772,
-0.058733437210321426,
-0.10981101542711258,
0.06443912535905838,
-0.2320486456155777,
-0.16156893968582153,
-0.01130975503474474,
0.09141403436660767,
-0.012104862369596958,
0.06655731797218323,
-0.021227633580565453,
-0.018130986019968987,
0.04887484014034271,
-0.029762452468276024,
-0.04764970764517784,
-0.06245012953877449,
0.08212670683860779,
-0.10854872316122055,
0.20495258271694183,
-0.05588946118950844,
0.048052649945020676,
0.13198819756507874,
0.06363091617822647,
-0.06033514440059662,
0.058868780732154846,
0.06184376776218414,
-0.09473817050457001,
0.02687266655266285,
0.10953611135482788,
-0.05011112987995148,
0.06705646961927414,
0.05888661742210388,
-0.14849714934825897,
0.04405469074845314,
-0.08420515060424805,
-0.05924754962325096,
-0.029410894960165024,
-0.04222628101706505,
-0.041582752019166946,
0.13174299895763397,
0.19624024629592896,
-0.017590608447790146,
0.030315034091472626,
-0.05818431079387665,
0.002819183748215437,
0.04764680564403534,
0.05440307781100273,
-0.08071419596672058,
-0.2610069811344147,
0.024243243038654327,
0.07952983677387238,
-0.011011251248419285,
-0.26408883929252625,
-0.08770795911550522,
0.0094225462526083,
-0.05399906262755394,
-0.10291098803281784,
0.09189402312040329,
0.05561380088329315,
0.05229540541768074,
-0.05128804221749306,
-0.08214364945888519,
-0.07362645864486694,
0.16501224040985107,
-0.15121546387672424,
-0.08636777102947235
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# finetuned-bert-mrpc
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4382
- Accuracy: 0.8676
- F1: 0.9085
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.5454 | 1.0 | 230 | 0.4396 | 0.8309 | 0.8871 |
| 0.3387 | 2.0 | 460 | 0.3783 | 0.8529 | 0.8976 |
| 0.1956 | 3.0 | 690 | 0.4382 | 0.8676 | 0.9085 |
### Framework versions
- Transformers 4.10.0
- Pytorch 1.9.0+cu102
- Datasets 1.11.0
- Tokenizers 0.10.3
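As a usage sketch, the checkpoint can be loaded under the hub id from this row's `id` field; note that the 0/1 label order below follows the usual MRPC convention and is an assumption, not something this card states:

```py
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Model id taken from this row's `id` field; the label order
# (0 = not equivalent, 1 = equivalent) is the usual MRPC convention
# and may differ for this particular checkpoint.
tokenizer = AutoTokenizer.from_pretrained("Riad/finetuned-bert-mrpc")
model = AutoModelForSequenceClassification.from_pretrained("Riad/finetuned-bert-mrpc")

inputs = tokenizer(
    "The company reported strong quarterly earnings.",
    "Quarterly earnings at the company were strong.",
    return_tensors="pt",
)
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print(probs)  # probabilities over the two MRPC classes
```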
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy", "f1"], "model-index": [{"name": "finetuned-bert-mrpc", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "mrpc"}, "metrics": [{"type": "accuracy", "value": 0.8676470588235294, "name": "Accuracy"}, {"type": "f1", "value": 0.9084745762711864, "name": "F1"}]}]}]} | text-classification | Riad/finetuned-bert-mrpc | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
| finetuned-bert-mrpc
===================
This model is a fine-tuned version of bert-base-cased on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.4382
* Accuracy: 0.8676
* F1: 0.9085
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.10.0
* Pytorch 1.9.0+cu102
* Datasets 1.11.0
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.10.0\n* Pytorch 1.9.0+cu102\n* Datasets 1.11.0\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.10.0\n* Pytorch 1.9.0+cu102\n* Datasets 1.11.0\n* Tokenizers 0.10.3"
] | [
65,
98,
4,
34
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.10.0\n* Pytorch 1.9.0+cu102\n* Datasets 1.11.0\n* Tokenizers 0.10.3"
] | [
-0.1099623516201973,
0.09195390343666077,
-0.0018693433376029134,
0.11931179463863373,
0.1645607054233551,
0.039567116647958755,
0.12058848142623901,
0.12693901360034943,
-0.0839335173368454,
0.02246769331395626,
0.12278467416763306,
0.1555577665567398,
0.021530302241444588,
0.11377258598804474,
-0.0461362823843956,
-0.2579251527786255,
-0.010376873426139355,
0.046933554112911224,
-0.06131478399038315,
0.1316031813621521,
0.08891723304986954,
-0.12555338442325592,
0.0954236388206482,
0.012580709531903267,
-0.19738274812698364,
0.00023118406534194946,
0.008881320245563984,
-0.056341368705034256,
0.14700061082839966,
0.028523633256554604,
0.1228896751999855,
-0.0008527235477231443,
0.09087475389242172,
-0.1900576651096344,
0.010642610490322113,
0.050492607057094574,
0.004062595311552286,
0.09623035788536072,
0.05255168303847313,
0.006744243670254946,
0.12041251361370087,
-0.07938406616449356,
0.05738493427634239,
0.03003811649978161,
-0.11874392628669739,
-0.22075767815113068,
-0.07590339332818985,
0.038750141859054565,
0.0715445950627327,
0.10862989723682404,
-0.004281405825167894,
0.12853725254535675,
-0.08733826130628586,
0.08625119179487228,
0.2258506715297699,
-0.2940434515476227,
-0.06456496566534042,
0.04345645010471344,
0.014303714968264103,
0.0457196980714798,
-0.10848381370306015,
-0.030748484656214714,
0.05218559876084328,
0.04919349029660225,
0.12552672624588013,
-0.029897522181272507,
-0.11965041607618332,
0.012435279786586761,
-0.13644148409366608,
-0.024844225496053696,
0.16193819046020508,
0.04004897549748421,
-0.03182058781385422,
-0.045380957424640656,
-0.05408364161849022,
-0.14556756615638733,
-0.0355372428894043,
-0.006946444045752287,
0.04854659363627434,
-0.026648908853530884,
-0.056933075189590454,
-0.0021340888924896717,
-0.1111871600151062,
-0.07247120887041092,
-0.07884345948696136,
0.12376362830400467,
0.032614558935165405,
0.016752617433667183,
-0.031313177198171616,
0.11236543208360672,
-0.007363962475210428,
-0.12801764905452728,
0.020018748939037323,
0.025303814560174942,
0.00595576548948884,
-0.0442751981317997,
-0.05351336672902107,
-0.05115237087011337,
0.015464754775166512,
0.1297973245382309,
-0.04860534518957138,
0.04337131977081299,
0.053719595074653625,
0.044920414686203,
-0.09219382703304291,
0.19801422953605652,
-0.03977487236261368,
-0.02743595838546753,
0.0045997328124940395,
0.0445091538131237,
0.01833212375640869,
-0.011501545086503029,
-0.11994177848100662,
0.004131000954657793,
0.08445000648498535,
0.007640965282917023,
-0.06353721022605896,
0.07352979481220245,
-0.054460614919662476,
-0.023575730621814728,
-0.0007107516285032034,
-0.08879241347312927,
0.026902738958597183,
0.003628041595220566,
-0.07519575953483582,
-0.021138694137334824,
0.03231457620859146,
0.01637820340692997,
-0.015897583216428757,
0.11981043964624405,
-0.0933658629655838,
0.028882784768939018,
-0.0936700701713562,
-0.10996803641319275,
0.01938202977180481,
-0.09813620895147324,
0.02405768632888794,
-0.09450183063745499,
-0.172689750790596,
-0.013659138232469559,
0.05805948004126549,
-0.027751175686717033,
-0.061360158026218414,
-0.04311712458729744,
-0.06542916595935822,
0.014649590477347374,
-0.011267924681305885,
0.13145758211612701,
-0.06779849529266357,
0.0911710113286972,
0.02803470380604267,
0.06101854145526886,
-0.047328803688287735,
0.05671493709087372,
-0.10131806880235672,
0.008760878816246986,
-0.1555294692516327,
0.02687046490609646,
-0.05298766866326332,
0.07167359441518784,
-0.08294843137264252,
-0.0961429700255394,
0.013445778749883175,
0.00008888452430255711,
0.06205778941512108,
0.0964447632431984,
-0.1789816915988922,
-0.08171653002500534,
0.15015845000743866,
-0.07113626599311829,
-0.13219597935676575,
0.11642511188983917,
-0.05363263934850693,
0.051268428564071655,
0.06273319572210312,
0.16581177711486816,
0.0704042837023735,
-0.0863380879163742,
-0.002841040724888444,
0.029924826696515083,
0.05848310515284538,
-0.07959580421447754,
0.07655280828475952,
-0.002416650764644146,
0.009530282579362392,
0.03448062390089035,
-0.029296299442648888,
0.061759427189826965,
-0.09162022173404694,
-0.09796805679798126,
-0.04104851931333542,
-0.08940441906452179,
0.038125183433294296,
0.07543037831783295,
0.0670669749379158,
-0.09455466270446777,
-0.08476106077432632,
0.0526500903069973,
0.08446578681468964,
-0.0477440282702446,
0.023541899397969246,
-0.05232362449169159,
0.0710204467177391,
-0.0345148928463459,
-0.025644570589065552,
-0.17899559438228607,
-0.03943818807601929,
0.0038351244293153286,
-0.0024775280617177486,
0.018384048715233803,
0.025355201214551926,
0.06673914194107056,
0.06136220693588257,
-0.053867120295763016,
-0.016759265214204788,
-0.03234461694955826,
0.00241402187384665,
-0.13522079586982727,
-0.20735763013362885,
-0.032956596463918686,
-0.021699104458093643,
0.1550862193107605,
-0.2036026567220688,
0.042414210736751556,
-0.012809577398002148,
0.07440167665481567,
0.01108099427074194,
-0.00493267085403204,
-0.044962115585803986,
0.07211864739656448,
-0.03956294432282448,
-0.05213974788784981,
0.07642080634832382,
0.01677800342440605,
-0.09315383434295654,
-0.04904377833008766,
-0.09417581558227539,
0.16994906961917877,
0.13646157085895538,
-0.11157583445310593,
-0.07463518530130386,
-0.011962619610130787,
-0.06671375036239624,
-0.03353984281420708,
-0.054317623376846313,
0.026423916220664978,
0.18283648788928986,
-0.0038769724778831005,
0.14897902309894562,
-0.06454623490571976,
-0.04546095430850983,
0.019868893548846245,
-0.03473510965704918,
0.02349720150232315,
0.1222613975405693,
0.1417759507894516,
-0.05684487149119377,
0.1534135937690735,
0.1530696004629135,
-0.0956166535615921,
0.13812781870365143,
-0.04124201089143753,
-0.0715966448187828,
-0.014599706046283245,
-0.04102203622460365,
-0.007958800531923771,
0.11353200674057007,
-0.1530165672302246,
-0.0011265596840530634,
0.0325411818921566,
0.016526564955711365,
0.024751005694270134,
-0.2234777808189392,
-0.03950520232319832,
0.03277337923645973,
-0.03867390751838684,
-0.01214287057518959,
-0.01678779534995556,
0.0049084387719631195,
0.10785243660211563,
0.007321001496165991,
-0.08247160166501999,
0.03628230839967728,
0.0041761090978980064,
-0.08361790329217911,
0.2209569811820984,
-0.07684725522994995,
-0.15639092028141022,
-0.13073627650737762,
-0.07280213385820389,
-0.04865991324186325,
0.00046428185305558145,
0.06456613540649414,
-0.0980115681886673,
-0.034365907311439514,
-0.0662960633635521,
0.031510043889284134,
0.004048082046210766,
0.035412006080150604,
-0.0004081855295225978,
0.0032473644241690636,
0.06937355548143387,
-0.11140744388103485,
-0.015090374276041985,
-0.0614466592669487,
-0.05280723422765732,
0.035926807671785355,
0.03830181807279587,
0.11440595984458923,
0.15418629348278046,
-0.012919986620545387,
0.011597721837460995,
-0.028433294966816902,
0.2380431592464447,
-0.06049168482422829,
-0.025482194498181343,
0.13920973241329193,
-0.010565184988081455,
0.046220578253269196,
0.11752784997224808,
0.07958466559648514,
-0.07750653475522995,
0.0024581346660852432,
0.0421425923705101,
-0.03387393802404404,
-0.230340838432312,
-0.05417139455676079,
-0.053310051560401917,
0.004333828575909138,
0.09008999168872833,
0.028058815747499466,
0.03372926265001297,
0.0710885301232338,
0.03829089552164078,
0.07587788254022598,
-0.04826962947845459,
0.05462060123682022,
0.11823151260614395,
0.03680289164185524,
0.1283741295337677,
-0.048904530704021454,
-0.06043952703475952,
0.04745301604270935,
-0.01603775843977928,
0.217815101146698,
0.005740640684962273,
0.12711142003536224,
0.05878666415810585,
0.16484855115413666,
-0.003804995445534587,
0.07963360100984573,
-0.007836747914552689,
-0.0449637696146965,
-0.014413370750844479,
-0.04168691858649254,
-0.03399999812245369,
0.02330620586872101,
-0.06749162822961807,
0.07140743732452393,
-0.12498118728399277,
0.003162767505273223,
0.05865389108657837,
0.24034857749938965,
0.044460318982601166,
-0.32423844933509827,
-0.09780238568782806,
0.0034092124551534653,
-0.023074842989444733,
-0.024469945579767227,
0.023912396281957626,
0.08627420663833618,
-0.09583806991577148,
0.029090501368045807,
-0.06810063868761063,
0.09872200340032578,
-0.04701734334230423,
0.04968000575900078,
0.09362491965293884,
0.09672902524471283,
0.005608524661511183,
0.09064947068691254,
-0.28504079580307007,
0.2758583724498749,
0.005798816215246916,
0.06498413532972336,
-0.0790952667593956,
0.007737691048532724,
0.04314043000340462,
0.06746888160705566,
0.07431235164403915,
-0.01443100068718195,
-0.02408650331199169,
-0.19957493245601654,
-0.06775258481502533,
0.033484674990177155,
0.06493387371301651,
-0.037662386894226074,
0.08428925275802612,
-0.03283782675862312,
0.00923174899071455,
0.07675690948963165,
0.005254363175481558,
-0.05633237585425377,
-0.10193370282649994,
-0.007589491084218025,
0.02856815792620182,
-0.059985850006341934,
-0.06233203411102295,
-0.12262143194675446,
-0.12420444935560226,
0.16238811612129211,
-0.03211174160242081,
-0.034095779061317444,
-0.11456093937158585,
0.08736875653266907,
0.06646371632814407,
-0.09151667356491089,
0.038579754531383514,
-0.0009776698425412178,
0.07862234115600586,
0.03064943104982376,
-0.07737762480974197,
0.10692007094621658,
-0.07660847157239914,
-0.15482941269874573,
-0.06700288504362106,
0.10229738056659698,
0.030043262988328934,
0.06827183067798615,
-0.015740737318992615,
0.012577613815665245,
-0.04813998565077782,
-0.09196532517671585,
0.01867610588669777,
-0.000158378723426722,
0.07518614083528519,
0.014693928882479668,
-0.07275495678186417,
0.012715239077806473,
-0.05452028289437294,
-0.036271605640649796,
0.1992189735174179,
0.22129712998867035,
-0.10392866283655167,
0.020403077825903893,
0.030177000910043716,
-0.07134175300598145,
-0.20713749527931213,
0.035814836621284485,
0.04955093935132027,
0.009982099756598473,
0.040734801441431046,
-0.17583660781383514,
0.1572076976299286,
0.11083956807851791,
-0.01638602837920189,
0.09949801862239838,
-0.30606207251548767,
-0.12594875693321228,
0.1444849967956543,
0.13399344682693481,
0.11275794357061386,
-0.1411253809928894,
-0.022531146183609962,
-0.026303382590413094,
-0.13885121047496796,
0.1167428120970726,
-0.11135119199752808,
0.11831853538751602,
-0.03597181290388107,
0.07284466922283173,
0.001965837087482214,
-0.05945703759789467,
0.12591418623924255,
0.02517959475517273,
0.08883647620677948,
-0.05948368459939957,
-0.03723510354757309,
0.034841686487197876,
-0.041951168328523636,
0.03111938387155533,
-0.09934544563293457,
0.02762921340763569,
-0.10230135917663574,
-0.02494012750685215,
-0.06929002702236176,
0.046927377581596375,
-0.044196344912052155,
-0.06773862987756729,
-0.033967792987823486,
0.02618524618446827,
0.03758494183421135,
-0.016498005017638206,
0.12982161343097687,
0.02229183353483677,
0.15447451174259186,
0.09697496145963669,
0.07568047195672989,
-0.07738681882619858,
-0.08341407030820847,
-0.01868833415210247,
-0.017553025856614113,
0.05513828247785568,
-0.13819174468517303,
0.02296086959540844,
0.14816772937774658,
0.020690150558948517,
0.14298783242702484,
0.08647476881742477,
-0.022461358457803726,
-0.003245946252718568,
0.0631113052368164,
-0.16225284337997437,
-0.07771051675081253,
-0.014235200360417366,
-0.06357123702764511,
-0.13228052854537964,
0.047727443277835846,
0.09105370193719864,
-0.06632637977600098,
-0.005292607471346855,
-0.005063291639089584,
0.007903960533440113,
-0.05610872060060501,
0.19444550573825836,
0.06483189016580582,
0.04568549245595932,
-0.10198909044265747,
0.06955058127641678,
0.04491600766777992,
-0.06948738545179367,
-0.001052706385962665,
0.07558467239141464,
-0.08452301472425461,
-0.05179308354854584,
0.08425231277942657,
0.1990080028772354,
-0.056815266609191895,
-0.050220370292663574,
-0.14408272504806519,
-0.13065297901630402,
0.08038175106048584,
0.14453090727329254,
0.1217687651515007,
0.013927517458796501,
-0.0615113265812397,
0.004344762768596411,
-0.10938889533281326,
0.09709422290325165,
0.043722543865442276,
0.06413494050502777,
-0.1443548947572708,
0.1488465517759323,
0.01850929483771324,
0.047353170812129974,
-0.020772602409124374,
0.028503717854619026,
-0.1104668378829956,
0.005587735213339329,
-0.11153461039066315,
-0.017201483249664307,
-0.029540086165070534,
0.010224418714642525,
-0.003265869105234742,
-0.055577781051397324,
-0.06107985973358154,
0.009723146446049213,
-0.10743756592273712,
-0.02378648892045021,
0.030740225687623024,
0.06924804300069809,
-0.11584126204252243,
-0.033299341797828674,
0.027310973033308983,
-0.0605965331196785,
0.07088369876146317,
0.04514489695429802,
0.025516264140605927,
0.06001252308487892,
-0.142464280128479,
0.013875860720872879,
0.06752743571996689,
0.0240124873816967,
0.07082417607307434,
-0.08965319395065308,
-0.008825946599245071,
-0.006717182230204344,
0.047813545912504196,
0.022127864882349968,
0.07726994156837463,
-0.13850264251232147,
-0.0015352239133790135,
-0.021091103553771973,
-0.08764521777629852,
-0.06305250525474548,
0.026598626747727394,
0.10040495544672012,
0.01717621646821499,
0.1992562711238861,
-0.07713988423347473,
0.04402167722582817,
-0.22220444679260254,
0.01096754427999258,
-0.014939376153051853,
-0.10308145731687546,
-0.11562187969684601,
-0.07001759111881256,
0.06094980612397194,
-0.05580552667379379,
0.15077826380729675,
0.04487583041191101,
0.037140797823667526,
0.032337892800569534,
-0.0008800430805422366,
0.02109427936375141,
0.014327052980661392,
0.20203235745429993,
0.03676402568817139,
-0.0321761779487133,
0.06142092123627663,
0.04609519988298416,
0.10220816731452942,
0.11421311646699905,
0.20854772627353668,
0.1380224972963333,
0.0012830146588385105,
0.0936136469244957,
0.04446928948163986,
-0.06670842319726944,
-0.1510421633720398,
0.04179738089442253,
-0.04384172335267067,
0.10169597715139389,
-0.02144763059914112,
0.21725068986415863,
0.0626644492149353,
-0.16941006481647491,
0.04186910390853882,
-0.060985490679740906,
-0.08642333000898361,
-0.12199258804321289,
-0.03710310906171799,
-0.08047996461391449,
-0.13166718184947968,
-0.0035971717443317175,
-0.11334836483001709,
-0.002216693479567766,
0.13020449876785278,
0.0032487332355231047,
-0.021915534511208534,
0.1567469835281372,
0.006570795085281134,
0.027038967236876488,
0.049924712628126144,
0.012931476347148418,
-0.03359391912817955,
-0.1263289749622345,
-0.05622165650129318,
-0.01836809515953064,
-0.00907990150153637,
0.028636986389756203,
-0.06656309217214584,
-0.052758678793907166,
0.04021812602877617,
-0.017929403111338615,
-0.09870294481515884,
0.009662381373345852,
0.0070979176089167595,
0.06015006825327873,
0.04568884149193764,
0.006352545693516731,
0.024934442713856697,
-0.007234856020659208,
0.20733898878097534,
-0.07925210893154144,
-0.06796388328075409,
-0.10566052049398422,
0.24671468138694763,
0.0312030091881752,
-0.020651185885071754,
0.03294394910335541,
-0.06915563344955444,
-0.0013645238941535354,
0.25363898277282715,
0.21973951160907745,
-0.08616185933351517,
-0.007584455423057079,
0.016769150272011757,
-0.009215157479047775,
-0.024203168228268623,
0.10354705154895782,
0.13794595003128052,
0.057539429515600204,
-0.09745544195175171,
-0.03961995244026184,
-0.052172645926475525,
-0.017795026302337646,
-0.03392862528562546,
0.07140541821718216,
0.049507323652505875,
0.008347791619598866,
-0.039570409804582596,
0.051628053188323975,
-0.061672620475292206,
-0.08957717567682266,
0.06320344656705856,
-0.21420541405677795,
-0.1690712720155716,
-0.015823597088456154,
0.10640480369329453,
0.006336505059152842,
0.0652453750371933,
-0.02702539972960949,
-0.0016882363706827164,
0.08493463695049286,
-0.020304862409830093,
-0.10683203488588333,
-0.08413222432136536,
0.09582217037677765,
-0.10170868784189224,
0.2269982546567917,
-0.04598504304885864,
0.059263285249471664,
0.12776495516300201,
0.06713955849409103,
-0.07649590820074081,
0.060444727540016174,
0.04037080332636833,
-0.0650382936000824,
0.025807388126850128,
0.0670456811785698,
-0.04124290496110916,
0.06111852452158928,
0.043037138879299164,
-0.1431955248117447,
0.02010318636894226,
-0.05865422636270523,
-0.07036349922418594,
-0.041276197880506516,
-0.027540026232600212,
-0.06204864755272865,
0.13372448086738586,
0.21600843966007233,
-0.026761718094348907,
-0.012145334854722023,
-0.07039070129394531,
0.013311746530234814,
0.05735402926802635,
0.02025877684354782,
-0.06078163906931877,
-0.21058282256126404,
0.020447945222258568,
0.037983380258083344,
-0.02186141535639763,
-0.25049635767936707,
-0.09806964546442032,
0.003135986626148224,
-0.07040722668170929,
-0.0975857824087143,
0.07102695107460022,
0.08755866438150406,
0.050581641495227814,
-0.05825087055563927,
-0.05372558906674385,
-0.07181690633296967,
0.14553287625312805,
-0.14695575833320618,
-0.09914746135473251
] |
null | null | transformers | [Github](https://github.com/rifkybujana/IndoBERT-QA)
This project is part of my research with my friend Muhammad Fajrin Buyang Daffa, entitled "Teman Belajar : Asisten Digital Pelajar SMA Negeri 28 Jakarta dalam Membaca" (Study Buddy: A Digital Reading Assistant for Students of SMA Negeri 28 Jakarta), for KOPSI (Kompetisi Penelitian Siswa Indonesia / Indonesian Student Research Competition).
## indoBERT Base-Uncased fine-tuned on Translated SQuAD v2.0
[IndoBERT](https://huggingface.co/indolem/indobert-base-uncased) trained by [IndoLEM](https://indolem.github.io/) and fine-tuned on [Translated SQuAD 2.0](https://github.com/Wikidepia/indonesian_datasets/tree/master/question-answering/squad) for the **Q&A** downstream task.
**Model Size** (after training): 420 MB
## Details of indoBERT (from their documentation)
[IndoBERT](https://huggingface.co/indolem/indobert-base-uncased) is the Indonesian version of the BERT model. We trained the model on over 220M words, aggregated from three main sources:
- Indonesian Wikipedia (74M words)
- news articles from Kompas, Tempo (Tala et al., 2003), and Liputan6 (55M words in total)
- an Indonesian Web Corpus (Medved and Suchomel, 2017) (90M words).
We trained the model for 2.4M steps (180 epochs) with the final perplexity over the development set being 3.97 (similar to English BERT-base).
This IndoBERT was used to examine IndoLEM - an Indonesian benchmark that comprises seven tasks for the Indonesian language, spanning morpho-syntax, semantics, and discourse.[[1]](#1)
## Details of the downstream task (Q&A) - Dataset
SQuAD2.0 combines the 100,000 questions in SQuAD1.1 with over 50,000 unanswerable questions written adversarially by crowdworkers to look similar to answerable ones. To do well on SQuAD2.0, systems must not only answer questions when possible, but also determine when no answer is supported by the paragraph and abstain from answering.
| Dataset | Split | # samples |
| -------- | ----- | --------- |
| SQuAD2.0 | train | 130k |
| SQuAD2.0 | eval | 12.3k |
## Model Training
The model was trained on a Tesla T4 GPU with 12 GB of RAM.
## Results:
| Metric | # Value |
| ------ | --------- |
| **EM** | **51.61** |
| **F1** | **69.09** |
## Simple Usage
```py
from transformers import pipeline
qa_pipeline = pipeline(
"question-answering",
model="Rifky/Indobert-QA",
tokenizer="Rifky/Indobert-QA"
)
qa_pipeline({
'context': """Pangeran Harya Dipanegara (atau biasa dikenal dengan nama Pangeran Diponegoro, lahir di Ngayogyakarta Hadiningrat, 11 November 1785 – meninggal di Makassar, Hindia Belanda, 8 Januari 1855 pada umur 69 tahun) adalah salah seorang pahlawan nasional Republik Indonesia, yang memimpin Perang Diponegoro atau Perang Jawa selama periode tahun 1825 hingga 1830 melawan pemerintah Hindia Belanda. Sejarah mencatat, Perang Diponegoro atau Perang Jawa dikenal sebagai perang yang menelan korban terbanyak dalam sejarah Indonesia, yakni 8.000 korban serdadu Hindia Belanda, 7.000 pribumi, dan 200 ribu orang Jawa serta kerugian materi 25 juta Gulden.""",
'question': "kapan pangeran diponegoro lahir?"
})
```
*output:*
```py
{
'answer': '11 November 1785',
'end': 131,
'score': 0.9272009134292603,
'start': 115
}
```
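Since SQuAD 2.0 also contains unanswerable questions, abstention can be exercised through the `question-answering` pipeline's standard `handle_impossible_answer` option; a sketch reusing the `qa_pipeline` defined above (the question here is deliberately not answerable from the context):

```py
result = qa_pipeline(
    {
        'context': """Pangeran Diponegoro memimpin Perang Jawa selama periode tahun 1825 hingga 1830.""",
        'question': "siapa presiden pertama indonesia?"
    },
    handle_impossible_answer=True,  # allow an empty answer for unanswerable questions
)
if not result['answer']:
    print("No answer is supported by the context.")
```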
### Reference
<a id="1">[1]</a>Fajri Koto and Afshin Rahimi and Jey Han Lau and Timothy Baldwin. 2020. IndoLEM and IndoBERT: A Benchmark Dataset and Pre-trained Language Model for Indonesian NLP. Proceedings of the 28th COLING. | {"language": "id", "license": "apache-2.0", "tags": ["indobert", "indolem"], "datasets": ["220M words (IndoWiki, IndoWC, News)", "Squad 2.0 (Indonesian translated)"], "widget": [{"text": "kapan pangeran diponegoro lahir?", "context": "Pangeran Harya Dipanegara (atau biasa dikenal dengan nama Pangeran Diponegoro, lahir di Ngayogyakarta Hadiningrat, 11 November 1785 \u2013 meninggal di Makassar, Hindia Belanda, 8 Januari 1855 pada umur 69 tahun) adalah salah seorang pahlawan nasional Republik Indonesia, yang memimpin Perang Diponegoro atau Perang Jawa selama periode tahun 1825 hingga 1830 melawan pemerintah Hindia Belanda. Sejarah mencatat, Perang Diponegoro atau Perang Jawa dikenal sebagai perang yang menelan korban terbanyak dalam sejarah Indonesia, yakni 8.000 korban serdadu Hindia Belanda, 7.000 pribumi, dan 200 ribu orang Jawa serta kerugian materi 25 juta Gulden."}]} | question-answering | Rifky/Indobert-QA | [
"transformers",
"pytorch",
"safetensors",
"bert",
"question-answering",
"indobert",
"indolem",
"id",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"id"
] | TAGS
#transformers #pytorch #safetensors #bert #question-answering #indobert #indolem #id #license-apache-2.0 #endpoints_compatible #region-us
| Github
This project is part of my research with my friend Muhammad Fajrin Buyang Daffa entitled "Teman Belajar : Asisten Digital Pelajar SMA Negeri 28 Jakarta dalam Membaca" for KOPSI (Kompetisi Penelitian Siswa Indonesia/Indonesian Student Research Competition).
indoBERT Base-Uncased fine-tuned on Translated SQuAD v2.0
---------------------------------------------------------
IndoBERT trained by IndoLEM and fine-tuned on Translated SQuAD 2.0 for the Q&A downstream task.
Model Size (after training): 420 MB
Details of indoBERT (from their documentation)
----------------------------------------------
IndoBERT is the Indonesian version of the BERT model. We trained the model on over 220M words, aggregated from three main sources:
* Indonesian Wikipedia (74M words)
* news articles from Kompas, Tempo (Tala et al., 2003), and Liputan6 (55M words in total)
* an Indonesian Web Corpus (Medved and Suchomel, 2017) (90M words).
We trained the model for 2.4M steps (180 epochs) with the final perplexity over the development set being 3.97 (similar to English BERT-base).
This IndoBERT was used to examine IndoLEM - an Indonesian benchmark that comprises seven tasks for the Indonesian language, spanning morpho-syntax, semantics, and discourse.[[1]](#1)
Details of the downstream task (Q&A) - Dataset
----------------------------------------------
SQuAD2.0 combines the 100,000 questions in SQuAD1.1 with over 50,000 unanswerable questions written adversarially by crowdworkers to look similar to answerable ones. To do well on SQuAD2.0, systems must not only answer questions when possible, but also determine when no answer is supported by the paragraph and abstain from answering.
Dataset: SQuAD2.0, Split: train, # samples: 130k
Dataset: SQuAD2.0, Split: eval, # samples: 12.3k
Model Training
--------------
The model was trained on a Tesla T4 GPU with 12 GB of RAM.
Results:
--------
Simple Usage
------------
*output:*
### Reference
[1]Fajri Koto and Afshin Rahimi and Jey Han Lau and Timothy Baldwin. 2020. IndoLEM and IndoBERT: A Benchmark Dataset and Pre-trained Language Model for Indonesian NLP. Proceedings of the 28th COLING.
| [
"# samples: 130k\nDataset: SQuAD2.0, Split: eval, # samples: 12.3k\n\n\nModel Training\n--------------\n\n\nThe model was trained on a Tesla T4 GPU and 12GB of RAM.\n\n\nResults:\n--------\n\n\n\nSimple Usage\n------------\n\n\n*output:*",
"### Reference\n\n\n[1]Fajri Koto and Afshin Rahimi and Jey Han Lau and Timothy Baldwin. 2020. IndoLEM and IndoBERT: A Benchmark Dataset and Pre-trained Language Model for Indonesian NLP. Proceedings of the 28th COLING."
] | [
"TAGS\n#transformers #pytorch #safetensors #bert #question-answering #indobert #indolem #id #license-apache-2.0 #endpoints_compatible #region-us \n",
"# samples: 130k\nDataset: SQuAD2.0, Split: eval, # samples: 12.3k\n\n\nModel Training\n--------------\n\n\nThe model was trained on a Tesla T4 GPU and 12GB of RAM.\n\n\nResults:\n--------\n\n\n\nSimple Usage\n------------\n\n\n*output:*",
"### Reference\n\n\n[1]Fajri Koto and Afshin Rahimi and Jey Han Lau and Timothy Baldwin. 2020. IndoLEM and IndoBERT: A Benchmark Dataset and Pre-trained Language Model for Indonesian NLP. Proceedings of the 28th COLING."
] | [
50,
61,
62
] | [
"passage: TAGS\n#transformers #pytorch #safetensors #bert #question-answering #indobert #indolem #id #license-apache-2.0 #endpoints_compatible #region-us \n# samples: 130k\nDataset: SQuAD2.0, Split: eval, # samples: 12.3k\n\n\nModel Training\n--------------\n\n\nThe model was trained on a Tesla T4 GPU and 12GB of RAM.\n\n\nResults:\n--------\n\n\n\nSimple Usage\n------------\n\n\n*output:*### Reference\n\n\n[1]Fajri Koto and Afshin Rahimi and Jey Han Lau and Timothy Baldwin. 2020. IndoLEM and IndoBERT: A Benchmark Dataset and Pre-trained Language Model for Indonesian NLP. Proceedings of the 28th COLING."
] | [
-0.07298996299505234,
0.02567691169679165,
0.0001591230829944834,
0.059643860906362534,
0.013555411249399185,
-0.02685893140733242,
0.1390303671360016,
0.08660568296909332,
0.008992569521069527,
-0.03357524797320366,
0.11787352710962296,
0.02928355149924755,
0.050617728382349014,
0.0922347754240036,
-0.020311199128627777,
-0.2691386044025421,
0.08939165621995926,
0.006727663800120354,
0.015732886269688606,
0.125922292470932,
0.10905685275793076,
-0.062144793570041656,
0.09578996151685715,
0.03510096296668053,
-0.06383243203163147,
-0.016499973833560944,
-0.06602796912193298,
-0.08241542428731918,
0.08625478297472,
0.014090389013290405,
0.10756929218769073,
0.034769777208566666,
0.029346872121095657,
-0.15827837586402893,
0.03800614923238754,
-0.039310719817876816,
-0.014801716431975365,
0.04350865259766579,
0.042725130915641785,
0.025641081854701042,
0.22991202771663666,
0.03900531306862831,
-0.015062779188156128,
-0.03404121845960617,
-0.09882526844739914,
-0.06686054915189743,
-0.10159459710121155,
0.06871635466814041,
0.09078135341405869,
0.09316237270832062,
-0.012225129641592503,
0.16805286705493927,
-0.16370737552642822,
0.06660663336515427,
0.07452616095542908,
-0.21962881088256836,
-0.05449375882744789,
0.10189919173717499,
0.03235423192381859,
0.02435333840548992,
-0.02670283429324627,
-0.037924155592918396,
0.06589838117361069,
0.02806748077273369,
-0.037078384310007095,
-0.06586279720067978,
-0.11520788073539734,
0.020950255915522575,
-0.09572043269872665,
-0.006405944004654884,
0.21724453568458557,
0.015019563026726246,
-0.039871133863925934,
-0.007492656819522381,
-0.04615221917629242,
0.004304010421037674,
-0.012700765393674374,
-0.042349014431238174,
-0.021824074909090996,
0.005388934165239334,
0.10543330013751984,
0.019609035924077034,
-0.06776162981987,
-0.048989757895469666,
-0.07788021862506866,
0.08480215817689896,
0.06921135634183884,
0.06486191600561142,
-0.09345858544111252,
0.03915746510028839,
-0.07982602715492249,
-0.11761951446533203,
-0.05688321590423584,
-0.04934314638376236,
0.05169457569718361,
0.007232378702610731,
-0.009918644092977047,
-0.007564612198621035,
0.05422671139240265,
0.12079980969429016,
-0.09149552136659622,
0.0017260770546272397,
0.07236312329769135,
0.020691828802227974,
-0.029084134846925735,
0.07123153656721115,
-0.09045082330703735,
-0.03513890132308006,
0.031097710132598877,
-0.023233234882354736,
0.06361628323793411,
-0.010205200873315334,
-0.03257104381918907,
-0.000600451254285872,
0.03404312953352928,
0.05146370083093643,
-0.05023172125220299,
0.042561255395412445,
-0.004500168841332197,
-0.04199744760990143,
0.10238692164421082,
-0.08287807554006577,
-0.08848116546869278,
-0.05505076423287392,
-0.05994872748851776,
0.03034226968884468,
0.0218819472938776,
0.06511678546667099,
-0.03186687454581261,
-0.00867786817252636,
-0.039767395704984665,
0.014087126590311527,
-0.09983715415000916,
-0.03151460364460945,
0.04607415571808815,
-0.009824653156101704,
0.033451203256845474,
-0.1205497682094574,
-0.2610290050506592,
0.06411955505609512,
0.11793404072523117,
-0.09105999022722244,
0.006850996986031532,
-0.02234780043363571,
-0.032389137893915176,
-0.00458098528906703,
-0.04437102749943733,
0.00889158621430397,
-0.07017625123262405,
0.03529355302453041,
0.039107512682676315,
0.07095000892877579,
-0.09188666939735413,
0.016278350725769997,
-0.14112180471420288,
0.08670390397310257,
-0.1176493689417839,
-0.005631712730973959,
-0.06678222864866257,
-0.005907602142542601,
-0.08504616469144821,
-0.06933552771806717,
-0.027758557349443436,
-0.018808158114552498,
0.047170285135507584,
0.1505887359380722,
-0.17569391429424286,
0.007117709144949913,
0.09821837395429611,
-0.1506979763507843,
-0.22009879350662231,
0.12593354284763336,
-0.029415052384138107,
0.06303957849740982,
0.003904067212715745,
0.17217114567756653,
-0.03175723925232887,
-0.05574439465999603,
-0.028499506413936615,
0.04816058650612831,
0.023859845474362373,
-0.12009592354297638,
0.14719894528388977,
0.0713919997215271,
-0.07156897336244583,
0.05993538722395897,
-0.008354902267456055,
0.06975884735584259,
-0.061871882528066635,
-0.08032422512769699,
-0.009555339813232422,
-0.09282012283802032,
0.009466643445193768,
0.010473457165062428,
0.08706400543451309,
-0.059284184128046036,
0.002203049836680293,
0.01593070849776268,
0.11528561264276505,
-0.02792251668870449,
-0.009526926092803478,
-0.09877865761518478,
0.1308138221502304,
-0.1582069993019104,
-0.01773981563746929,
-0.09065842628479004,
-0.03525499999523163,
0.02915818616747856,
-0.05383746325969696,
0.022902876138687134,
0.03621940687298775,
0.05864390358328819,
0.04805779829621315,
-0.04297669976949692,
0.020329276099801064,
0.047739624977111816,
0.003896784270182252,
-0.028114736080169678,
-0.06859944015741348,
0.03754975274205208,
-0.029059074819087982,
0.08445607125759125,
-0.11866042762994766,
-0.021356845274567604,
-0.07420004159212112,
0.12097885459661484,
-0.02971922792494297,
0.015864484012126923,
0.036316316574811935,
0.0037491372786462307,
] |
null | null | transformers |
# My Awesome Model | {"tags": ["conversational"]} | text-generation | RifsxD/DialoGPT-medium-raifu | [
"transformers",
"pytorch",
"safetensors",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# My Awesome Model | [
"# My Awesome Model"
] | [
"TAGS\n#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# My Awesome Model"
] | [
56,
4
] | [
"passage: TAGS\n#transformers #pytorch #safetensors #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# My Awesome Model"
] | [
] |
null | null | null |
## You Only Look Once for Panoptic Driving Perception
> [**YOLOP: You Only Look Once for Panoptic Driving Perception**](https://arxiv.org/abs/2108.11250)
>
> by Dong Wu, Manwen Liao, Weitian Zhang, [Xinggang Wang](https://xinggangw.info/) [*School of EIC, HUST*](http://eic.hust.edu.cn/English/Home.htm)
>
> *arXiv technical report ([arXiv 2108.11250](https://arxiv.org/abs/2108.11250))*
---
### The Illustration of YOLOP

### Contributions
* We put forward an efficient multi-task network that can jointly handle three crucial tasks in autonomous driving: object detection, drivable area segmentation and lane detection. Handling them jointly saves computational costs and reduces inference time while improving the performance of each task. Our work is the first to reach real time on embedded devices while maintaining state-of-the-art performance on the `BDD100K` dataset.
* We design ablative experiments to verify the effectiveness of our multi-task scheme, showing that the three tasks can be learned jointly without tedious alternating optimization.
### Results
#### Traffic Object Detection Result
| Model | Recall(%) | mAP50(%) | Speed(fps) |
| -------------- | --------- | -------- | ---------- |
| `Multinet` | 81.3 | 60.2 | 8.6 |
| `DLT-Net` | 89.4 | 68.4 | 9.3 |
| `Faster R-CNN` | 77.2 | 55.6 | 5.3 |
| `YOLOv5s` | 86.8 | 77.2 | 82 |
| `YOLOP(ours)` | 89.2 | 76.5 | 41 |
#### Drivable Area Segmentation Result
| Model | mIOU(%) | Speed(fps) |
| ------------- | ------- | ---------- |
| `Multinet` | 71.6 | 8.6 |
| `DLT-Net` | 71.3 | 9.3 |
| `PSPNet` | 89.6 | 11.1 |
| `YOLOP(ours)` | 91.5 | 41 |
#### Lane Detection Result:
| Model | mIOU(%) | IOU(%) |
| ------------- | ------- | ------ |
| `ENet` | 34.12 | 14.64 |
| `SCNN` | 35.79 | 15.84 |
| `ENet-SAD` | 36.56 | 16.02 |
| `YOLOP(ours)` | 70.50 | 26.20 |
#### Ablation Studies 1: End-to-end v.s. Step-by-step:
| Training_method | Recall(%) | AP(%) | mIoU(%) | Accuracy(%) | IoU(%) |
| --------------- | --------- | ----- | ------- | ----------- | ------ |
| `ES-W` | 87.0 | 75.3 | 90.4 | 66.8 | 26.2 |
| `ED-W` | 87.3 | 76.0 | 91.6 | 71.2 | 26.1 |
| `ES-D-W` | 87.0 | 75.1 | 91.7 | 68.6 | 27.0 |
| `ED-S-W` | 87.5 | 76.1 | 91.6 | 68.0 | 26.8 |
| `End-to-end` | 89.2 | 76.5 | 91.5 | 70.5 | 26.2 |
#### Ablation Studies 2: Multi-task v.s. Single task:
| Training_method | Recall(%) | AP(%) | mIoU(%) | Accuracy(%) | IoU(%) | Speed(ms/frame) |
| --------------- | --------- | ----- | ------- | ----------- | ------ | --------------- |
| `Det(only)` | 88.2 | 76.9 | - | - | - | 15.7 |
| `Da-Seg(only)` | - | - | 92.0 | - | - | 14.8 |
| `Ll-Seg(only)` | - | - | - | 79.6 | 27.9 | 14.8 |
| `Multitask` | 89.2 | 76.5 | 91.5 | 70.5 | 26.2 | 24.4 |
**Notes**:
- The works we use for reference include `Multinet` ([paper](https://arxiv.org/pdf/1612.07695.pdf?utm_campaign=affiliate-ir-Optimise%20media%28%20South%20East%20Asia%29%20Pte.%20ltd._156_-99_national_R_all_ACQ_cpa_en&utm_content=&utm_source=%20388939),[code](https://github.com/MarvinTeichmann/MultiNet)), `DLT-Net` ([paper](https://ieeexplore.ieee.org/abstract/document/8937825)), `Faster R-CNN` ([paper](https://proceedings.neurips.cc/paper/2015/file/14bfa6bb14875e45bba028a21ed38046-Paper.pdf),[code](https://github.com/ShaoqingRen/faster_rcnn)), `YOLOv5s` ([code](https://github.com/ultralytics/yolov5)), `PSPNet` ([paper](https://openaccess.thecvf.com/content_cvpr_2017/papers/Zhao_Pyramid_Scene_Parsing_CVPR_2017_paper.pdf),[code](https://github.com/hszhao/PSPNet)), `ENet` ([paper](https://arxiv.org/pdf/1606.02147.pdf),[code](https://github.com/osmr/imgclsmob)), `SCNN` ([paper](https://www.aaai.org/ocs/index.php/AAAI/AAAI18/paper/download/16802/16322),[code](https://github.com/XingangPan/SCNN)) and `SAD-ENet` ([paper](https://openaccess.thecvf.com/content_ICCV_2019/papers/Hou_Learning_Lightweight_Lane_Detection_CNNs_by_Self_Attention_Distillation_ICCV_2019_paper.pdf),[code](https://github.com/cardwing/Codes-for-Lane-Detection)). Thanks for their wonderful work.
- In table 4, E, D, S and W refer to the Encoder, the Detect head, the two Segment heads and the whole network. So the schedule "first train only the Encoder and Detect head; then freeze those and train the two Segmentation heads; finally train the entire network jointly on all three tasks" is marked as ED-S-W, and likewise for the others. A minimal sketch of such a staged schedule is given below.
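The following is a minimal, illustrative sketch of such a staged (step-by-step) schedule in PyTorch. It is not the repository's actual training loop: the attribute names (`encoder`, `detect_head`, `da_seg_head`, `ll_seg_head`) and the `run_epoch` callback are hypothetical placeholders.
```python
import torch

def set_trainable(module: torch.nn.Module, trainable: bool) -> None:
    # Freeze or unfreeze every parameter of a sub-network.
    for p in module.parameters():
        p.requires_grad = trainable

def train_ed_s_w(model, run_epoch, epochs_per_stage: int) -> None:
    # Stage "ED": train only the Encoder and the Detect head.
    set_trainable(model, False)
    set_trainable(model.encoder, True)       # hypothetical attribute names
    set_trainable(model.detect_head, True)
    for _ in range(epochs_per_stage):
        run_epoch(model)

    # Stage "S": freeze Encoder + Detect head, train the two Segment heads.
    set_trainable(model, False)
    set_trainable(model.da_seg_head, True)
    set_trainable(model.ll_seg_head, True)
    for _ in range(epochs_per_stage):
        run_epoch(model)

    # Stage "W": unfreeze the whole network, train all three tasks jointly.
    set_trainable(model, True)
    for _ in range(epochs_per_stage):
        run_epoch(model)
```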
---
### Visualization
#### Traffic Object Detection Result

#### Drivable Area Segmentation Result

#### Lane Detection Result

**Notes**:
- The visualization of the lane detection result has been post-processed by quadratic fitting; a sketch of this step is given below.
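A minimal sketch of such a post-processing step, assuming each detected lane line is available as a binary NumPy mask; the function below is illustrative, not the repository's actual implementation.
```python
import numpy as np

def fit_lane_quadratic(lane_mask: np.ndarray) -> np.ndarray:
    """Refit a noisy binary lane mask with a quadratic curve x = f(y)."""
    ys, xs = np.nonzero(lane_mask)
    if ys.size < 3:
        return lane_mask  # too few pixels to fit a quadratic
    # Lanes are roughly vertical in image space, so fit x as a function of y.
    coeffs = np.polyfit(ys, xs, deg=2)
    y_range = np.arange(ys.min(), ys.max() + 1)
    x_fit = np.rint(np.polyval(coeffs, y_range)).astype(int)
    fitted = np.zeros_like(lane_mask)
    inside = (x_fit >= 0) & (x_fit < lane_mask.shape[1])
    fitted[y_range[inside], x_fit[inside]] = 1
    return fitted
```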
---
### Project Structure
```python
├─inference
│ ├─images # inference images
│ ├─output # inference result
├─lib
│ ├─config/default # configuration of training and validation
│ ├─core
│ │ ├─activations.py # activation function
│ │ ├─evaluate.py # calculation of metric
│ │ ├─function.py # training and validation of model
│ │ ├─general.py # calculation of metrics, NMS, data-format conversion, visualization
│ │ ├─loss.py # loss function
│ │ ├─postprocess.py # postprocess (refine da-seg and ll-seg, unrelated to paper)
│ ├─dataset
│ │ ├─AutoDriveDataset.py # superclass dataset, general functions
│ │ ├─bdd.py # subclass dataset, specific functions
│ │ ├─hust.py # subclass dataset (campus scene, unrelated to paper)
│ │ ├─convect.py
│ │ ├─DemoDataset.py # demo dataset (image, video and stream)
│ ├─models
│ │ ├─YOLOP.py # Setup and Configuration of model
│ │ ├─light.py # model lightweighting (unrelated to paper, zwt)
│ │ ├─commom.py # calculation module
│ ├─utils
│ │ ├─augmentations.py # data augmentation
│ │ ├─autoanchor.py # auto anchor(k-means)
│ │ ├─split_dataset.py # (Campus scene, unrelated to paper)
│ │ ├─utils.py # logging, device selection, time measurement, optimizer selection, model save & initialize, distributed training
│ ├─run
│ │ ├─dataset/training time # Visualization, logging and model_save
├─tools
│ │ ├─demo.py # demo (folder, camera)
│ │ ├─test.py
│ │ ├─train.py
├─toolkits
│ │ ├─deploy # deployment of model
├─weights # pretrained models
```
---
### Requirement
This codebase has been developed with Python 3.7, PyTorch 1.7+ and torchvision 0.8+:
```
conda install pytorch==1.7.0 torchvision==0.8.0 cudatoolkit=10.2 -c pytorch
```
See `requirements.txt` for additional dependencies and version requirements.
```setup
pip install -r requirements.txt
```
### Data preparation
#### Download
- Download the images from [images](https://bdd-data.berkeley.edu/).
- Download the annotations of detection from [det_annotations](https://drive.google.com/file/d/1Ge-R8NTxG1eqd4zbryFo-1Uonuh0Nxyl/view?usp=sharing).
- Download the annotations of drivable area segmentation from [da_seg_annotations](https://drive.google.com/file/d/1xy_DhUZRHR8yrZG3OwTQAHhYTnXn7URv/view?usp=sharing).
- Download the annotations of lane line segmentation from [ll_seg_annotations](https://drive.google.com/file/d/1lDNTPIQj_YLNZVkksKM25CvCHuquJ8AP/view?usp=sharing).
We recommend the following dataset directory structure:
```
# Files with the same id correspond across the image and annotation folders
├─dataset root
│ ├─images
│ │ ├─train
│ │ ├─val
│ ├─det_annotations
│ │ ├─train
│ │ ├─val
│ ├─da_seg_annotations
│ │ ├─train
│ │ ├─val
│ ├─ll_seg_annotations
│ │ ├─train
│ │ ├─val
```
Update your dataset path in `./lib/config/default.py`; the relevant fields are sketched below.
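For orientation, the dataset-path options in `./lib/config/default.py` look roughly like the lines below; the exact field names are assumptions to be checked against the file itself.
```python
# Illustrative sketch — verify the actual option names in ./lib/config/default.py
_C.DATASET.DATAROOT = '/data/bdd100k/images'              # train/ and val/ images
_C.DATASET.LABELROOT = '/data/bdd100k/det_annotations'    # detection labels
_C.DATASET.MASKROOT = '/data/bdd100k/da_seg_annotations'  # drivable-area masks
_C.DATASET.LANEROOT = '/data/bdd100k/ll_seg_annotations'  # lane-line masks
```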
### Training
You can set the training configuration in `./lib/config/default.py` (including the loading of a preliminary model, the loss, data augmentation, the optimizer, warm-up and cosine annealing, auto-anchor, training epochs, and batch_size).
If you want to try alternating optimization or to train the model for a single task, set the corresponding configuration in `./lib/config/default.py` to `True`. (In the snippet below, all configurations are `False`, which means the multiple tasks are trained end to end.)
```python
# Alternating optimization
_C.TRAIN.SEG_ONLY = False # Only train two segmentation branchs
_C.TRAIN.DET_ONLY = False # Only train detection branch
_C.TRAIN.ENC_SEG_ONLY = False # Only train encoder and two segmentation branchs
_C.TRAIN.ENC_DET_ONLY = False # Only train encoder and detection branch
# Single task
_C.TRAIN.DRIVABLE_ONLY = False # Only train da_segmentation task
_C.TRAIN.LANE_ONLY = False # Only train ll_segmentation task
_C.TRAIN.DET_ONLY = False # Only train detection task
```
Start training:
```shell
python tools/train.py
```
### Evaluation
You can set the evaluation configuration in `./lib/config/default.py` (including batch_size and the threshold values for NMS); illustrative fields are sketched below.
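The evaluation options follow the same yacs style as the training snippet above; the exact names below are assumptions, so verify them against `./lib/config/default.py`.
```python
# Illustrative sketch — verify the actual option names in ./lib/config/default.py
_C.TEST.BATCH_SIZE_PER_GPU = 24
_C.TEST.NMS_CONF_THRESHOLD = 0.001  # confidence threshold for NMS
_C.TEST.NMS_IOU_THRESHOLD = 0.6     # IoU threshold for NMS
```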
Start evaluating:
```shell
python tools/test.py --weights weights/End-to-end.pth
```
### Demo Test
We provide two testing methods.
#### Folder
You can store the images or a video under the path given by `--source`; the inference results are then saved to `--save-dir`:
```shell
python tools/demo.py --source inference/images
```
#### Camera
If a camera is connected to your computer, you can set `--source` to the camera number (the default is 0); a minimal capture loop is sketched after the command below.
```shell
python tools/demo.py --source 0
```
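To make the `--source 0` convention concrete, here is a minimal OpenCV capture loop of the kind the demo performs internally (a sketch, not the demo script itself).
```python
import cv2

cap = cv2.VideoCapture(0)  # 0 = the default camera, matching --source 0
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # ... run YOLOP inference on `frame` here ...
    cv2.imshow("demo", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```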
### Deployment
Our model can run inference in real time on a `Jetson TX2`, with a `Zed Camera` capturing images. We use `TensorRT` to speed up inference. We provide code for deployment and inference of the model in `./toolkits/deploy`; a hedged export sketch follows.
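A sketch of one common deployment route (PyTorch → ONNX → TensorRT). The output names and input resolution below are assumptions; see `./toolkits/deploy` for the repository's actual pipeline.
```python
import torch

def export_to_onnx(model: torch.nn.Module, path: str = "yolop.onnx") -> None:
    # `model` is assumed to be the trained YOLOP network (weights/End-to-end.pth).
    model.eval()
    dummy = torch.randn(1, 3, 640, 640)  # illustrative input resolution
    torch.onnx.export(
        model, dummy, path,
        input_names=["image"],
        output_names=["det_out", "da_seg_out", "ll_seg_out"],  # assumed names
        opset_version=11,
    )
    # The ONNX graph can then be compiled into a TensorRT engine, e.g.:
    #   trtexec --onnx=yolop.onnx --saveEngine=yolop.trt
```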
## Citation
If you find our paper and code useful for your research, please consider giving a star and citation:
```BibTeX
@misc{2108.11250,
Author = {Dong Wu and Manwen Liao and Weitian Zhang and Xinggang Wang},
Title = {YOLOP: You Only Look Once for Panoptic Driving Perception},
Year = {2021},
Eprint = {arXiv:2108.11250},
}
```
| {"tags": ["object-detection"]} | object-detection | Riser/YOLOP | [
"object-detection",
"arxiv:2108.11250",
"arxiv:1612.07695",
"arxiv:1606.02147",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [
"2108.11250",
"1612.07695",
"1606.02147"
] | [] | TAGS
#object-detection #arxiv-2108.11250 #arxiv-1612.07695 #arxiv-1606.02147 #region-us
|
You Only Look Once for Panoptic Driving Perception
----------------------------------------------------
>
> YOLOP: You Only Look Once for Panoptic Driving Perception
>
>
> by Dong Wu, Manwen Liao, Weitian Zhang, Xinggang Wang *School of EIC, HUST*
>
>
> *arXiv technical report (arXiv 2108.11250)*
>
>
>
---
### The Illustration of YOLOP
!yolop
### Contributions
* We put forward an efficient multi-task network that can jointly handle three crucial tasks in autonomous driving: object detection, drivable area segmentation and lane detection. Handling them jointly saves computational costs and reduces inference time while improving the performance of each task. Our work is the first to reach real time on embedded devices while maintaining state-of-the-art performance on the 'BDD100K' dataset.
* We design ablative experiments to verify the effectiveness of our multi-task scheme, showing that the three tasks can be learned jointly without tedious alternating optimization.
### Results
#### Traffic Object Detection Result
#### Drivable Area Segmentation Result
Model: 'Multinet', mIOU(%): 71.6, Speed(fps): 8.6
Model: 'DLT-Net', mIOU(%): 71.3, Speed(fps): 9.3
Model: 'PSPNet', mIOU(%): 89.6, Speed(fps): 11.1
Model: 'YOLOP(ours)', mIOU(%): 91.5, Speed(fps): 41
#### Lane Detection Result:
Model: 'ENet', mIOU(%): 34.12, IOU(%): 14.64
Model: 'SCNN', mIOU(%): 35.79, IOU(%): 15.84
Model: 'ENet-SAD', mIOU(%): 36.56, IOU(%): 16.02
Model: 'YOLOP(ours)', mIOU(%): 70.50, IOU(%): 26.20
#### Ablation Studies 1: End-to-end v.s. Step-by-step:
#### Ablation Studies 2: Multi-task v.s. Single task:
Notes:
* The works we use for reference include 'Multinet' (paper,code), 'DLT-Net' (paper), 'Faster R-CNN' (paper,code), 'YOLOv5s' (code), 'PSPNet' (paper,code), 'ENet' (paper,code), 'SCNN' (paper,code) and 'SAD-ENet' (paper,code). Thanks for their wonderful work.
* In table 4, E, D, S and W refer to the Encoder, the Detect head, the two Segment heads and the whole network. So the schedule "first train only the Encoder and Detect head; then freeze those and train the two Segmentation heads; finally train the entire network jointly on all three tasks" is marked as ED-S-W, and likewise for the others.
---
### Visualization
#### Traffic Object Detection Result
!detect result
#### Drivable Area Segmentation Result

#### Lane Detection Result

Notes:
* The visualization of the lane detection result has been post-processed by quadratic fitting.
---
### Project Structure
---
### Requirement
This codebase has been developed with Python 3.7, PyTorch 1.7+ and torchvision 0.8+:
See 'URL' for additional dependencies and version requirements.
### Data preparation
#### Download
* Download the images from images.
* Download the annotations of detection from det\_annotations.
* Download the annotations of drivable area segmentation from da\_seg\_annotations.
* Download the annotations of lane line segmentation from ll\_seg\_annotations.
We recommend the following dataset directory structure:
Update your dataset path in './lib/config/URL'.
### Training
You can set the training configuration in './lib/config/URL' (including the loading of a preliminary model, the loss, data augmentation, the optimizer, warm-up and cosine annealing, auto-anchor, training epochs, and batch\_size).
If you want to try alternating optimization or to train the model for a single task, set the corresponding configuration in './lib/config/URL' to 'True'. (By default, all configurations are 'False', which means the multiple tasks are trained end to end.)
Start training:
### Evaluation
You can set the evaluation configuration in './lib/config/URL' (including batch\_size and the threshold values for NMS).
Start evaluating:
### Demo Test
We provide two testing methods.
#### Folder
You can store the images or a video under the path given by '--source'; the inference results are then saved to '--save-dir'.
#### Camera
If a camera is connected to your computer, you can set 'source' to the camera number (the default is 0).
### Deployment
Our model can run inference in real time on a 'Jetson TX2', with a 'Zed Camera' capturing images. We use 'TensorRT' to speed up inference. We provide code for deployment and inference of the model in './toolkits/deploy'.
If you find our paper and code useful for your research, please consider giving a star and citation:
| [
"### The Illustration of YOLOP\n\n\n!yolop",
"### Contributions\n\n\n* We put forward an efficient multi-task network that can jointly handle three crucial tasks in autonomous driving: object detection, drivable area segmentation and lane detection to save computational costs, reduce inference time as well as improve the performance of each task. Our work is the first to reach real-time on embedded devices while maintaining state-of-the-art level performance on the 'BDD100K 'dataset.\n* We design the ablative experiments to verify the effectiveness of our multi-tasking scheme. It is proved that the three tasks can be learned jointly without tedious alternating optimization.",
"### Results",
"#### Traffic Object Detection Result",
"#### Drivable Area Segmentation Result\n\n\nModel: 'Multinet', mIOU(%): 71.6, Speed(fps): 8.6\nModel: 'DLT-Net', mIOU(%): 71.3, Speed(fps): 9.3\nModel: 'PSPNet', mIOU(%): 89.6, Speed(fps): 11.1\nModel: 'YOLOP(ours)', mIOU(%): 91.5, Speed(fps): 41",
"#### Lane Detection Result:\n\n\nModel: 'ENet', mIOU(%): 34.12, IOU(%): 14.64\nModel: 'SCNN', mIOU(%): 35.79, IOU(%): 15.84\nModel: 'ENet-SAD', mIOU(%): 36.56, IOU(%): 16.02\nModel: 'YOLOP(ours)', mIOU(%): 70.50, IOU(%): 26.20",
"#### Ablation Studies 1: End-to-end v.s. Step-by-step:",
"#### Ablation Studies 2: Multi-task v.s. Single task:\n\n\n\nNotes:\n\n\n* The works we has use for reference including 'Multinet' (paper,code),'DLT-Net' (paper),'Faster R-CNN' (paper,code),'YOLOv5s'(code) ,'PSPNet'(paper,code) ,'ENet'(paper,code) 'SCNN'(paper,code) 'SAD-ENet'(paper,code). Thanks for their wonderful works.\n* In table 4, E, D, S and W refer to Encoder, Detect head, two Segment heads and whole network. So the Algorithm (First, we only train Encoder and Detect head. Then we freeze the Encoder and Detect head as well as train two Segmentation heads. Finally, the entire network is trained jointly for all three tasks.) can be marked as ED-S-W, and the same for others.\n\n\n\n\n---",
"### Visualization",
"#### Traffic Object Detection Result\n\n\n!detect result",
"#### Drivable Area Segmentation Result\n\n\n",
"#### Lane Detection Result\n\n\n\n\n\nNotes:\n\n\n* The visualization of lane detection result has been post processed by quadratic fitting.\n\n\n\n\n---",
"### Project Structure\n\n\n\n\n---",
"### Requirement\n\n\nThis codebase has been developed with python version 3.7, PyTorch 1.7+ and torchvision 0.8+:\n\n\nSee 'URL' for additional dependencies and version requirements.",
"### Data preparation",
"#### Download\n\n\n* Download the images from images.\n* Download the annotations of detection from det\\_annotations.\n* Download the annotations of drivable area segmentation from da\\_seg\\_annotations.\n* Download the annotations of lane line segmentation from ll\\_seg\\_annotations.\n\n\nWe recommend the dataset directory structure to be the following:\n\n\nUpdate the your dataset path in the './lib/config/URL'.",
"### Training\n\n\nYou can set the training configuration in the './lib/config/URL'. (Including: the loading of preliminary model, loss, data augmentation, optimizer, warm-up and cosine annealing, auto-anchor, training epochs, batch\\_size).\n\n\nIf you want try alternating optimization or train model for single task, please modify the corresponding configuration in './lib/config/URL' to 'True'. (As following, all configurations is 'False', which means training multiple tasks end to end).\n\n\nStart training:",
"### Evaluation\n\n\nYou can set the evaluation configuration in the './lib/config/URL'. (Including: batch\\_size and threshold value for nms).\n\n\nStart evaluating:",
"### Demo Test\n\n\nWe provide two testing method.",
"#### Folder\n\n\nYou can store the image or video in '--source', and then save the reasoning result to '--save-dir'",
"#### Camera\n\n\nIf there are any camera connected to your computer, you can set the 'source' as the camera number(The default is 0).",
"### Deployment\n\n\nOur model can reason in real-time on 'Jetson Tx2', with 'Zed Camera' to capture image. We use 'TensorRT' tool for speeding up. We provide code for deployment and reasoning of model in './toolkits/deploy'.\n\n\nIf you find our paper and code useful for your research, please consider giving a star and citation:"
] | [
"TAGS\n#object-detection #arxiv-2108.11250 #arxiv-1612.07695 #arxiv-1606.02147 #region-us \n",
"### The Illustration of YOLOP\n\n\n!yolop",
"### Contributions\n\n\n* We put forward an efficient multi-task network that can jointly handle three crucial tasks in autonomous driving: object detection, drivable area segmentation and lane detection to save computational costs, reduce inference time as well as improve the performance of each task. Our work is the first to reach real-time on embedded devices while maintaining state-of-the-art level performance on the 'BDD100K 'dataset.\n* We design the ablative experiments to verify the effectiveness of our multi-tasking scheme. It is proved that the three tasks can be learned jointly without tedious alternating optimization.",
"### Results",
"#### Traffic Object Detection Result",
"#### Drivable Area Segmentation Result\n\n\nModel: 'Multinet', mIOU(%): 71.6, Speed(fps): 8.6\nModel: 'DLT-Net', mIOU(%): 71.3, Speed(fps): 9.3\nModel: 'PSPNet', mIOU(%): 89.6, Speed(fps): 11.1\nModel: 'YOLOP(ours)', mIOU(%): 91.5, Speed(fps): 41",
"#### Lane Detection Result:\n\n\nModel: 'ENet', mIOU(%): 34.12, IOU(%): 14.64\nModel: 'SCNN', mIOU(%): 35.79, IOU(%): 15.84\nModel: 'ENet-SAD', mIOU(%): 36.56, IOU(%): 16.02\nModel: 'YOLOP(ours)', mIOU(%): 70.50, IOU(%): 26.20",
"#### Ablation Studies 1: End-to-end v.s. Step-by-step:",
"#### Ablation Studies 2: Multi-task v.s. Single task:\n\n\n\nNotes:\n\n\n* The works we has use for reference including 'Multinet' (paper,code),'DLT-Net' (paper),'Faster R-CNN' (paper,code),'YOLOv5s'(code) ,'PSPNet'(paper,code) ,'ENet'(paper,code) 'SCNN'(paper,code) 'SAD-ENet'(paper,code). Thanks for their wonderful works.\n* In table 4, E, D, S and W refer to Encoder, Detect head, two Segment heads and whole network. So the Algorithm (First, we only train Encoder and Detect head. Then we freeze the Encoder and Detect head as well as train two Segmentation heads. Finally, the entire network is trained jointly for all three tasks.) can be marked as ED-S-W, and the same for others.\n\n\n\n\n---",
"### Visualization",
"#### Traffic Object Detection Result\n\n\n!detect result",
"#### Drivable Area Segmentation Result\n\n\n",
"#### Lane Detection Result\n\n\n\n\n\nNotes:\n\n\n* The visualization of lane detection result has been post processed by quadratic fitting.\n\n\n\n\n---",
"### Project Structure\n\n\n\n\n---",
"### Requirement\n\n\nThis codebase has been developed with python version 3.7, PyTorch 1.7+ and torchvision 0.8+:\n\n\nSee 'URL' for additional dependencies and version requirements.",
"### Data preparation",
"#### Download\n\n\n* Download the images from images.\n* Download the annotations of detection from det\\_annotations.\n* Download the annotations of drivable area segmentation from da\\_seg\\_annotations.\n* Download the annotations of lane line segmentation from ll\\_seg\\_annotations.\n\n\nWe recommend the dataset directory structure to be the following:\n\n\nUpdate the your dataset path in the './lib/config/URL'.",
"### Training\n\n\nYou can set the training configuration in the './lib/config/URL'. (Including: the loading of preliminary model, loss, data augmentation, optimizer, warm-up and cosine annealing, auto-anchor, training epochs, batch\\_size).\n\n\nIf you want try alternating optimization or train model for single task, please modify the corresponding configuration in './lib/config/URL' to 'True'. (As following, all configurations is 'False', which means training multiple tasks end to end).\n\n\nStart training:",
"### Evaluation\n\n\nYou can set the evaluation configuration in the './lib/config/URL'. (Including: batch\\_size and threshold value for nms).\n\n\nStart evaluating:",
"### Demo Test\n\n\nWe provide two testing method.",
"#### Folder\n\n\nYou can store the image or video in '--source', and then save the reasoning result to '--save-dir'",
"#### Camera\n\n\nIf there are any camera connected to your computer, you can set the 'source' as the camera number(The default is 0).",
"### Deployment\n\n\nOur model can reason in real-time on 'Jetson Tx2', with 'Zed Camera' to capture image. We use 'TensorRT' tool for speeding up. We provide code for deployment and reasoning of model in './toolkits/deploy'.\n\n\nIf you find our paper and code useful for your research, please consider giving a star and citation:"
] | [
36,
12,
147,
3,
7,
106,
110,
21,
228,
4,
11,
19,
41,
7,
44,
4,
105,
135,
45,
10,
33,
30,
93
] | [
"passage: TAGS\n#object-detection #arxiv-2108.11250 #arxiv-1612.07695 #arxiv-1606.02147 #region-us \n### The Illustration of YOLOP\n\n\n!yolop### Contributions\n\n\n* We put forward an efficient multi-task network that can jointly handle three crucial tasks in autonomous driving: object detection, drivable area segmentation and lane detection to save computational costs, reduce inference time as well as improve the performance of each task. Our work is the first to reach real-time on embedded devices while maintaining state-of-the-art level performance on the 'BDD100K 'dataset.\n* We design the ablative experiments to verify the effectiveness of our multi-tasking scheme. It is proved that the three tasks can be learned jointly without tedious alternating optimization.### Results#### Traffic Object Detection Result#### Drivable Area Segmentation Result\n\n\nModel: 'Multinet', mIOU(%): 71.6, Speed(fps): 8.6\nModel: 'DLT-Net', mIOU(%): 71.3, Speed(fps): 9.3\nModel: 'PSPNet', mIOU(%): 89.6, Speed(fps): 11.1\nModel: 'YOLOP(ours)', mIOU(%): 91.5, Speed(fps): 41#### Lane Detection Result:\n\n\nModel: 'ENet', mIOU(%): 34.12, IOU(%): 14.64\nModel: 'SCNN', mIOU(%): 35.79, IOU(%): 15.84\nModel: 'ENet-SAD', mIOU(%): 36.56, IOU(%): 16.02\nModel: 'YOLOP(ours)', mIOU(%): 70.50, IOU(%): 26.20#### Ablation Studies 1: End-to-end v.s. Step-by-step:",
"passage: #### Ablation Studies 2: Multi-task v.s. Single task:\n\n\n\nNotes:\n\n\n* The works we has use for reference including 'Multinet' (paper,code),'DLT-Net' (paper),'Faster R-CNN' (paper,code),'YOLOv5s'(code) ,'PSPNet'(paper,code) ,'ENet'(paper,code) 'SCNN'(paper,code) 'SAD-ENet'(paper,code). Thanks for their wonderful works.\n* In table 4, E, D, S and W refer to Encoder, Detect head, two Segment heads and whole network. So the Algorithm (First, we only train Encoder and Detect head. Then we freeze the Encoder and Detect head as well as train two Segmentation heads. Finally, the entire network is trained jointly for all three tasks.) can be marked as ED-S-W, and the same for others.\n\n\n\n\n---### Visualization#### Traffic Object Detection Result\n\n\n!detect result#### Drivable Area Segmentation Result\n\n\n#### Lane Detection Result\n\n\n\n\n\nNotes:\n\n\n* The visualization of lane detection result has been post processed by quadratic fitting.\n\n\n\n\n---### Project Structure\n\n\n\n\n---### Requirement\n\n\nThis codebase has been developed with python version 3.7, PyTorch 1.7+ and torchvision 0.8+:\n\n\nSee 'URL' for additional dependencies and version requirements.### Data preparation#### Download\n\n\n* Download the images from images.\n* Download the annotations of detection from det\\_annotations.\n* Download the annotations of drivable area segmentation from da\\_seg\\_annotations.\n* Download the annotations of lane line segmentation from ll\\_seg\\_annotations.\n\n\nWe recommend the dataset directory structure to be the following:\n\n\nUpdate the your dataset path in the './lib/config/URL'.### Training\n\n\nYou can set the training configuration in the './lib/config/URL'. (Including: the loading of preliminary model, loss, data augmentation, optimizer, warm-up and cosine annealing, auto-anchor, training epochs, batch\\_size).\n\n\nIf you want try alternating optimization or train model for single task, please modify the corresponding configuration in './lib/config/URL' to 'True'. (As following, all configurations is 'False', which means training multiple tasks end to end).\n\n\nStart training:### Evaluation\n\n\nYou can set the evaluation configuration in the './lib/config/URL'. (Including: batch\\_size and threshold value for nms).\n\n\nStart evaluating:### Demo Test\n\n\nWe provide two testing method.#### Folder\n\n\nYou can store the image or video in '--source', and then save the reasoning result to '--save-dir'#### Camera\n\n\nIf there are any camera connected to your computer, you can set the 'source' as the camera number(The default is 0)."
] | [
-0.013264002278447151,
0.08707103133201599,
0.07748857885599136,
-0.078959159553051,
0.009806313551962376,
0.06283987313508987,
0.1049279272556305,
0.1898479461669922,
0.08259676396846771,
-0.19754843413829803,
0.008019734174013138,
0.022817939519882202,
-0.06061752885580063,
-0.06678047776222229,
0.019740890711545944,
0.07594648003578186,
-0.10067181289196014,
0.027455128729343414,
-0.03472176194190979,
0.05981788784265518,
-0.08301421254873276,
-0.005614348687231541,
0.055678971111774445,
0.05846264585852623,
-0.0041953264735639095,
0.058859847486019135,
-0.1603449583053589,
0.1382165104150772,
-0.008332539349794388,
0.06547202169895172,
-0.013245857320725918,
0.0776558518409729,
0.04458014294505119,
0.06888139992952347,
0.11023770272731781,
-0.0031560389325022697,
-0.08272729068994522,
-0.0637010931968689,
-0.15827229619026184,
0.020678924396634102,
0.08207452297210693,
-0.05821049585938454,
0.11273187398910522,
-0.01327885128557682,
-0.04280497878789902,
-0.044572070240974426,
-0.032473817467689514,
-0.09725724160671234,
-0.1510126143693924,
0.08483027666807175,
-0.10721733421087265,
0.009956994093954563,
-0.0809689462184906,
-0.05150105431675911,
-0.08409328758716583,
0.16374923288822174,
-0.11988995224237442,
-0.08352763950824738,
-0.11015897244215012,
0.012735212221741676,
0.10612896084785461,
-0.03753121197223663,
0.01164345070719719,
-0.00891299918293953,
0.08282902091741562,
0.0013998495414853096,
-0.0714353621006012,
0.050383877009153366,
-0.08642896264791489,
-0.15627524256706238,
-0.06481490284204483,
0.11777853965759277,
-0.01578555814921856,
0.027712509036064148,
-0.024815646931529045,
0.01485281903296709,
0.01023656316101551,
-0.08028359711170197,
0.045010462403297424,
0.15789392590522766,
0.0016547497361898422,
0.060901373624801636,
-0.06286080181598663,
-0.10856969654560089,
-0.08967216312885284,
-0.04721345752477646,
0.03205680102109909,
0.20091350376605988,
-0.04675906151533127,
0.0879259929060936,
0.1247292011976242,
-0.09965769201517105,
-0.23917725682258606,
-0.08017531037330627,
0.03447314351797104,
-0.00568936113268137,
0.018005233258008957,
-0.14775237441062927,
0.060248710215091705,
0.06902705878019333,
-0.028646990656852722,
0.07846876978874207,
-0.30499395728111267,
-0.10476075112819672,
0.06609170138835907,
0.014246547594666481,
-0.050436582416296005,
-0.13350416719913483,
-0.05227728560566902,
-0.006085726898163557,
-0.16123345494270325,
-0.010192879475653172,
0.06773984432220459,
0.07407283037900925,
0.002640475519001484,
-0.05933182314038277,
0.048979464918375015,
-0.07396680116653442,
0.14881497621536255,
-0.0007595206843689084,
0.06238877400755882,
-0.04850988835096359,
0.01798017881810665,
0.005929555743932724,
-0.053050439804792404,
0.12772442400455475,
0.027711985632777214,
0.039188411086797714,
-0.04819990694522858,
-0.038076452910900116,
-0.04353812709450722,
0.03201228007674217,
-0.03306925296783447,
-0.031909070909023285,
-0.06435497105121613,
0.07132423669099808,
0.07583794742822647,
0.000945711974054575,
0.04472489282488823,
0.004798445850610733,
-0.029903609305620193,
0.16096436977386475,
0.04848513752222061,
0.026782678440213203,
-0.15151569247245789,
-0.03824488818645477,
-0.004242990165948868,
0.05958167091012001,
-0.12997733056545258,
0.06854507327079773,
0.11600662767887115,
-0.017878782004117966,
0.1288643628358841,
0.02738386020064354,
-0.16087189316749573,
-0.023736823350191116,
0.10841012001037598,
-0.11323946714401245,
-0.1674710512161255,
-0.040273960679769516,
0.019964437931776047,
-0.038868505507707596,
-0.01274867169559002,
0.10638496279716492,
-0.049458131194114685,
0.0014385422691702843,
-0.0018132636323571205,
0.11270630359649658,
0.00390278035774827,
0.15180276334285736,
0.03412289172410965,
0.040328267961740494,
-0.046102263033390045,
0.13568808138370514,
0.09229611605405807,
-0.10896328091621399,
0.021359166130423546,
0.07675705850124359,
-0.07985465228557587,
-0.058699049055576324,
-0.021495576947927475,
0.09266673028469086,
0.061742622405290604,
-0.07559850811958313,
-0.0605107918381691,
-0.0689709484577179,
0.058890607208013535,
0.03052438795566559,
0.042731668800115585,
0.11095449328422546,
0.017330829054117203,
-0.011453264392912388,
-0.05557393282651901,
0.1382429003715515,
0.07236850261688232,
0.05333206057548523,
-0.12604740262031555,
0.039614856243133545,
0.01889549195766449,
0.025275714695453644,
-0.010394943878054619,
-0.04686460644006729,
-0.09336117655038834,
0.011028699576854706,
-0.13488911092281342,
0.011999361217021942,
-0.041414521634578705,
-0.022018786519765854,
0.04901823401451111,
-0.006076308898627758,
-0.01556006446480751,
0.04972701519727707,
-0.08182469010353088,
-0.055655792355537415,
-0.034627340734004974,
0.0924900472164154,
-0.13467326760292053,
-0.024796713143587112,
0.058502197265625,
-0.12698593735694885,
0.04685647413134575,
0.01687045581638813,
-0.010495456866919994,
0.017294693738222122,
-0.056548647582530975,
-0.00906579103320837,
0.01442199107259512,
0.04927339404821396,
0.005811164155602455,
-0.11534875631332397,
0.023618316277861595,
-0.014658625237643719,
-0.05762416869401932,
-0.013864079490303993,
0.055001504719257355,
-0.11629313230514526,
0.05142543464899063,
-0.014976844191551208,
-0.08622269332408905,
-0.04555835574865341,
0.022745300084352493,
0.032542429864406586,
0.07430559396743774,
0.1469898372888565,
-0.07334473729133606,
0.007647900842130184,
-0.1330103725194931,
-0.02227022871375084,
0.018099870532751083,
0.0052740019746124744,
0.05488124489784241,
-0.03206806257367134,
0.054345086216926575,
-0.06769278645515442,
0.11447367817163467,
0.04554535076022148,
-0.057501938194036484,
0.024910978972911835,
-0.09091313183307648,
-0.06102066487073898,
0.06946549564599991,
-0.026871690526604652,
0.010800197720527649,
0.010279764421284199,
-0.015764381736516953,
-0.08214861899614334,
0.028571367263793945,
0.013560367748141289,
0.06286298483610153,
0.1485646814107895,
0.10118827223777771,
0.07510437816381454,
0.0784468874335289,
-0.08518248051404953,
-0.15331125259399414,
0.060194894671440125,
-0.10742539167404175,
0.10330383479595184,
-0.07351426780223846,
0.04595349356532097,
0.1083720326423645,
-0.11871002614498138,
0.0778469368815422,
-0.07515860348939896,
-0.0459417887032032,
-0.06377368420362473,
-0.1283295899629593,
-0.06264397501945496,
-0.07076497375965118,
0.01188137661665678,
-0.055703915655612946,
0.049264777451753616,
0.09103605151176453,
0.04371486231684685,
-0.01542142778635025,
0.1145959347486496,
-0.05771617591381073,
-0.03504398837685585,
0.03650011122226715,
-0.006104856729507446,
-0.010830639861524105,
0.03305090591311455,
0.017910830676555634,
-0.004856438376009464,
0.048148609697818756,
0.03852842003107071,
0.06279440224170685,
0.012800287455320358,
0.05722011253237724,
-0.017086582258343697,
-0.06496186554431915,
0.009002961218357086,
-0.0679553896188736,
-0.029082532972097397,
0.03894700109958649,
0.08175491541624069,
-0.030174873769283295,
0.013298693113029003,
0.1657622754573822,
-0.05189857631921768,
-0.10324317216873169,
-0.18438535928726196,
0.16388779878616333,
0.02990100346505642,
0.04579688236117363,
0.025588292628526688,
-0.12019722163677216,
-0.019916126504540443,
0.14641477167606354,
0.13882791996002197,
0.0014742035418748856,
-0.015328355133533478,
0.04805412143468857,
0.001903024036437273,
-0.035542529076337814,
0.07565250247716904,
0.03526393324136734,
0.19631564617156982,
-0.0380987748503685,
0.031724125146865845,
-0.04978892579674721,
-0.03588620945811272,
-0.05382564663887024,
0.10958905518054962,
-0.009583957493305206,
0.015611518174409866,
-0.0903686136007309,
0.08151353895664215,
-0.015265099704265594,
-0.17654502391815186,
0.0726746916770935,
-0.047793999314308167,
-0.1214141994714737,
0.018496669828891754,
0.08453523367643356,
-0.003224531188607216,
0.04806628078222275,
-0.014451776631176472,
-0.04686376824975014,
0.17018640041351318,
0.0053614117205142975,
-0.0308659840375185,
-0.04292387515306473,
0.07712308317422867,
-0.015326876193284988,
0.23051130771636963,
0.02605338580906391,
0.04815676808357239,
0.09101397544145584,
-0.004071047995239496,
-0.10463269054889679,
0.034785762429237366,
0.06384792178869247,
-0.06695745885372162,
0.018929215148091316,
0.09729432314634323,
0.00933582428842783,
0.1298171430826187,
0.04815470427274704,
0.023545194417238235,
0.006989859975874424,
0.014415784738957882,
-0.0013774093240499496,
-0.07750353962182999,
0.027042310684919357,
-0.08169209957122803,
0.09406665712594986,
0.1669347882270813,
-0.012570862658321857,
0.027576591819524765,
-0.06066080182790756,
0.05353124439716339,
-0.020660288631916046,
0.05622861534357071,
-0.009776370599865913,
-0.18118324875831604,
0.05853059142827988,
-0.05965699627995491,
0.08389148116111755,
-0.12674248218536377,
-0.09547887742519379,
0.061784692108631134,
-0.011765080504119396,
-0.051816485822200775,
0.12398327887058258,
0.11454547196626663,
0.02543955110013485,
-0.05365350842475891,
-0.10707126557826996,
0.004117097705602646,
0.08319003880023956,
-0.0876636952161789,
-0.05522208660840988
] |
null | null | transformers |
# Rick Morty DialoGPT Model | {"tags": ["conversational"]} | text-generation | RishabhRawatt/DialoGPT-small-Rickmorty | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Rick Morty DialoGPT Model | [
"# Rick Morty DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Rick Morty DialogGPT Model"
] | [
51,
8
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Rick Morty DialogGPT Model"
] | [
-0.023798981681466103,
0.11538524925708771,
-0.005656353197991848,
0.022828223183751106,
0.11626420170068741,
-0.0034535943996161222,
0.1369960755109787,
0.1319374442100525,
-0.006769521161913872,
-0.04442106932401657,
0.16814489662647247,
0.2238686978816986,
-0.019758660346269608,
0.06239933893084526,
-0.0693882405757904,
-0.30583837628364563,
0.05562753230333328,
0.04279051721096039,
-0.04822838678956032,
0.11313346028327942,
0.09340415149927139,
-0.04052134230732918,
0.07901844382286072,
0.013686655089259148,
-0.12343241274356842,
0.004550142679363489,
0.007839604280889034,
-0.09531959146261215,
0.12006363272666931,
0.07797135412693024,
0.001651199534535408,
0.050972241908311844,
-0.057349614799022675,
-0.10632959753274918,
0.05332929268479347,
0.005380680784583092,
-0.05373387038707733,
0.07912775874137878,
0.008507350459694862,
-0.08918952196836472,
0.13344618678092957,
0.12728388607501984,
-0.0011566592147573829,
0.05084497109055519,
-0.16611993312835693,
-0.02799624763429165,
-0.015695953741669655,
0.041620418429374695,
0.050260938704013824,
0.10818368196487427,
-0.037374332547187805,
0.1154436394572258,
-0.052827246487140656,
0.11334727704524994,
0.11772149056196213,
-0.2879379689693451,
-0.019036857411265373,
0.18753179907798767,
0.03285595774650574,
0.034668419510126114,
-0.048750605434179306,
0.09664832800626755,
0.0053856633603572845,
-0.001547905383631587,
-0.027306731790304184,
-0.07416978478431702,
-0.046665120869874954,
0.05524205416440964,
-0.07570357620716095,
-0.0017061455873772502,
0.2392030954360962,
-0.017250636592507362,
0.09698506444692612,
-0.06311807036399841,
-0.08891307562589645,
-0.006160805467516184,
-0.038904834538698196,
-0.03640293702483177,
-0.09991926699876785,
0.0749812200665474,
-0.06401310116052628,
-0.10203509777784348,
-0.10566772520542145,
-0.018753532320261,
-0.15033598244190216,
0.15633241832256317,
0.022168515250086784,
0.04396628588438034,
-0.20398403704166412,
0.07866771519184113,
-0.04299751669168472,
-0.0870501846075058,
0.006363013759255409,
-0.09682308882474899,
-0.00011609470675466582,
0.011411107145249844,
-0.037969838827848434,
-0.007807722315192223,
0.09230833500623703,
0.14514489471912384,
-0.007027007173746824,
0.0240456722676754,
-0.007096671033650637,
0.05028802156448364,
0.0583503283560276,
0.04147806763648987,
-0.046825725585222244,
-0.0037441716995090246,
0.03025037609040737,
-0.08885262161493301,
-0.006665239110589027,
-0.07305952906608582,
-0.205366313457489,
0.013345073908567429,
0.060701459646224976,
0.04754005745053291,
0.027391210198402405,
0.14699222147464752,
-0.0041152555495500565,
-0.053532179445028305,
0.04931226372718811,
0.003954550717025995,
-0.02152150310575962,
0.020250679925084114,
-0.008234773762524128,
0.139596089720726,
0.031267132610082626,
0.045823678374290466,
-0.09320481866598129,
0.018911007791757584,
-0.05272631347179413,
-0.016834350302815437,
-0.03772267326712608,
-0.04272796958684921,
0.010507272556424141,
0.002128424821421504,
0.011431071907281876,
-0.1279793381690979,
-0.15511080622673035,
0.0019243289716541767,
0.004154948517680168,
-0.04831090569496155,
-0.10690688341856003,
-0.10364111512899399,
-0.029509324580430984,
0.03441038727760315,
-0.051268745213747025,
0.044517844915390015,
-0.04735834524035454,
0.09116756916046143,
-0.028583256527781487,
0.08396720886230469,
-0.09993933141231537,
0.08725494891405106,
-0.06228969618678093,
-0.04146892949938774,
-0.05484386533498764,
0.10510008037090302,
0.002526267431676388,
0.05038241297006607,
-0.029650870710611343,
-0.016145877540111542,
-0.11097126454114914,
0.05487862974405289,
-0.04194694012403488,
0.2429833561182022,
-0.07043967396020889,
-0.0946897491812706,
0.22190052270889282,
-0.03378250449895859,
-0.14474467933177948,
0.1132299154996872,
-0.03191240876913071,
0.10624157637357712,
0.12090636789798737,
0.15750381350517273,
-0.01273275911808014,
-0.009117589332163334,
0.11721146106719971,
0.12792561948299408,
-0.09467656165361404,
0.003999055363237858,
0.030336474999785423,
-0.01392664946615696,
-0.09955185651779175,
0.0266424473375082,
0.09664425253868103,
0.04499254748225212,
-0.06288263201713562,
-0.013001440092921257,
0.011155455373227596,
-0.0010730738285928965,
0.09326063841581345,
-0.016036922112107277,
0.11669124662876129,
-0.03211761265993118,
-0.084321029484272,
-0.0371408686041832,
0.01941716857254505,
-0.044002942740917206,
0.01778298057615757,
-0.08709535747766495,
0.04304376617074013,
-0.03372626006603241,
0.07601133733987808,
-0.17090719938278198,
-0.11101865023374557,
-0.06228896602988243,
0.2283589243888855,
0.07016918808221817,
0.15516173839569092,
0.06582777947187424,
-0.059895385056734085,
-0.0035075214691460133,
0.03314945474267006,
0.1739855706691742,
-0.009297165088355541,
-0.07414663583040237,
-0.11902651935815811,
0.07338305562734604,
-0.0801195353269577,
0.07471033930778503,
-0.048199404031038284,
0.01578322798013687,
0.03748081624507904,
0.0923939049243927,
-0.03144502267241478,
0.036635708063840866,
0.028396591544151306,
-0.044622134417295456,
-0.061227429658174515,
-0.0051789358258247375,
0.10256627947092056,
0.01651703380048275,
-0.10215827822685242,
0.21385471522808075,
-0.1936112344264984,
0.14261451363563538,
0.1749666929244995,
-0.2334447205066681,
-0.008162683807313442,
-0.09906495362520218,
-0.029238976538181305,
0.011812048964202404,
0.03443733975291252,
-0.042150966823101044,
0.20674335956573486,
-0.007128484081476927,
0.17103008925914764,
-0.040630992501974106,
-0.05698157846927643,
-0.0478520467877388,
-0.04118584096431732,
0.009696438908576965,
0.11889749020338058,
0.11135134845972061,
-0.17241734266281128,
0.19219717383384705,
0.049342215061187744,
0.05062036216259003,
0.1417219489812851,
0.03958354517817497,
0.023729359731078148,
0.04920651391148567,
-0.007090768776834011,
-0.044424232095479965,
-0.06257873773574829,
-0.23583924770355225,
-0.003446677466854453,
0.07460802793502808,
0.03715734928846359,
0.1030859649181366,
-0.09831619262695312,
-0.04169463366270065,
-0.0069644274190068245,
-0.02431431971490383,
0.036235511302948,
0.14814482629299164,
0.008586262352764606,
0.12021113932132721,
-0.002182797761633992,
-0.04465716332197189,
0.09022246301174164,
0.021038660779595375,
-0.0901055634021759,
0.18253818154335022,
-0.11033112555742264,
-0.3595621585845947,
-0.09151599556207657,
-0.19112490117549896,
-0.05979954078793526,
0.048033636063337326,
0.10657436400651932,
-0.1480279117822647,
-0.026027539744973183,
0.0016708004986867309,
0.0836573913693428,
-0.134345144033432,
0.0007740210858173668,
-0.014320463873445988,
-0.002061164006590843,
-0.13679572939872742,
-0.10858533531427383,
-0.05246845632791519,
-0.05930319428443909,
-0.03165712580084801,
0.11830806732177734,
-0.13926160335540771,
0.01855466328561306,
0.21090112626552582,
0.0701032429933548,
0.08381089568138123,
-0.0303349569439888,
0.21526333689689636,
-0.08208819478750229,
0.023306570947170258,
0.23385874927043915,
-0.048866767436265945,
0.07294248044490814,
0.08585695922374725,
-0.006184068042784929,
-0.05746687948703766,
0.032274480909109116,
0.007156252861022949,
-0.08339208364486694,
-0.2003960907459259,
-0.10222119092941284,
-0.11770506948232651,
0.06816671788692474,
0.04503703489899635,
0.05129661411046982,
0.18450967967510223,
0.07070311903953552,
-0.053121183067560196,
-0.010961112566292286,
0.08148954808712006,
0.08848558366298676,
0.25194862484931946,
-0.07817363739013672,
0.14283180236816406,
-0.027168773114681244,
-0.15683989226818085,
0.07917951792478561,
0.060117170214653015,
0.07691940665245056,
0.07117146253585815,
0.10918966680765152,
0.016225187107920647,
0.0103374682366848,
0.12647204101085663,
0.05874604731798172,
-0.009363888762891293,
-0.04633002355694771,
-0.05232762545347214,
-0.0483977235853672,
-0.027621179819107056,
0.05281519517302513,
0.03747386857867241,
-0.1584656685590744,
-0.009496533311903477,
-0.010755893774330616,
0.06176813691854477,
0.002360154641792178,
0.08630092442035675,
-0.20600196719169617,
-0.04964066296815872,
0.059740420430898666,
0.005698580760508776,
-0.08763275295495987,
0.07304059714078903,
-0.001365966279990971,
-0.11734941601753235,
0.028771692886948586,
-0.04600793868303299,
0.12995941936969757,
-0.07055945694446564,
0.07829878479242325,
-0.11336953938007355,
-0.028897330164909363,
-0.004193078726530075,
0.1200387179851532,
-0.3061801493167877,
0.183177188038826,
-0.00873593520373106,
-0.05992840230464935,
-0.11832515895366669,
-0.02100168541073799,
0.03519313782453537,
0.10921826213598251,
0.10857312381267548,
-0.025727443397045135,
-0.02738841436803341,
0.04950248450040817,
-0.044055961072444916,
0.0376293770968914,
0.08162161707878113,
-0.05426359176635742,
-0.01610817015171051,
-0.046951405704021454,
-0.001053909887559712,
0.013914775103330612,
-0.08636660128831863,
0.007693349849432707,
-0.1962202489376068,
0.09697730094194412,
0.06037209555506706,
0.08345779776573181,
0.04448588937520981,
-0.026460297405719757,
-0.11184696853160858,
0.2953369915485382,
0.02828034572303295,
-0.10830303281545639,
-0.11628440022468567,
0.05761335417628288,
0.03588029369711876,
-0.06119169667363167,
-0.02629360742866993,
-0.08828183263540268,
0.06323269009590149,
-0.06395841389894485,
-0.20794063806533813,
0.11094511300325394,
-0.10159972310066223,
-0.042940378189086914,
-0.03025936335325241,
0.21480606496334076,
-0.023477595299482346,
0.012889159843325615,
0.03511429205536842,
-0.002015046076849103,
-0.1374935656785965,
-0.10167433321475983,
0.02350354567170143,
-0.008012956939637661,
0.02009625732898712,
-0.006274967920035124,
-0.037411488592624664,
0.019295914098620415,
-0.04159228131175041,
0.004390363581478596,
0.30782634019851685,
0.12021180987358093,
-0.05174273997545242,
0.16918712854385376,
0.10356927663087845,
-0.04491060972213745,
-0.27951639890670776,
-0.08767277747392654,
-0.09457243233919144,
-0.07374752312898636,
-0.06713063269853592,
-0.20293939113616943,
0.08234453946352005,
-0.036875832825899124,
-0.010489669628441334,
0.08977393060922623,
-0.2818257212638855,
-0.0861637145280838,
0.15716786682605743,
-0.015575188212096691,
0.4567500352859497,
-0.1073562502861023,
-0.0744287446141243,
-0.04838741943240166,
-0.15880335867404938,
0.1871701031923294,
-0.017923884093761444,
0.1011919155716896,
0.0006583869690075517,
0.2125057429075241,
0.05448555946350098,
-0.003779921680688858,
0.07815150916576385,
0.0024728321004658937,
-0.042969293892383575,
-0.09746213257312775,
-0.09619030356407166,
0.002763275057077408,
0.007197749800980091,
0.03295663371682167,
-0.08779384940862656,
0.045233242213726044,
-0.09444547444581985,
-0.06453240662813187,
-0.08720697462558746,
0.03119601123034954,
0.014697038568556309,
-0.07863115519285202,
-0.00011286497465334833,
-0.03533096984028816,
-0.010096583515405655,
0.00753434794023633,
0.20614340901374817,
-0.10635748505592346,
0.1546023190021515,
0.028065618127584457,
0.13242968916893005,
-0.12448831647634506,
-0.05088629201054573,
-0.040817804634571075,
-0.05804337561130524,
0.07422187179327011,
-0.1332109123468399,
0.032370466738939285,
0.09345696866512299,
-0.026285167783498764,
0.09138666838407516,
0.11867507547140121,
-0.026775436475872993,
0.004281732253730297,
0.07224863022565842,
-0.2623112201690674,
-0.09488929063081741,
-0.07250186800956726,
0.062460482120513916,
0.07801000028848648,
0.11923647671937943,
0.21483013033866882,
0.02051672339439392,
-0.03149786964058876,
0.034872863441705704,
0.03703220561146736,
-0.015756992623209953,
0.019237946718931198,
0.006191765423864126,
0.02978113666176796,
-0.14926892518997192,
0.031185150146484375,
-0.01670323684811592,
-0.09441053122282028,
0.013900503516197205,
0.1559465527534485,
-0.11473001539707184,
-0.1320420354604721,
-0.08039148896932602,
0.15602324903011322,
-0.11686959117650986,
0.0028406709898263216,
-0.041747018694877625,
-0.1140897125005722,
0.08685281127691269,
0.11174067109823227,
0.057727497071027756,
0.04553021863102913,
-0.0844043418765068,
-0.02758883871138096,
-0.03700871393084526,
0.011865542270243168,
0.00022014502610545605,
-0.02824968844652176,
-0.033510662615299225,
0.01907658576965332,
-0.03576526790857315,
0.11881402879953384,
-0.09015026688575745,
-0.10892713069915771,
-0.17691470682621002,
0.038229551166296005,
-0.04623476043343544,
-0.09719280898571014,
-0.12037461996078491,
-0.044197190552949905,
0.006029927637428045,
-0.03497300669550896,
-0.02215508557856083,
-0.019879531115293503,
-0.1038883700966835,
0.03829245641827583,
-0.04193326458334923,
-0.00450808135792613,
-0.08524288982152939,
0.023917557671666145,
0.046859368681907654,
-0.03165588155388832,
0.1585223525762558,
0.11417738348245621,
-0.12803992629051208,
0.07911508530378342,
-0.18715772032737732,
-0.06318745762109756,
0.0914878323674202,
0.018578041344881058,
0.044736627489328384,
0.03501724824309349,
0.006014477461576462,
0.0387314036488533,
0.0649004653096199,
0.045538801699876785,
0.05557422712445259,
-0.08029241114854813,
0.06162654981017113,
-0.04891183599829674,
-0.10321800410747528,
-0.050394173711538315,
-0.016714325174689293,
-0.0016442742198705673,
0.06320880353450775,
0.11279057711362839,
-0.04142478480935097,
0.09512277692556381,
-0.05688010901212692,
0.04804857820272446,
0.03463352471590042,
-0.1665230542421341,
-0.009803002700209618,
-0.09014918655157089,
0.054200898855924606,
-0.004442691802978516,
0.18925540149211884,
0.016100436449050903,
-0.039559781551361084,
0.03308398649096489,
0.052956193685531616,
0.06094586104154587,
-0.012211913242936134,
0.16464059054851532,
0.09356739372015,
-0.03789687901735306,
-0.0842437595129013,
0.10526186227798462,
0.037834834307432175,
0.042557213455438614,
0.181358203291893,
-0.03235119953751564,
-0.004258992150425911,
0.06393580883741379,
-0.0012542108306661248,
0.021771101281046867,
-0.06848332285881042,
-0.09624437987804413,
-0.023780550807714462,
0.028394624590873718,
-0.04122820496559143,
0.1024758443236351,
0.1570950150489807,
0.006305345334112644,
0.021357951685786247,
-0.019086776301264763,
-0.04700441285967827,
-0.18283507227897644,
-0.2042151242494583,
-0.09570885449647903,
-0.13881100714206696,
0.009403745643794537,
-0.13563555479049683,
0.034587398171424866,
0.04926532506942749,
0.10568464547395706,
-0.04805496707558632,
0.02968650870025158,
0.03615003824234009,
-0.10194458067417145,
0.0300851259380579,
-0.046982377767562866,
0.07135210931301117,
-0.030566664412617683,
0.01073827501386404,
-0.05872552469372749,
0.032973624765872955,
0.011353881098330021,
0.04211796075105667,
-0.018503056839108467,
0.012526389211416245,
-0.11507082730531693,
-0.08914811164140701,
-0.06910025328397751,
0.06280001997947693,
-0.006708692759275436,
0.18563491106033325,
0.006522355601191521,
-0.024309067055583,
0.03197906166315079,
0.2117907851934433,
-0.05863218754529953,
-0.08671505749225616,
-0.08893375098705292,
0.1952502727508545,
-0.026317564770579338,
0.08666965365409851,
-0.036490630358457565,
0.01468501053750515,
-0.0690336599946022,
0.3421747088432312,
0.3011961579322815,
-0.10185328125953674,
0.0090860016644001,
0.00027344340924173594,
0.04235728085041046,
0.13437524437904358,
0.07730095088481903,
0.10484389960765839,
0.2565236985683441,
-0.07556919753551483,
-0.07091352343559265,
-0.006016858853399754,
-0.02315925434231758,
-0.05420056730508804,
0.03975437581539154,
0.05635395273566246,
-0.06102481111884117,
-0.018238332122564316,
0.11166704446077347,
-0.2584437429904938,
0.04915463179349899,
-0.13435469567775726,
-0.1539543867111206,
-0.07207808643579483,
-0.0010578609071671963,
0.0857405811548233,
-0.003999157343059778,
0.09940474480390549,
-0.013319055549800396,
-0.07410050928592682,
0.05786523595452309,
0.025369083508849144,
-0.20393668115139008,
-0.003335252171382308,
0.09156692028045654,
-0.021407954394817352,
-0.057005833834409714,
-0.02727084793150425,
0.08719262480735779,
0.10072382539510727,
0.021201500669121742,
-0.027625735849142075,
0.04862821847200394,
0.0011868448927998543,
-0.07157713174819946,
0.0092546958476305,
0.003555761417374015,
0.02462819777429104,
-0.05019499734044075,
0.06967010349035263,
-0.17968682944774628,
0.02342568151652813,
-0.009579677134752274,
-0.0557190477848053,
-0.0028482882771641016,
0.036172814667224884,
-0.059282705187797546,
0.08246975392103195,
0.08537662029266357,
-0.020291492342948914,
-0.03674669936299324,
-0.026885442435741425,
-0.007880412973463535,
-0.033664725720882416,
-0.10741671174764633,
-0.11613517254590988,
-0.16292908787727356,
-0.0994349792599678,
0.0633213222026825,
-0.001649370533414185,
-0.2217893749475479,
0.011757832951843739,
-0.12931087613105774,
0.056664083153009415,
-0.09835952520370483,
0.08663330227136612,
0.08248607814311981,
0.018925735726952553,
-0.00009301872341893613,
0.037283118814229965,
0.039343737065792084,
0.07346159964799881,
-0.1296456903219223,
-0.08382939547300339
] |
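The Rick Morty DialoGPT card above carries only a title, with its row tagging the checkpoint as a GPT-2 based conversational text-generation model. Below is a minimal usage sketch, not a snippet taken from that card: it assumes the checkpoint follows the standard DialoGPT chat format of EOS-separated turns, and the prompt string is purely illustrative.

    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Model id taken from the row above; the tags mark it as a GPT-2
    # conversational checkpoint, so the usual DialoGPT pattern of
    # EOS-terminated utterances is assumed here.
    tokenizer = AutoTokenizer.from_pretrained("RishabhRawatt/DialoGPT-small-Rickmorty")
    model = AutoModelForCausalLM.from_pretrained("RishabhRawatt/DialoGPT-small-Rickmorty")

    # Encode one user utterance, terminated by the EOS token.
    input_ids = tokenizer.encode("Hello, Rick!" + tokenizer.eos_token, return_tensors="pt")

    # Generate a reply; pad_token_id=eos silences the GPT-2 padding warning.
    reply_ids = model.generate(input_ids, max_length=100, pad_token_id=tokenizer.eos_token_id)
    print(tokenizer.decode(reply_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))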
null | null | transformers |
# Kela DialoGPT Model | {"tags": ["conversational"]} | text-generation | RishabhRawatt/DialoGPT-small-kela | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Kela DialoGPT Model | [
"# Kela DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Kela DialoGPT Model"
] | [
51,
7
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Kela DialoGPT Model"
] | [
0.0065862806513905525,
0.011472102254629135,
-0.005753062199801207,
-0.005551018286496401,
0.1246688961982727,
0.006418779958039522,
0.16271132230758667,
0.13330766558647156,
0.0013793043326586485,
-0.04087405651807785,
0.11995735764503479,
0.2000672072172165,
-0.0023558640386909246,
0.15824182331562042,
-0.06626096367835999,
-0.3039206862449646,
0.060308899730443954,
0.045169178396463394,
0.07938119769096375,
0.11422015726566315,
0.09173010289669037,
-0.04340258240699768,
0.07748151570558548,
0.007448249496519566,
-0.16029419004917145,
-0.002393044764176011,
0.004142934922128916,
-0.13855107128620148,
0.10405027121305466,
0.04949670284986496,
0.04345952346920967,
-0.0041982694528996944,
-0.07791290432214737,
-0.17885936796665192,
0.03300226107239723,
-0.022961147129535675,
-0.01775234378874302,
0.017930004745721817,
-0.007250380236655474,
-0.08229897916316986,
0.1117677167057991,
0.0785100981593132,
-0.020220521837472916,
0.017899267375469208,
-0.1394069492816925,
0.03501807153224945,
0.0003605177334975451,
0.0674765557050705,
0.11065950244665146,
0.10325247794389725,
-0.04498753324151039,
0.09338096529245377,
-0.08496133238077164,
0.11153384298086166,
0.08906968683004379,
-0.32498782873153687,
-0.01081741601228714,
0.1105833649635315,
0.007197564467787743,
0.020323092117905617,
-0.031644418835639954,
0.094619520008564,
0.011680559255182743,
-0.03187280520796776,
-0.03822953253984451,
-0.09711577743291855,
-0.13048754632472992,
-0.005685938522219658,
-0.0812905877828598,
-0.00024123225011862814,
0.24964840710163116,
-0.04345433786511421,
0.07061729580163956,
-0.08503887057304382,
-0.08889838308095932,
0.024552607908844948,
-0.03919408097863197,
-0.035735879093408585,
-0.08004669845104218,
0.04524615779519081,
0.01247695367783308,
-0.11175982654094696,
-0.11412957310676575,
-0.03140600025653839,
-0.18715497851371765,
0.20480622351169586,
0.06570617854595184,
0.006218547001481056,
-0.20600289106369019,
0.10889752954244614,
-0.012989903800189495,
-0.10349195450544357,
-0.005797092337161303,
-0.0719858780503273,
0.043971624225378036,
0.01699654757976532,
-0.011378674767911434,
-0.03730522468686104,
0.07065648585557938,
0.07312684506177902,
0.029170870780944824,
0.00460011325776577,
-0.00006005913019180298,
0.05407748371362686,
0.04774833470582962,
0.0978013351559639,
0.002638131845742464,
-0.059998199343681335,
0.02645222842693329,
-0.09778524190187454,
0.0006142797064967453,
-0.054892703890800476,
-0.16571924090385437,
-0.041749559342861176,
0.019705872982740402,
0.07069175690412521,
0.014695440419018269,
0.13860684633255005,
-0.006236833054572344,
-0.037999413907527924,
-0.01577974669635296,
-0.03168804198503494,
-0.03097514994442463,
-0.013829823583364487,
-0.01007686834782362,
0.1384047269821167,
0.00969985593110323,
0.055660881102085114,
-0.12997496128082275,
0.015507696196436882,
-0.05193072184920311,
0.017484256997704506,
-0.023706329986453056,
-0.03112652152776718,
-0.016568796709179878,
-0.07305625081062317,
0.014365850947797298,
-0.16213223338127136,
-0.16597655415534973,
-0.009593806229531765,
-0.02178374119102955,
-0.051550913602113724,
-0.10945422947406769,
-0.1044507846236229,
-0.031247084960341454,
0.036954112350940704,
-0.06660051643848419,
-0.06582114100456238,
-0.05969121679663658,
0.05659066513180733,
-0.02249814197421074,
0.09509801119565964,
-0.08133519440889359,
0.08104207366704941,
-0.09215853363275528,
-0.026317384093999863,
-0.15322162210941315,
0.1394803375005722,
0.00047259466373361647,
0.04786323010921478,
-0.04516536369919777,
-0.0003309382009319961,
-0.09965913742780685,
0.06260164827108383,
-0.037325043231248856,
0.23893846571445465,
-0.0874660313129425,
-0.1075122058391571,
0.27151811122894287,
-0.03321703150868416,
-0.12315405905246735,
0.101908840239048,
0.003020001109689474,
0.13276107609272003,
0.13018196821212769,
0.23789337277412415,
0.03482001647353172,
-0.02762727625668049,
0.05817705765366554,
0.12176112830638885,
-0.06532532721757889,
-0.02295513264834881,
0.013985619880259037,
-0.033831626176834106,
-0.09136223047971725,
0.03912315517663956,
0.042175207287073135,
0.07474137842655182,
-0.03891481086611748,
-0.022855399176478386,
0.002723413286730647,
-0.01019111555069685,
0.051521528512239456,
-0.05056494101881981,
0.14012964069843292,
-0.002163770142942667,
-0.023460181429982185,
0.043308813124895096,
0.03662931174039841,
-0.04873529449105263,
0.03988679498434067,
-0.0617518424987793,
0.10069119185209274,
-0.020593609660863876,
0.06664750725030899,
-0.0981643944978714,
-0.07687454670667648,
-0.021860947832465172,
0.11867821961641312,
0.06562890857458115,
0.10246315598487854,
0.02823229320347309,
-0.061792753636837006,
-0.019627287983894348,
0.025328397750854492,
0.15270546078681946,
-0.004522471223026514,
-0.09801170974969864,
-0.084995336830616,
0.13145172595977783,
-0.044948168098926544,
0.09057793766260147,
-0.03725923225283623,
-0.000507097109220922,
-0.006342081353068352,
0.11845655739307404,
-0.02884979359805584,
0.051068421453237534,
0.03413962572813034,
-0.006597544066607952,
-0.04244763404130936,
0.010361682623624802,
0.09229423105716705,
-0.015018920414149761,
-0.0658428966999054,
0.2729897201061249,
-0.16106407344341278,
0.12862415611743927,
0.17236310243606567,
-0.1567658931016922,
0.003310158848762512,
-0.11223533004522324,
-0.024847306311130524,
-0.010040401481091976,
0.047012705355882645,
-0.0007505291723646224,
0.2603665292263031,
-0.008409482426941395,
0.16362817585468292,
-0.04573885723948479,
-0.0637756958603859,
-0.02981695719063282,
-0.0736197829246521,
0.012490794993937016,
0.08486364781856537,
0.07269015163183212,
-0.16943316161632538,
0.14531269669532776,
0.11404819786548615,
0.00975827593356371,
0.22898726165294647,
0.03481212258338928,
0.00014862034004181623,
0.04004573076963425,
0.013622911646962166,
-0.06920401006937027,
-0.05068889632821083,
-0.25187844038009644,
-0.04440543055534363,
0.06495414674282074,
0.021366233006119728,
0.11392972618341446,
-0.08012755215167999,
-0.033088717609643936,
-0.020921481773257256,
-0.02402939274907112,
0.05417638272047043,
0.15196380019187927,
0.03395327553153038,
0.11582737416028976,
-0.004721502307802439,
-0.04855339601635933,
0.0499902069568634,
0.02335994504392147,
-0.04924241825938225,
0.21091295778751373,
-0.12986838817596436,
-0.32936322689056396,
-0.09347553551197052,
-0.1159583181142807,
-0.06927509605884552,
0.021385200321674347,
0.08041795343160629,
-0.1166687160730362,
-0.008625010028481483,
0.0023614063393324614,
0.12019804865121841,
-0.06466490775346756,
0.0015251414151862264,
-0.04343768209218979,
-0.029404278844594955,
-0.13537073135375977,
-0.1003011092543602,
-0.05721645429730415,
-0.03078380785882473,
-0.08106592297554016,
0.10066543519496918,
-0.11988366395235062,
0.030244801193475723,
0.23732484877109528,
0.03835973143577576,
0.04690701141953468,
-0.03948567062616348,
0.18658122420310974,
-0.11836647242307663,
0.006935718934983015,
0.1413474678993225,
-0.028144696727395058,
0.0577087476849556,
0.15736302733421326,
-0.027457568794488907,
-0.06742120534181595,
0.04007548838853836,
0.0081512201577425,
-0.05775727704167366,
-0.227072075009346,
-0.12507393956184387,
-0.0967194214463234,
0.08878970891237259,
-0.014363315887749195,
0.04049532860517502,
0.18355678021907806,
0.08704876154661179,
-0.03141303360462189,
0.0234840027987957,
0.0406787246465683,
0.07928895950317383,
0.3121143877506256,
-0.07301105558872223,
0.131311297416687,
-0.022682372480630875,
-0.16515573859214783,
0.05658315122127533,
0.08475734293460846,
0.09331373870372772,
0.02010553702712059,
0.0019741496071219444,
0.03610288351774216,
0.039885152131319046,
0.11405841261148453,
0.04009897634387016,
-0.0009856309043243527,
-0.0361943282186985,
-0.03015611134469509,
-0.03404509276151657,
0.0026020435616374016,
0.05394117906689644,
0.09441131353378296,
-0.16208447515964508,
-0.01501955185085535,
0.014112846925854683,
0.08682392537593842,
0.04749467223882675,
0.12295167148113251,
-0.16510754823684692,
-0.011675083078444004,
0.06463830918073654,
-0.05520256236195564,
-0.13282035291194916,
0.08420107513666153,
0.02634529024362564,
-0.1510351449251175,
0.06082950159907341,
-0.00933435931801796,
0.08430884033441544,
-0.11367900669574738,
0.05985157564282417,
-0.10019341856241226,
-0.060705624520778656,
0.012282483279705048,
0.1052033081650734,
-0.2882932424545288,
0.23658211529254913,
-0.002848056610673666,
-0.03817833214998245,
-0.0984405130147934,
-0.020865527912974358,
0.015393922105431557,
0.07418190687894821,
0.11822908371686935,
-0.003337469883263111,
0.07932782173156738,
-0.02469080500304699,
-0.0684003084897995,
0.04028620570898056,
0.10864301025867462,
-0.06989291310310364,
-0.009409877471625805,
-0.034967485815286636,
-0.0015569961396977305,
-0.033244647085666656,
-0.0748867467045784,
-0.04354133829474449,
-0.16268979012966156,
0.08085060864686966,
0.11057605594396591,
0.055673543363809586,
0.039350491017103195,
-0.03352324664592743,
-0.041294243186712265,
0.2312707006931305,
0.00819102581590414,
-0.0958002582192421,
-0.07544261962175369,
-0.08963582664728165,
0.07364542037248611,
-0.09028621017932892,
0.044766493141651154,
-0.07825422286987305,
0.009642230346798897,
-0.05373072624206543,
-0.16708813607692719,
0.09527996182441711,
-0.0931781455874443,
-0.03494754433631897,
-0.003662517061457038,
0.19842426478862762,
-0.022319583222270012,
-0.002620552433654666,
0.06579228490591049,
0.004833701997995377,
-0.08461187034845352,
-0.0955640897154808,
-0.02604678086936474,
0.07393987476825714,
0.010748174972832203,
0.03906325250864029,
-0.04203936457633972,
-0.013786534778773785,
-0.09047246724367142,
-0.02679932489991188,
0.3099651336669922,
0.13634392619132996,
-0.034537091851234436,
0.16989846527576447,
0.11715604364871979,
-0.08549728244543076,
-0.29222598671913147,
-0.11593003571033478,
-0.073028065264225,
-0.03875867277383804,
-0.0907474160194397,
-0.18667590618133545,
0.09716112911701202,
-0.04479900002479553,
-0.024315370246767998,
0.027015695348381996,
-0.2831001877784729,
-0.1327575445175171,
0.2210637778043747,
-0.0285202544182539,
0.427783727645874,
-0.0808982253074646,
-0.09233923256397247,
-0.051964666694402695,
-0.1140768900513649,
0.132159024477005,
-0.045061059296131134,
0.13558994233608246,
-0.01737367734313011,
0.19096174836158752,
0.052259720861911774,
0.010108564980328083,
0.07616795599460602,
0.07681576162576675,
-0.058133020997047424,
-0.11489492654800415,
-0.08499757945537567,
-0.028515834361314774,
-0.010054192505776882,
0.08691855520009995,
-0.05075491592288017,
0.04996132105588913,
-0.14663931727409363,
-0.048289790749549866,
-0.08583836257457733,
0.06388910859823227,
0.04411107674241066,
-0.07538218051195145,
-0.03684106469154358,
-0.04484201595187187,
-0.0065010287798941135,
0.014618088491261005,
0.14219532907009125,
-0.049767836928367615,
0.1256941705942154,
0.044403910636901855,
0.11994099617004395,
-0.058079496026039124,
0.006890340242534876,
-0.04504651203751564,
-0.05426511541008949,
0.07322737574577332,
-0.054828934371471405,
-0.006394923198968172,
0.11102080345153809,
-0.029456308111548424,
0.06728750467300415,
0.07051094621419907,
-0.019899344071745872,
0.017584215849637985,
0.09154242277145386,
-0.23700229823589325,
-0.10206900537014008,
-0.0707496926188469,
-0.027873419225215912,
0.09527596831321716,
0.10699407011270523,
0.22617806494235992,
-0.0331588089466095,
-0.031353909522295,
0.0004844233044423163,
0.009798983111977577,
-0.04048100486397743,
0.09103880822658539,
0.009545927867293358,
0.01415815856307745,
-0.13364091515541077,
0.07503136992454529,
-0.011213060468435287,
-0.06490164995193481,
0.06300698220729828,
0.10496360063552856,
-0.10639309883117676,
-0.13046036660671234,
-0.016484079882502556,
0.10408353060483932,
-0.11815132945775986,
-0.011354660615324974,
-0.019397277384996414,
-0.1232372373342514,
0.05205722898244858,
0.1771702766418457,
0.04567613825201988,
0.08623526990413666,
-0.0930292084813118,
-0.010743782855570316,
-0.020823923870921135,
0.022095579653978348,
0.058215752243995667,
-0.012381748296320438,
-0.09202049672603607,
0.06305503100156784,
-0.040075186640024185,
0.1449747532606125,
-0.08816017210483551,
-0.10623640567064285,
-0.16152127087116241,
0.027402643114328384,
-0.0589420348405838,
-0.07915116846561432,
-0.10188713669776917,
-0.05046064034104347,
-0.010636580176651478,
-0.03700923174619675,
-0.04090007767081261,
-0.04174035042524338,
-0.10790383070707321,
0.04369044303894043,
-0.04024817794561386,
0.055934254080057144,
-0.08165962249040604,
0.014313989318907261,
0.055410806089639664,
-0.016107574105262756,
0.16555678844451904,
0.1532488614320755,
-0.08671518415212631,
0.09986957162618637,
-0.12720456719398499,
-0.040540389716625214,
0.12276967614889145,
0.009918496944010258,
0.04272489622235298,
0.019770538434386253,
0.008999223820865154,
0.04321212321519852,
0.10958405584096909,
0.06200306862592697,
0.06748058646917343,
-0.0893954485654831,
0.005730479024350643,
-0.06975437700748444,
-0.13693562150001526,
-0.037211500108242035,
-0.020419709384441376,
0.039406146854162216,
0.04452146962285042,
0.08236769586801529,
-0.0691164880990982,
0.08066960424184799,
-0.024976614862680435,
0.03841539844870567,
0.01411752961575985,
-0.1799663007259369,
0.013324026018381119,
-0.09516798704862595,
0.03702383488416672,
0.008481764234602451,
0.21712815761566162,
0.04572049900889397,
-0.03745996206998825,
0.017156096175312996,
0.034212760627269745,
0.0125956442207098,
0.010005424730479717,
0.15793682634830475,
0.1191064640879631,
-0.05234232917428017,
-0.12381072342395782,
0.07654669880867004,
0.011329407803714275,
0.006362673826515675,
0.1175651103258133,
-0.016497788950800896,
-0.023208845406770706,
0.11462371051311493,
0.020262934267520905,
0.004115685820579529,
-0.11050429195165634,
-0.1658356934785843,
-0.09119956195354462,
0.07500311732292175,
-0.0625489205121994,
0.12737351655960083,
0.13917599618434906,
-0.041945818811655045,
0.013489619828760624,
-0.0009556380100548267,
-0.08607299625873566,
-0.20689189434051514,
-0.18729275465011597,
-0.07519309967756271,
-0.14260165393352509,
0.017616191878914833,
-0.11859889328479767,
0.04651634022593498,
0.011353245005011559,
0.0896715596318245,
-0.05592779442667961,
0.1685430109500885,
0.022686835378408432,
-0.11976120620965958,
0.13182519376277924,
-0.02280070073902607,
0.0633760318160057,
-0.07718852907419205,
0.020790131762623787,
-0.10612255334854126,
0.0334927998483181,
-0.005085852462798357,
0.03910234197974205,
-0.06311194598674774,
0.015629278495907784,
-0.11162333935499191,
-0.07009652256965637,
-0.06043102964758873,
0.0751420110464096,
0.014881282113492489,
0.10520131140947342,
0.035529974848032,
-0.05851360037922859,
0.02640864998102188,
0.23283715546131134,
-0.06493351608514786,
-0.10234275460243225,
-0.06415855884552002,
0.18812301754951477,
0.013177098706364632,
0.09769484400749207,
-0.04643722623586655,
0.0218635443598032,
-0.10913842916488647,
0.33172038197517395,
0.27229034900665283,
-0.0775330439209938,
0.019572779536247253,
0.006193004082888365,
0.0458640418946743,
0.09997642785310745,
0.09630671888589859,
0.09616153687238693,
0.316739946603775,
-0.04755445942282677,
-0.022924300283193588,
0.007617560680955648,
-0.04886360093951225,
-0.0655287355184555,
0.051713988184928894,
0.036586254835128784,
-0.0857148990035057,
-0.03342108055949211,
0.10224736481904984,
-0.29094773530960083,
0.11025043576955795,
-0.16941355168819427,
-0.17449729144573212,
-0.0900137647986412,
-0.0102442167699337,
0.08949925750494003,
0.010535799898207188,
0.0877622738480568,
-0.006829163525253534,
-0.08492053300142288,
0.093442901968956,
0.03981109708547592,
-0.1992308646440506,
-0.014807615429162979,
0.07421897351741791,
-0.020927028730511665,
-0.0458100289106369,
-0.02378140576183796,
0.09380780905485153,
0.060719963163137436,
0.0407116562128067,
-0.022279972210526466,
0.042986832559108734,
0.005987771321088076,
-0.04201161488890648,
0.03623943775892258,
0.05608682334423065,
0.012715003453195095,
-0.09500445425510406,
0.10777227580547333,
-0.112774558365345,
0.06721865385770798,
-0.020943358540534973,
-0.022335458546876907,
-0.06055256724357605,
0.035492610186338425,
-0.09547857940196991,
0.07728519290685654,
0.10442469269037247,
-0.01881450042128563,
-0.006105759646743536,
-0.0010259569389745593,
-0.022593313828110695,
-0.029293624684214592,
-0.08880644291639328,
-0.07202180474996567,
-0.12983185052871704,
-0.11844002455472946,
0.010933625511825085,
0.017493709921836853,
-0.15839339792728424,
0.005587694235146046,
-0.11479460448026657,
0.05866586044430733,
-0.13920491933822632,
0.11509591341018677,
0.0885603278875351,
0.008677576668560505,
-0.00023394798336084932,
-0.07883922755718231,
0.03026801161468029,
0.07214516401290894,
-0.1433548927307129,
-0.09471488744020462
] |
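The Kela DialoGPT card above is likewise title-only. The sketch below extends the single-turn pattern to a short multi-turn exchange by concatenating the running history; it again assumes the standard DialoGPT EOS-separated chat format, and the two prompts are placeholders.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("RishabhRawatt/DialoGPT-small-kela")
    model = AutoModelForCausalLM.from_pretrained("RishabhRawatt/DialoGPT-small-kela")

    chat_history_ids = None
    for turn in ["Hi Kela!", "What are you up to?"]:  # placeholder prompts
        new_ids = tokenizer.encode(turn + tokenizer.eos_token, return_tensors="pt")
        # Append the new utterance to the running conversation history.
        bot_input_ids = (new_ids if chat_history_ids is None
                         else torch.cat([chat_history_ids, new_ids], dim=-1))
        chat_history_ids = model.generate(
            bot_input_ids, max_length=200, pad_token_id=tokenizer.eos_token_id)
        # Print only the newly generated reply tokens.
        print(tokenizer.decode(chat_history_ids[0][bot_input_ids.shape[-1]:],
                               skip_special_tokens=True))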
null | null | transformers |
# Rick and Morty DialoGPT Model | {"tags": ["conversational"]} | text-generation | Ritchie/DialoGPT-small-Rickandmorty | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Rick and Morty DialoGPT Model | [
"# Rick and Morty DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Rick and Morty DialoGPT Model"
] | [
51,
10
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Rick and Morty DialoGPT Model"
] | [
-0.01990443281829357,
0.10367733240127563,
-0.006012056488543749,
0.013662099838256836,
0.1287931650876999,
0.004103946499526501,
0.13405320048332214,
0.13470496237277985,
-0.029608309268951416,
-0.0377325713634491,
0.1409052610397339,
0.2081032246351242,
-0.009616929106414318,
0.025026321411132812,
-0.08027864247560501,
-0.33285143971443176,
0.04419311136007309,
0.04611847549676895,
-0.04805411398410797,
0.11171722412109375,
0.09962809830904007,
-0.03511058911681175,
0.07650627940893173,
0.012189619243144989,
-0.11959464848041534,
0.014523470774292946,
0.01571112684905529,
-0.09889741986989975,
0.11399844288825989,
0.07783890515565872,
0.031239205971360207,
0.033389654010534286,
-0.042143791913986206,
-0.13308840990066528,
0.04855761677026749,
-0.0014628645731136203,
-0.03996938467025757,
0.06519230455160141,
0.0068825362250208855,
-0.09896008670330048,
0.13105708360671997,
0.11774895340204239,
-0.001342291128821671,
0.030811335891485214,
-0.1546017825603485,
-0.03095608949661255,
-0.013916928321123123,
0.04583658277988434,
0.05571185424923897,
0.1092928797006607,
-0.03970988467335701,
0.11546611040830612,
-0.046847838908433914,
0.11656361073255539,
0.13404695689678192,
-0.27711591124534607,
-0.013774634338915348,
0.14150507748126984,
0.03755388408899307,
0.031246060505509377,
-0.03764049708843231,
0.09234841167926788,
0.010574371553957462,
-0.009135077707469463,
-0.054559025913476944,
-0.07839421927928925,
-0.06956472247838974,
0.03881034255027771,
-0.08538595587015152,
-0.0028573249001055956,
0.22309143841266632,
-0.029777048155665398,
0.0931403860449791,
-0.061110686510801315,
-0.083645299077034,
0.0022445949725806713,
-0.04396601766347885,
-0.031562261283397675,
-0.0995510146021843,
0.08443354815244675,
-0.04024428874254227,
-0.08693728595972061,
-0.10731299221515656,
-0.022938303649425507,
-0.15873323380947113,
0.16214832663536072,
0.03501884266734123,
0.03956814110279083,
-0.21219894289970398,
0.07603893429040909,
-0.04213596507906914,
-0.10128775984048843,
0.025763655081391335,
-0.0809730738401413,
0.0031352867372334003,
0.01420458871871233,
-0.034850042313337326,
-0.01257789321243763,
0.09354974329471588,
0.11913833022117615,
-0.002085368847474456,
0.028482265770435333,
-0.03459439426660538,
0.04555915296077728,
0.04445279389619827,
0.04635937884449959,
-0.030874032527208328,
-0.005519113503396511,
0.024999095126986504,
-0.0903957337141037,
-0.010871811769902706,
-0.060442280024290085,
-0.1946737915277481,
0.013364237733185291,
0.05735969915986061,
0.055262304842472076,
0.030765585601329803,
0.13551434874534607,
0.0010974886827170849,
-0.0475224107503891,
0.03023342229425907,
-0.020769428461790085,
-0.016528211534023285,
0.029149476438760757,
-0.0072809201665222645,
0.1526104062795639,
0.022983204573392868,
0.05690442770719528,
-0.11451500654220581,
0.012773441150784492,
-0.03330712020397186,
-0.006917042192071676,
-0.03216493874788284,
-0.061537809669971466,
0.003289242973551154,
0.0014469954185187817,
0.013694697991013527,
-0.12761977314949036,
-0.15719962120056152,
-0.003717299085110426,
0.00613630935549736,
-0.05369097366929054,
-0.10004933178424835,
-0.10542158782482147,
-0.03153182193636894,
0.046352777630090714,
-0.053748197853565216,
0.03198752924799919,
-0.039340607821941376,
0.09383489936590195,
-0.03441528603434563,
0.0691300630569458,
-0.0863635316491127,
0.0905333161354065,
-0.06098577380180359,
-0.04111234471201897,
-0.0643690675497055,
0.12356391549110413,
0.011561519466340542,
0.04442533850669861,
-0.03781363368034363,
-0.01636880449950695,
-0.11087207496166229,
0.06495212018489838,
-0.03516015037894249,
0.22487092018127441,
-0.08996163308620453,
-0.09683383256196976,
0.22284504771232605,
-0.04562665522098541,
-0.12769415974617004,
0.12243670970201492,
-0.03600937873125076,
0.09682484716176987,
0.11536505818367004,
0.16257616877555847,
0.03866875544190407,
-0.0002237519365735352,
0.10846788436174393,
0.10610917955636978,
-0.07603283226490021,
0.006744202226400375,
0.0250004380941391,
-0.02382737584412098,
-0.09139634668827057,
0.015165179036557674,
0.07776524871587753,
0.04803644120693207,
-0.05478836968541145,
-0.015317765064537525,
0.015090391971170902,
-0.003627530997619033,
0.06564177572727203,
-0.017049036920070648,
0.11691898107528687,
-0.03955721855163574,
-0.07620245963335037,
-0.014626736752688885,
0.028113901615142822,
-0.06986767798662186,
0.026787258684635162,
-0.07962338626384735,
0.02948051132261753,
-0.01967560686171055,
0.06687499582767487,
-0.16950036585330963,
-0.09430424869060516,
-0.06010226905345917,
0.23349159955978394,
0.07496993243694305,
0.11698364466428757,
0.06350064277648926,
-0.056928664445877075,
0.0006459777359850705,
0.037900060415267944,
0.19767099618911743,
-0.006904584355652332,
-0.07503941655158997,
-0.11777795851230621,
0.10312607139348984,
-0.07375676929950714,
0.06138577312231064,
-0.0416308231651783,
0.007855354808270931,
0.019795136526226997,
0.11127804219722748,
-0.04220014438033104,
0.039965033531188965,
0.012499134056270123,
-0.03696384280920029,
-0.05908297002315521,
0.0004571304307319224,
0.09440597146749496,
-0.0005542659782804549,
-0.10514124482870102,
0.2379530370235443,
-0.21215155720710754,
0.12180843949317932,
0.1799643337726593,
-0.2256188690662384,
0.008836638182401657,
-0.10462760180234909,
-0.016665222123265266,
0.01030759233981371,
0.03996801748871803,
-0.040312353521585464,
0.24249082803726196,
-0.014560520648956299,
0.17035135626792908,
-0.04880015179514885,
-0.05010494217276573,
-0.0440804697573185,
-0.05291803553700447,
0.0003277618088759482,
0.12486644089221954,
0.09157522767782211,
-0.18372175097465515,
0.17465431988239288,
0.06325390189886093,
0.03004654310643673,
0.1566917598247528,
0.022896459326148033,
0.020663797855377197,
0.05599488690495491,
-0.0012882096925750375,
-0.03033529780805111,
-0.07880529016256332,
-0.20945574343204498,
-0.012111871503293514,
0.07547834515571594,
0.04618273675441742,
0.10363037884235382,
-0.1018955409526825,
-0.030724551528692245,
-0.006948297843337059,
-0.030821966007351875,
0.03848150745034218,
0.13554143905639648,
0.015318007208406925,
0.12024796009063721,
-0.019162237644195557,
-0.06668011844158173,
0.0741129145026207,
0.01461794413626194,
-0.09263674914836884,
0.18050695955753326,
-0.1221487745642662,
-0.3382752537727356,
-0.10329627990722656,
-0.20327065885066986,
-0.04040617123246193,
0.0422586165368557,
0.11002974957227707,
-0.1460546851158142,
-0.029720865190029144,
0.0010455691954120994,
0.08435780555009842,
-0.1366978883743286,
0.006720550823956728,
-0.017843635752797127,
-0.01294276025146246,
-0.1374056041240692,
-0.09384968876838684,
-0.04747654125094414,
-0.060003772377967834,
-0.03218422830104828,
0.10381519794464111,
-0.1596987098455429,
0.007801016326993704,
0.230968177318573,
0.04797196388244629,
0.07053504139184952,
-0.036995481699705124,
0.17910921573638916,
-0.08220451325178146,
0.016473548486828804,
0.24478016793727875,
-0.05610832944512367,
0.0740312784910202,
0.10560029745101929,
-0.005553957540541887,
-0.052998270839452744,
0.03756273165345192,
0.00788428820669651,
-0.0785532221198082,
-0.21784749627113342,
-0.1030275970697403,
-0.11046822369098663,
0.04284128174185753,
0.05120398849248886,
0.04543844982981682,
0.1585974246263504,
0.06446543335914612,
-0.05187172442674637,
-0.011306295171380043,
0.08315242826938629,
0.08576013147830963,
0.24794787168502808,
-0.06311704963445663,
0.1473274976015091,
-0.020790869370102882,
-0.16434483230113983,
0.07334780693054199,
0.06416254490613937,
0.07227631658315659,
0.06913222372531891,
0.11215730756521225,
0.0020037174690514803,
0.017364054918289185,
0.12614323198795319,
0.05889604985713959,
-0.011050567030906677,
-0.031410302966833115,
-0.04586650803685188,
-0.04347039759159088,
-0.020151739940047264,
0.041160233318805695,
0.05188119783997536,
-0.1600257307291031,
-0.02415069006383419,
0.022831739857792854,
0.046689603477716446,
-0.003216250566765666,
0.08608495444059372,
-0.19217506051063538,
-0.018159521743655205,
0.06477150321006775,
-0.0016290671192109585,
-0.09313707798719406,
0.08108778297901154,
-0.009849769994616508,
-0.09697907418012619,
0.03780587762594223,
-0.03585495799779892,
0.1301390826702118,
-0.0750122219324112,
0.07286842167377472,
-0.1119815781712532,
-0.02080838568508625,
-0.0087605444714427,
0.11860883235931396,
-0.3024371266365051,
0.1707288920879364,
-0.0030656929593533278,
-0.04842326417565346,
-0.11293680220842361,
-0.015061003156006336,
0.03821004554629326,
0.08916047215461731,
0.10371578484773636,
-0.030773809179663658,
-0.06436607241630554,
0.0791664570569992,
-0.050910793244838715,
0.03525971621274948,
0.10187692940235138,
-0.04662879928946495,
-0.014911266043782234,
-0.05685164034366608,
0.0027524156030267477,
0.02270045317709446,
-0.10804066807031631,
0.014929873868823051,
-0.19113284349441528,
0.07794220000505447,
0.0811065286397934,
0.0722472071647644,
0.04095001146197319,
-0.029467018321156502,
-0.1261810064315796,
0.2744207978248596,
0.007417048793286085,
-0.09985779225826263,
-0.11269644647836685,
0.04465123638510704,
0.05646880716085434,
-0.07145541161298752,
-0.028514720499515533,
-0.07924950867891312,
0.052012015134096146,
-0.07113154232501984,
-0.1981293261051178,
0.11338871717453003,
-0.09873685240745544,
-0.04736494645476341,
-0.03962721675634384,
0.2276533544063568,
-0.027753405272960663,
0.02130931057035923,
0.0393831804394722,
-0.001616212772205472,
-0.12734149396419525,
-0.09492160379886627,
0.004517016001045704,
-0.0013660878175869584,
0.02586340345442295,
0.022777099162340164,
-0.04388801380991936,
0.0049570053815841675,
-0.06949588656425476,
-0.0037953434512019157,
0.3158918023109436,
0.10998717695474625,
-0.04474896565079689,
0.1561327874660492,
0.10242960602045059,
-0.06360200047492981,
-0.28859275579452515,
-0.11298105865716934,
-0.07240703701972961,
-0.05466444417834282,
-0.0838940367102623,
-0.18133240938186646,
0.08497140556573868,
-0.042584747076034546,
-0.00881777424365282,
0.042027126997709274,
-0.2644155025482178,
-0.09412363916635513,
0.18815293908119202,
-0.01533579919487238,
0.4300551414489746,
-0.11307147145271301,
-0.07450833916664124,
-0.05387028306722641,
-0.13561248779296875,
0.18766070902347565,
-0.018648525699973106,
0.0966244488954544,
0.00443116994574666,
0.20654869079589844,
0.05815155804157257,
-0.0008219819865189493,
0.0747876986861229,
0.011587066575884819,
-0.0452013723552227,
-0.09014920890331268,
-0.09217863529920578,
-0.020688166841864586,
0.005974666681140661,
0.034957773983478546,
-0.0941787138581276,
0.05258546397089958,
-0.11336535215377808,
-0.05589618906378746,
-0.07209338247776031,
0.026715638116002083,
0.02418643794953823,
-0.06410122662782669,
-0.006407043896615505,
-0.048794936388731,
-0.0010418962920084596,
0.00979152973741293,
0.21295785903930664,
-0.11305148899555206,
0.12096642702817917,
0.04414689913392067,
0.1508360654115677,
-0.08366664499044418,
-0.03614836558699608,
-0.04910365119576454,
-0.05565084517002106,
0.0676501989364624,
-0.1319035291671753,
0.04462771117687225,
0.10053624957799911,
-0.030742639675736427,
0.0898696631193161,
0.11227817088365555,
-0.02972952462732792,
0.0016581144882366061,
0.07279330492019653,
-0.23832836747169495,
-0.08509121090173721,
-0.07718803733587265,
0.05435929819941521,
0.057659514248371124,
0.09007556736469269,
0.21964938938617706,
0.011087107472121716,
-0.023847850039601326,
0.027587326243519783,
0.029717741534113884,
-0.01658647321164608,
0.05797221511602402,
0.008770608343183994,
0.031205764040350914,
-0.14632299542427063,
0.04562913626432419,
-0.010501107200980186,
-0.07197817414999008,
0.03429242596030235,
0.16717956960201263,
-0.10209374874830246,
-0.12234743684530258,
-0.04288604483008385,
0.17517046630382538,
-0.13247300684452057,
-0.017495078966021538,
-0.05478521063923836,
-0.1241658553481102,
0.07977617532014847,
0.11423204839229584,
0.05072414129972458,
0.042339734733104706,
-0.09691346436738968,
-0.03881148621439934,
-0.05552472919225693,
0.01957569271326065,
0.018891409039497375,
-0.030404040589928627,
-0.037885911762714386,
0.025801094248890877,
-0.04172535613179207,
0.11203933507204056,
-0.087384894490242,
-0.09792038798332214,
-0.16838693618774414,
0.03925701230764389,
-0.049022991210222244,
-0.07899222522974014,
-0.09344983100891113,
-0.03523614630103111,
0.014231358654797077,
-0.03348008170723915,
-0.018664700910449028,
-0.02225758694112301,
-0.0958842933177948,
0.03419994190335274,
-0.048781368881464005,
-0.005008503329008818,
-0.08496184647083282,
0.017331385985016823,
0.04781922325491905,
-0.023604100570082664,
0.1431105136871338,
0.12453559041023254,
-0.11789791285991669,
0.10031480342149734,
-0.16611437499523163,
-0.06820093840360641,
0.09455996751785278,
0.02471991442143917,
0.043245621025562286,
0.028927266597747803,
0.005174829158931971,
0.04808570072054863,
0.05950818210840225,
0.03694291412830353,
0.041101954877376556,
-0.07111897319555283,
0.061451081186532974,
-0.06278520077466965,
-0.11226452142000198,
-0.04257739707827568,
-0.005422866903245449,
0.00011432790051912889,
0.07346735894680023,
0.11052975058555603,
-0.05098198726773262,
0.09580544382333755,
-0.050767768174409866,
0.046003878116607666,
0.0289035402238369,
-0.16526201367378235,
0.008764104917645454,
-0.08482556790113449,
0.05248309671878815,
0.0030253108125180006,
0.15688744187355042,
0.028536081314086914,
-0.03175791725516319,
0.02630779519677162,
0.05105529725551605,
0.06318540126085281,
-0.00840448122471571,
0.19050461053848267,
0.09726009517908096,
-0.04487645998597145,
-0.09418396651744843,
0.08849480748176575,
0.05022666975855827,
0.05143674090504646,
0.1403687596321106,
-0.020687401294708252,
0.012512898072600365,
0.07724163681268692,
0.014415515586733818,
0.017872430384159088,
-0.07756411284208298,
-0.09487451612949371,
-0.011494439095258713,
0.025514457374811172,
-0.02882363088428974,
0.1138797178864479,
0.16729387640953064,
-0.0008394720498472452,
0.013234704732894897,
-0.01801590994000435,
-0.05735309422016144,
-0.20129387080669403,
-0.1959676295518875,
-0.09400797635316849,
-0.13690303266048431,
-0.0009418319095857441,
-0.13835963606834412,
0.03616710752248764,
0.042394787073135376,
0.09917435795068741,
-0.039446551352739334,
0.019261397421360016,
0.026794444769620895,
-0.10323353111743927,
0.039175424724817276,
-0.04838612675666809,
0.09421038627624512,
-0.007761404849588871,
0.005773975048214197,
-0.046786144375801086,
0.02436385303735733,
0.02127891033887863,
0.038409680128097534,
-0.012736459262669086,
0.024856114760041237,
-0.11602245271205902,
-0.09478921443223953,
-0.058010075241327286,
0.0558818019926548,
0.0046934462152421474,
0.18179026246070862,
0.02449701726436615,
-0.03384847193956375,
0.0275272186845541,
0.19317778944969177,
-0.06196035072207451,
-0.09709009528160095,
-0.08241496980190277,
0.2182236760854721,
-0.018931716680526733,
0.09253086894750595,
-0.035876765847206116,
0.012440751306712627,
-0.07121489197015762,
0.33243879675865173,
0.29320472478866577,
-0.10524016618728638,
0.010426074266433716,
-0.0019151283195242286,
0.0405552051961422,
0.1290767937898636,
0.07575080543756485,
0.11663594841957092,
0.256552129983902,
-0.06501701474189758,
-0.057690393179655075,
-0.014668738469481468,
-0.027142031118273735,
-0.06502988189458847,
0.04214107245206833,
0.04939494654536247,
-0.07117093354463577,
-0.00912293791770935,
0.12242040783166885,
-0.24606983363628387,
0.04577518254518509,
-0.13518153131008148,
-0.14807558059692383,
-0.0726354643702507,
0.002261551097035408,
0.09914402663707733,
0.010166509076952934,
0.08546656370162964,
-0.014570544473826885,
-0.0710548534989357,
0.03896206244826317,
0.021210450679063797,
-0.2144380509853363,
0.021960165351629257,
0.07259857654571533,
-0.028754761442542076,
-0.07154250144958496,
-0.013138728216290474,
0.08338925242424011,
0.09720319509506226,
0.03173141926527023,
-0.009079075418412685,
0.04570826143026352,
-0.0000614441087236628,
-0.06747788935899734,
0.035688117146492004,
0.022403022274374962,
0.01331246830523014,
-0.05491582676768303,
0.07895619422197342,
-0.17176033556461334,
0.020258452743291855,
-0.03599786013364792,
-0.06506339460611343,
-0.006352625321596861,
0.02872123196721077,
-0.06236473098397255,
0.0810769721865654,
0.08681372553110123,
-0.010693355463445187,
-0.015406738966703415,
-0.019259916618466377,
-0.012411676347255707,
-0.028850549831986427,
-0.07069326192140579,
-0.09390060603618622,
-0.15529757738113403,
-0.12466321885585785,
0.08110006153583527,
-0.008061634376645088,
-0.2096063792705536,
0.012769150547683239,
-0.13104628026485443,
0.04622570425271988,
-0.10809949785470963,
0.09371429681777954,
0.08394473046064377,
0.020185640081763268,
-0.007141938898712397,
0.003890183288604021,
0.036074474453926086,
0.07894916087388992,
-0.13067346811294556,
-0.08049263805150986
] |
null | null | transformers |
# Harry Potter DialoGPT model | {"tags": ["conversational"]} | text-generation | RizqFarIDN/DialoGPT-medium-harrypotter | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Harry Potter DialoGPT model | [] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
51
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
-0.009697278961539268,
0.03208012506365776,
-0.007204889785498381,
0.004809224978089333,
0.16726240515708923,
0.014898733235895634,
0.09765533357858658,
0.13672804832458496,
-0.007841327227652073,
-0.031050153076648712,
0.14490588009357452,
0.20411323010921478,
-0.006439372431486845,
0.0661218985915184,
-0.07572533935308456,
-0.2683109939098358,
0.05759621039032936,
0.046649303287267685,
0.016515716910362244,
0.1200079694390297,
0.08573378622531891,
-0.05473608896136284,
0.08714032918214798,
-0.014583407901227474,
-0.150366872549057,
0.017733458429574966,
0.043394338339567184,
-0.12260226160287857,
0.11910516023635864,
0.05462685227394104,
0.07063519209623337,
0.014929565601050854,
-0.07541623711585999,
-0.1631229966878891,
0.03031250834465027,
0.01425902172923088,
-0.0594632662832737,
0.04757995903491974,
0.059961482882499695,
-0.10165371745824814,
0.10819483548402786,
0.09530027210712433,
-0.013078106567263603,
0.06798283755779266,
-0.16849711537361145,
-0.020869607105851173,
-0.01446688175201416,
0.009899779222905636,
0.05550243332982063,
0.09964893013238907,
-0.03413357585668564,
0.10497362166643143,
-0.09214533120393753,
0.11017382889986038,
0.10932035744190216,
-0.32057443261146545,
-0.005767723545432091,
0.09167823940515518,
0.039358653128147125,
0.07352814823389053,
-0.04467793554067612,
0.06258884817361832,
0.018015462905168533,
0.017986174672842026,
-0.014015024527907372,
-0.07283061742782593,
-0.11612214148044586,
0.04717336222529411,
-0.08668071031570435,
-0.059868961572647095,
0.2244078367948532,
-0.05464440956711769,
0.06881742179393768,
-0.05281897634267807,
-0.10522868484258652,
-0.04308144748210907,
-0.029833965003490448,
0.00475557055324316,
-0.07660607248544693,
0.08692064881324768,
0.00869679357856512,
-0.09547875821590424,
-0.1376667022705078,
-0.02496783249080181,
-0.1776352822780609,
0.16140350699424744,
0.02465328387916088,
0.05232657864689827,
-0.2027255892753601,
0.09623090922832489,
0.017906051129102707,
-0.08045592904090881,
0.022091427817940712,
-0.10046248883008957,
0.029131146147847176,
0.013760408386588097,
-0.04754498973488808,
-0.061387211084365845,
0.0843690037727356,
0.11199145019054413,
-0.01731434464454651,
0.025486016646027565,
-0.039331406354904175,
0.08100687712430954,
0.03553595021367073,
0.09077847748994827,
0.007288969587534666,
-0.028338588774204254,
0.025842782109975815,
-0.13719046115875244,
-0.003647835226729512,
-0.07116208970546722,
-0.16572439670562744,
-0.021088803187012672,
0.02994808368384838,
0.08289173990488052,
0.015449047088623047,
0.11682453751564026,
-0.03272046521306038,
-0.025152435526251793,
0.03602350503206253,
-0.047656361013650894,
-0.012649794109165668,
0.016648368909955025,
0.013163427822291851,
0.12399329990148544,
-0.0022096503525972366,
0.03235051408410072,
-0.13653022050857544,
0.031423524022102356,
-0.06793295592069626,
-0.003740974934771657,
-0.03486552834510803,
-0.040637075901031494,
0.009043924510478973,
-0.06862333416938782,
0.003486064961180091,
-0.15030112862586975,
-0.15063877403736115,
0.007587034720927477,
-0.007836631499230862,
-0.04107699543237686,
-0.06370922178030014,
-0.06952770054340363,
-0.013550350442528725,
0.04251532256603241,
-0.07093454152345657,
-0.011352915316820145,
-0.06403283774852753,
0.11004766076803207,
-0.03197755664587021,
0.07921615242958069,
-0.11953279376029968,
0.08390819281339645,
-0.11260783672332764,
-0.02386913076043129,
-0.060801517218351364,
0.09317506104707718,
-0.0006014376995153725,
0.09549830108880997,
-0.006563255097717047,
-0.017931854352355003,
-0.07981178909540176,
0.06445012241601944,
-0.042872510850429535,
0.21701598167419434,
-0.0615808479487896,
-0.11181682348251343,
0.28781595826148987,
-0.052628401666879654,
-0.1370542049407959,
0.11647392809391022,
0.008682746440172195,
0.05777018144726753,
0.10703510791063309,
0.19733482599258423,
-0.015276194550096989,
0.004040541127324104,
0.09471915662288666,
0.11263324320316315,
-0.11276852339506149,
-0.033160366117954254,
0.013019153848290443,
-0.04081077128648758,
-0.10867965966463089,
0.04689536616206169,
0.09810488671064377,
0.07090286910533905,
-0.04786505550146103,
-0.03377414867281914,
-0.01366397924721241,
0.0052589005790650845,
0.08885077387094498,
-0.007157256826758385,
0.10962837189435959,
-0.05819983780384064,
-0.03796621412038803,
-0.029282379895448685,
-0.012126247398555279,
-0.03951939567923546,
0.03137664496898651,
-0.043376367539167404,
0.10821941494941711,
-0.011204327456653118,
0.06364280730485916,
-0.16185984015464783,
-0.07691477984189987,
-0.017002692446112633,
0.1581239402294159,
0.024538565427064896,
0.09859629720449448,
0.0552486926317215,
-0.040398042649030685,
-0.0012767292791977525,
0.012792680412530899,
0.15581141412258148,
-0.022091681137681007,
-0.065607450902462,
-0.052166227251291275,
0.08642971515655518,
-0.05641226842999458,
0.04504093527793884,
-0.05937713757157326,
0.012367865070700645,
0.05064384639263153,
0.10342344641685486,
-0.00018274025933351368,
0.03323284164071083,
-0.008164864964783192,
0.002145637758076191,
-0.058205123990774155,
0.007405933458358049,
0.10799351334571838,
0.00036868182360194623,
-0.07365862280130386,
0.22074243426322937,
-0.17796069383621216,
0.1765957772731781,
0.1893044263124466,
-0.299345999956131,
0.017949223518371582,
-0.10759581625461578,
-0.04561871662735939,
0.014407722279429436,
0.05567655712366104,
-0.0454222597181797,
0.1703362911939621,
-0.009871348738670349,
0.18874616920948029,
-0.04946064203977585,
-0.04464937001466751,
-0.0200483538210392,
-0.05118836089968681,
-0.0024189651012420654,
0.07781197130680084,
0.10685696452856064,
-0.13992026448249817,
0.1964332014322281,
0.1621224284172058,
0.048237916082143784,
0.19945049285888672,
0.015346456319093704,
-0.011589210480451584,
0.0909530371427536,
0.005220826715230942,
-0.058739423751831055,
-0.07409929484128952,
-0.2594851851463318,
-0.030033592134714127,
0.07992640137672424,
0.0422382652759552,
0.1212305948138237,
-0.11349532753229141,
-0.038956157863140106,
-0.01763172075152397,
-0.023146281018853188,
0.021672505885362625,
0.0914369598031044,
0.06075398623943329,
0.13201528787612915,
-0.001710098935291171,
-0.007300339173525572,
0.10524573177099228,
0.01783694699406624,
-0.09354141354560852,
0.18308524787425995,
-0.13652534782886505,
-0.37097251415252686,
-0.13911493122577667,
-0.18057456612586975,
-0.05449081212282181,
0.05712554603815079,
0.11679314076900482,
-0.12011238187551498,
-0.018752124160528183,
0.01578843593597412,
0.10931742936372757,
-0.08449502289295197,
0.0021454424131661654,
-0.06880278885364532,
0.0321490578353405,
-0.10310184955596924,
-0.09194442629814148,
-0.055416494607925415,
-0.031392451375722885,
-0.08001253753900528,
0.1423761546611786,
-0.10777941346168518,
0.04476889222860336,
0.20262959599494934,
0.04653622955083847,
0.05625178664922714,
-0.044105201959609985,
0.19377262890338898,
-0.11264272034168243,
-0.01661740615963936,
0.19215328991413116,
-0.048360925167798996,
0.07476246356964111,
0.1232115849852562,
-0.006348740309476852,
-0.08765771239995956,
0.03011748194694519,
-0.02085109055042267,
-0.07988511025905609,
-0.23219464719295502,
-0.13938382267951965,
-0.12429051846265793,
0.09477275609970093,
0.028005298227071762,
0.056365787982940674,
0.17219258844852448,
0.06577219814062119,
-0.038416244089603424,
0.006410336587578058,
0.02959546446800232,
0.08237514644861221,
0.23417828977108002,
-0.06035616248846054,
0.1364797055721283,
-0.03420931473374367,
-0.14982740581035614,
0.08169995993375778,
0.0713929831981659,
0.10213395953178406,
0.06678459793329239,
0.0804823637008667,
0.0149586396291852,
0.06188136339187622,
0.1311223804950714,
0.08191446959972382,
0.019586285576224327,
-0.02480296604335308,
-0.03388110175728798,
-0.025523077696561813,
-0.05937909707427025,
0.040128443390131,
0.06589099019765854,
-0.16763372719287872,
-0.039227183908224106,
-0.09338314831256866,
0.09657008945941925,
0.0873042419552803,
0.06609832495450974,
-0.1842060089111328,
-0.008006223477423191,
0.08488986641168594,
-0.03854905813932419,
-0.13727426528930664,
0.09535189718008041,
0.01523482333868742,
-0.15144726634025574,
0.03139317408204079,
-0.04061909019947052,
0.12188644707202911,
-0.07804752141237259,
0.09809603542089462,
-0.08108244836330414,
-0.07448557764291763,
0.02123199962079525,
0.1261177361011505,
-0.30527687072753906,
0.20240111649036407,
-0.0024993624538183212,
-0.06486981362104416,
-0.1243603527545929,
-0.0032166161108762026,
0.002410882618278265,
0.07357452809810638,
0.10519039630889893,
-0.007196315098553896,
0.001897757756523788,
-0.06300821900367737,
-0.01829923689365387,
0.032471053302288055,
0.13080233335494995,
-0.0401318334043026,
-0.021158374845981598,
-0.050194524228572845,
-0.001653497340157628,
-0.03173094615340233,
-0.06934895366430283,
0.02002747356891632,
-0.19509181380271912,
0.08751901984214783,
0.04166261479258537,
0.09648149460554123,
0.029994789510965347,
0.004265148192644119,
-0.09651939570903778,
0.24698667228221893,
-0.07148019969463348,
-0.10072879493236542,
-0.10919588059186935,
-0.046813901513814926,
0.03569883480668068,
-0.05628936365246773,
0.04309194162487984,
-0.0788632407784462,
0.028997479006648064,
-0.06352769583463669,
-0.19235502183437347,
0.12410202622413635,
-0.09027006477117538,
-0.04412810131907463,
-0.02371402643620968,
0.2110891044139862,
-0.05598580464720726,
0.010335659608244896,
0.02930437959730625,
0.01208863127976656,
-0.11645778268575668,
-0.09678568691015244,
0.031018631532788277,
-0.007351789623498917,
0.050603240728378296,
0.041841957718133926,
-0.05915454775094986,
-0.017138581722974777,
-0.052199993282556534,
-0.022926922887563705,
0.3496883809566498,
0.14231905341148376,
-0.043836336582899094,
0.19347235560417175,
0.12347975373268127,
-0.07452994585037231,
-0.3159443140029907,
-0.1066238060593605,
-0.10937739163637161,
-0.04680149629712105,
-0.07012093812227249,
-0.2002030611038208,
0.06474938243627548,
0.00662544509395957,
-0.013415241613984108,
0.12749312818050385,
-0.2561831772327423,
-0.07571036368608475,
0.15906259417533875,
-0.017980827018618584,
0.3745945692062378,
-0.1168576180934906,
-0.10926306992769241,
-0.03950892388820648,
-0.14175476133823395,
0.16968177258968353,
-0.01989765651524067,
0.11221715062856674,
-0.009765521623194218,
0.14388824999332428,
0.05548359826207161,
-0.023479344323277473,
0.08544106781482697,
0.004999885335564613,
-0.03290518373250961,
-0.10304180532693863,
-0.05676887184381485,
0.007092386484146118,
0.02477436140179634,
0.018026655539870262,
-0.041834570467472076,
0.02227151393890381,
-0.11731979995965958,
-0.04657655209302902,
-0.08982590585947037,
0.04431166127324104,
0.03899754583835602,
-0.07325074821710587,
-0.002380647463724017,
-0.07165111601352692,
-0.012272949330508709,
0.022334342822432518,
0.20356793701648712,
-0.08029330521821976,
0.16448934376239777,
0.09239562600851059,
0.12419285625219345,
-0.14376309514045715,
-0.00019283240544609725,
-0.0762530043721199,
-0.05611240118741989,
0.07737895101308823,
-0.09433035552501678,
0.058893077075481415,
0.10901971161365509,
-0.04567738622426987,
0.08828683942556381,
0.10377411544322968,
0.008936077356338501,
0.003213887568563223,
0.10916902124881744,
-0.2667325437068939,
-0.0296600554138422,
-0.07532413303852081,
0.000883326749317348,
0.09092561900615692,
0.08562852442264557,
0.18840822577476501,
0.025361526757478714,
-0.04293036088347435,
-0.002770674182102084,
0.028597986325621605,
-0.039021048694849014,
0.051667019724845886,
0.001123449532315135,
0.01947369985282421,
-0.1530752182006836,
0.072522833943367,
0.01490565575659275,
-0.15215420722961426,
0.021316176280379295,
0.16572684049606323,
-0.11656328290700912,
-0.1283872276544571,
-0.06520111113786697,
0.08313824236392975,
-0.11755692958831787,
-0.01578943058848381,
-0.03279297426342964,
-0.13145680725574493,
0.07992171496152878,
0.12629036605358124,
0.05557859688997269,
0.0972496047616005,
-0.06061713397502899,
-0.020469192415475845,
-0.018721895292401314,
-0.014099318534135818,
-0.012384648434817791,
-0.007667020428925753,
-0.055978111922740936,
0.0590752474963665,
-0.026677248999476433,
0.1425808072090149,
-0.09221141785383224,
-0.1037059873342514,
-0.16142144799232483,
0.0374140702188015,
-0.11013076454401016,
-0.08825794607400894,
-0.08821134269237518,
-0.050188567489385605,
0.002360827289521694,
-0.019856395199894905,
-0.04037635400891304,
-0.05829505994915962,
-0.12300454825162888,
0.0338277705013752,
-0.040771447122097015,
0.024727050215005875,
-0.07512269169092178,
0.015856385231018066,
0.08507686108350754,
-0.03285100311040878,
0.15655414760112762,
0.1450488418340683,
-0.1006515845656395,
0.10741901397705078,
-0.14806775748729706,
-0.09138492494821548,
0.11116421222686768,
0.015329592861235142,
0.0449691042304039,
0.09723787009716034,
0.013362943194806576,
0.0635865181684494,
0.032776717096567154,
0.05308786407113075,
0.027619892731308937,
-0.11959987878799438,
0.06483134627342224,
-0.03626115620136261,
-0.14700546860694885,
-0.049338050186634064,
-0.05282869189977646,
0.01647452637553215,
0.013054544106125832,
0.09622690081596375,
-0.05301849544048309,
0.10698331147432327,
-0.04055701196193695,
0.0346808135509491,
0.017554637044668198,
-0.1730053424835205,
-0.03816922754049301,
-0.08538098633289337,
0.03681723028421402,
0.014741539023816586,
0.25266793370246887,
0.030072299763560295,
0.012416383251547813,
0.032671261578798294,
0.08285367488861084,
0.03899408504366875,
0.010228337720036507,
0.17482228577136993,
0.1162426546216011,
-0.06621865928173065,
-0.10445023328065872,
0.0729617029428482,
0.016332454979419708,
0.01286179106682539,
0.13617953658103943,
0.008365051820874214,
0.005795429926365614,
0.08649782836437225,
-0.016865963116288185,
0.009968153201043606,
-0.10052056610584259,
-0.13426925241947174,
-0.022176474332809448,
0.05151832848787308,
-0.04655967652797699,
0.11727844923734665,
0.1406494379043579,
-0.01806013658642769,
0.03222079202532768,
-0.021771740168333054,
-0.05699979141354561,
-0.1683429479598999,
-0.1429590880870819,
-0.06883849948644638,
-0.13416796922683716,
0.00897989235818386,
-0.11180389672517776,
0.05395037308335304,
0.06001098081469536,
0.06750501692295074,
-0.06899319589138031,
0.10220931470394135,
0.04626858979463577,
-0.11440542340278625,
0.06264589726924896,
-0.0296088308095932,
0.09430401772260666,
-0.02759445086121559,
-0.019505485892295837,
-0.09039592742919922,
0.014574515633285046,
0.011419114656746387,
0.06245238706469536,
-0.04707273095846176,
0.007463190704584122,
-0.14696238934993744,
-0.08972041308879852,
-0.0523175448179245,
0.0718572810292244,
-0.050409089773893356,
0.14282815158367157,
0.00775480642914772,
-0.0170906875282526,
0.039554283022880554,
0.22787313163280487,
-0.07476283609867096,
-0.04778539761900902,
-0.05269690603017807,
0.20717895030975342,
0.02975541539490223,
0.1171872541308403,
-0.022938819602131844,
-0.006106364540755749,
-0.0919521227478981,
0.3764844834804535,
0.30030161142349243,
-0.09031439572572708,
0.011794124729931355,
0.02137952297925949,
0.04502861574292183,
0.1316293478012085,
0.1216534823179245,
0.10318691283464432,
0.3006802201271057,
-0.07452366501092911,
-0.04653361067175865,
-0.012629742734134197,
-0.023858042433857918,
-0.09059546142816544,
0.1021224707365036,
0.04839762672781944,
-0.06382183730602264,
-0.03313443064689636,
0.0954432487487793,
-0.25862133502960205,
0.1277991235256195,
-0.12311873584985733,
-0.17578600347042084,
-0.06654827296733856,
0.009760108776390553,
0.10465722531080246,
0.015642458572983742,
0.0946015790104866,
0.007128213066607714,
-0.11252258718013763,
0.06305865943431854,
0.03397420793771744,
-0.22762253880500793,
0.0006893770187161863,
0.06642123311758041,
-0.07006710022687912,
-0.0024247700348496437,
-0.026499588042497635,
0.05657242611050606,
0.0656052976846695,
0.054629553109407425,
-0.00971333310008049,
0.03816632181406021,
0.0034184439573436975,
-0.0585215799510479,
0.016623929142951965,
0.05121519789099693,
0.02472509816288948,
-0.09763528406620026,
0.06927435845136642,
-0.1574270874261856,
0.04766253009438515,
-0.0030655991286039352,
-0.04124255105853081,
0.006064958870410919,
0.008823691867291927,
-0.06491616368293762,
0.05165379121899605,
0.07916834205389023,
-0.0016257909592241049,
-0.0062433634884655476,
-0.057178743183612823,
-0.02632102556526661,
-0.027755750343203545,
-0.09291748702526093,
-0.10495562851428986,
-0.14682936668395996,
-0.11640441417694092,
0.09368976950645447,
-0.01011267676949501,
-0.1848134547472,
0.022154374048113823,
-0.08606051653623581,
0.08319322764873505,
-0.1670055389404297,
0.08040720224380493,
0.07041648775339127,
0.013038921169936657,
-0.0031511052511632442,
-0.02002427540719509,
0.054132770746946335,
0.086809903383255,
-0.10407156497240067,
-0.07400695979595184
] |
null | null | transformers |
# Harry Potter DialoGPT model | {"tags": ["conversational"]} | text-generation | RizqFarIDN/DialoGPT-small-harrypotter | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Harry Potter DialoGPT model | [] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
51
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
-0.009697278961539268,
0.03208012506365776,
-0.007204889785498381,
0.004809224978089333,
0.16726240515708923,
0.014898733235895634,
0.09765533357858658,
0.13672804832458496,
-0.007841327227652073,
-0.031050153076648712,
0.14490588009357452,
0.20411323010921478,
-0.006439372431486845,
0.0661218985915184,
-0.07572533935308456,
-0.2683109939098358,
0.05759621039032936,
0.046649303287267685,
0.016515716910362244,
0.1200079694390297,
0.08573378622531891,
-0.05473608896136284,
0.08714032918214798,
-0.014583407901227474,
-0.150366872549057,
0.017733458429574966,
0.043394338339567184,
-0.12260226160287857,
0.11910516023635864,
0.05462685227394104,
0.07063519209623337,
0.014929565601050854,
-0.07541623711585999,
-0.1631229966878891,
0.03031250834465027,
0.01425902172923088,
-0.0594632662832737,
0.04757995903491974,
0.059961482882499695,
-0.10165371745824814,
0.10819483548402786,
0.09530027210712433,
-0.013078106567263603,
0.06798283755779266,
-0.16849711537361145,
-0.020869607105851173,
-0.01446688175201416,
0.009899779222905636,
0.05550243332982063,
0.09964893013238907,
-0.03413357585668564,
0.10497362166643143,
-0.09214533120393753,
0.11017382889986038,
0.10932035744190216,
-0.32057443261146545,
-0.005767723545432091,
0.09167823940515518,
0.039358653128147125,
0.07352814823389053,
-0.04467793554067612,
0.06258884817361832,
0.018015462905168533,
0.017986174672842026,
-0.014015024527907372,
-0.07283061742782593,
-0.11612214148044586,
0.04717336222529411,
-0.08668071031570435,
-0.059868961572647095,
0.2244078367948532,
-0.05464440956711769,
0.06881742179393768,
-0.05281897634267807,
-0.10522868484258652,
-0.04308144748210907,
-0.029833965003490448,
0.00475557055324316,
-0.07660607248544693,
0.08692064881324768,
0.00869679357856512,
-0.09547875821590424,
-0.1376667022705078,
-0.02496783249080181,
-0.1776352822780609,
0.16140350699424744,
0.02465328387916088,
0.05232657864689827,
-0.2027255892753601,
0.09623090922832489,
0.017906051129102707,
-0.08045592904090881,
0.022091427817940712,
-0.10046248883008957,
0.029131146147847176,
0.013760408386588097,
-0.04754498973488808,
-0.061387211084365845,
0.0843690037727356,
0.11199145019054413,
-0.01731434464454651,
0.025486016646027565,
-0.039331406354904175,
0.08100687712430954,
0.03553595021367073,
0.09077847748994827,
0.007288969587534666,
-0.028338588774204254,
0.025842782109975815,
-0.13719046115875244,
-0.003647835226729512,
-0.07116208970546722,
-0.16572439670562744,
-0.021088803187012672,
0.02994808368384838,
0.08289173990488052,
0.015449047088623047,
0.11682453751564026,
-0.03272046521306038,
-0.025152435526251793,
0.03602350503206253,
-0.047656361013650894,
-0.012649794109165668,
0.016648368909955025,
0.013163427822291851,
0.12399329990148544,
-0.0022096503525972366,
0.03235051408410072,
-0.13653022050857544,
0.031423524022102356,
-0.06793295592069626,
-0.003740974934771657,
-0.03486552834510803,
-0.040637075901031494,
0.009043924510478973,
-0.06862333416938782,
0.003486064961180091,
-0.15030112862586975,
-0.15063877403736115,
0.007587034720927477,
-0.007836631499230862,
-0.04107699543237686,
-0.06370922178030014,
-0.06952770054340363,
-0.013550350442528725,
0.04251532256603241,
-0.07093454152345657,
-0.011352915316820145,
-0.06403283774852753,
0.11004766076803207,
-0.03197755664587021,
0.07921615242958069,
-0.11953279376029968,
0.08390819281339645,
-0.11260783672332764,
-0.02386913076043129,
-0.060801517218351364,
0.09317506104707718,
-0.0006014376995153725,
0.09549830108880997,
-0.006563255097717047,
-0.017931854352355003,
-0.07981178909540176,
0.06445012241601944,
-0.042872510850429535,
0.21701598167419434,
-0.0615808479487896,
-0.11181682348251343,
0.28781595826148987,
-0.052628401666879654,
-0.1370542049407959,
0.11647392809391022,
0.008682746440172195,
0.05777018144726753,
0.10703510791063309,
0.19733482599258423,
-0.015276194550096989,
0.004040541127324104,
0.09471915662288666,
0.11263324320316315,
-0.11276852339506149,
-0.033160366117954254,
0.013019153848290443,
-0.04081077128648758,
-0.10867965966463089,
0.04689536616206169,
0.09810488671064377,
0.07090286910533905,
-0.04786505550146103,
-0.03377414867281914,
-0.01366397924721241,
0.0052589005790650845,
0.08885077387094498,
-0.007157256826758385,
0.10962837189435959,
-0.05819983780384064,
-0.03796621412038803,
-0.029282379895448685,
-0.012126247398555279,
-0.03951939567923546,
0.03137664496898651,
-0.043376367539167404,
0.10821941494941711,
-0.011204327456653118,
0.06364280730485916,
-0.16185984015464783,
-0.07691477984189987,
-0.017002692446112633,
0.1581239402294159,
0.024538565427064896,
0.09859629720449448,
0.0552486926317215,
-0.040398042649030685,
-0.0012767292791977525,
0.012792680412530899,
0.15581141412258148,
-0.022091681137681007,
-0.065607450902462,
-0.052166227251291275,
0.08642971515655518,
-0.05641226842999458,
0.04504093527793884,
-0.05937713757157326,
0.012367865070700645,
0.05064384639263153,
0.10342344641685486,
-0.00018274025933351368,
0.03323284164071083,
-0.008164864964783192,
0.002145637758076191,
-0.058205123990774155,
0.007405933458358049,
0.10799351334571838,
0.00036868182360194623,
-0.07365862280130386,
0.22074243426322937,
-0.17796069383621216,
0.1765957772731781,
0.1893044263124466,
-0.299345999956131,
0.017949223518371582,
-0.10759581625461578,
-0.04561871662735939,
0.014407722279429436,
0.05567655712366104,
-0.0454222597181797,
0.1703362911939621,
-0.009871348738670349,
0.18874616920948029,
-0.04946064203977585,
-0.04464937001466751,
-0.0200483538210392,
-0.05118836089968681,
-0.0024189651012420654,
0.07781197130680084,
0.10685696452856064,
-0.13992026448249817,
0.1964332014322281,
0.1621224284172058,
0.048237916082143784,
0.19945049285888672,
0.015346456319093704,
-0.011589210480451584,
0.0909530371427536,
0.005220826715230942,
-0.058739423751831055,
-0.07409929484128952,
-0.2594851851463318,
-0.030033592134714127,
0.07992640137672424,
0.0422382652759552,
0.1212305948138237,
-0.11349532753229141,
-0.038956157863140106,
-0.01763172075152397,
-0.023146281018853188,
0.021672505885362625,
0.0914369598031044,
0.06075398623943329,
0.13201528787612915,
-0.001710098935291171,
-0.007300339173525572,
0.10524573177099228,
0.01783694699406624,
-0.09354141354560852,
0.18308524787425995,
-0.13652534782886505,
-0.37097251415252686,
-0.13911493122577667,
-0.18057456612586975,
-0.05449081212282181,
0.05712554603815079,
0.11679314076900482,
-0.12011238187551498,
-0.018752124160528183,
0.01578843593597412,
0.10931742936372757,
-0.08449502289295197,
0.0021454424131661654,
-0.06880278885364532,
0.0321490578353405,
-0.10310184955596924,
-0.09194442629814148,
-0.055416494607925415,
-0.031392451375722885,
-0.08001253753900528,
0.1423761546611786,
-0.10777941346168518,
0.04476889222860336,
0.20262959599494934,
0.04653622955083847,
0.05625178664922714,
-0.044105201959609985,
0.19377262890338898,
-0.11264272034168243,
-0.01661740615963936,
0.19215328991413116,
-0.048360925167798996,
0.07476246356964111,
0.1232115849852562,
-0.006348740309476852,
-0.08765771239995956,
0.03011748194694519,
-0.02085109055042267,
-0.07988511025905609,
-0.23219464719295502,
-0.13938382267951965,
-0.12429051846265793,
0.09477275609970093,
0.028005298227071762,
0.056365787982940674,
0.17219258844852448,
0.06577219814062119,
-0.038416244089603424,
0.006410336587578058,
0.02959546446800232,
0.08237514644861221,
0.23417828977108002,
-0.06035616248846054,
0.1364797055721283,
-0.03420931473374367,
-0.14982740581035614,
0.08169995993375778,
0.0713929831981659,
0.10213395953178406,
0.06678459793329239,
0.0804823637008667,
0.0149586396291852,
0.06188136339187622,
0.1311223804950714,
0.08191446959972382,
0.019586285576224327,
-0.02480296604335308,
-0.03388110175728798,
-0.025523077696561813,
-0.05937909707427025,
0.040128443390131,
0.06589099019765854,
-0.16763372719287872,
-0.039227183908224106,
-0.09338314831256866,
0.09657008945941925,
0.0873042419552803,
0.06609832495450974,
-0.1842060089111328,
-0.008006223477423191,
0.08488986641168594,
-0.03854905813932419,
-0.13727426528930664,
0.09535189718008041,
0.01523482333868742,
-0.15144726634025574,
0.03139317408204079,
-0.04061909019947052,
0.12188644707202911,
-0.07804752141237259,
0.09809603542089462,
-0.08108244836330414,
-0.07448557764291763,
0.02123199962079525,
0.1261177361011505,
-0.30527687072753906,
0.20240111649036407,
-0.0024993624538183212,
-0.06486981362104416,
-0.1243603527545929,
-0.0032166161108762026,
0.002410882618278265,
0.07357452809810638,
0.10519039630889893,
-0.007196315098553896,
0.001897757756523788,
-0.06300821900367737,
-0.01829923689365387,
0.032471053302288055,
0.13080233335494995,
-0.0401318334043026,
-0.021158374845981598,
-0.050194524228572845,
-0.001653497340157628,
-0.03173094615340233,
-0.06934895366430283,
0.02002747356891632,
-0.19509181380271912,
0.08751901984214783,
0.04166261479258537,
0.09648149460554123,
0.029994789510965347,
0.004265148192644119,
-0.09651939570903778,
0.24698667228221893,
-0.07148019969463348,
-0.10072879493236542,
-0.10919588059186935,
-0.046813901513814926,
0.03569883480668068,
-0.05628936365246773,
0.04309194162487984,
-0.0788632407784462,
0.028997479006648064,
-0.06352769583463669,
-0.19235502183437347,
0.12410202622413635,
-0.09027006477117538,
-0.04412810131907463,
-0.02371402643620968,
0.2110891044139862,
-0.05598580464720726,
0.010335659608244896,
0.02930437959730625,
0.01208863127976656,
-0.11645778268575668,
-0.09678568691015244,
0.031018631532788277,
-0.007351789623498917,
0.050603240728378296,
0.041841957718133926,
-0.05915454775094986,
-0.017138581722974777,
-0.052199993282556534,
-0.022926922887563705,
0.3496883809566498,
0.14231905341148376,
-0.043836336582899094,
0.19347235560417175,
0.12347975373268127,
-0.07452994585037231,
-0.3159443140029907,
-0.1066238060593605,
-0.10937739163637161,
-0.04680149629712105,
-0.07012093812227249,
-0.2002030611038208,
0.06474938243627548,
0.00662544509395957,
-0.013415241613984108,
0.12749312818050385,
-0.2561831772327423,
-0.07571036368608475,
0.15906259417533875,
-0.017980827018618584,
0.3745945692062378,
-0.1168576180934906,
-0.10926306992769241,
-0.03950892388820648,
-0.14175476133823395,
0.16968177258968353,
-0.01989765651524067,
0.11221715062856674,
-0.009765521623194218,
0.14388824999332428,
0.05548359826207161,
-0.023479344323277473,
0.08544106781482697,
0.004999885335564613,
-0.03290518373250961,
-0.10304180532693863,
-0.05676887184381485,
0.007092386484146118,
0.02477436140179634,
0.018026655539870262,
-0.041834570467472076,
0.02227151393890381,
-0.11731979995965958,
-0.04657655209302902,
-0.08982590585947037,
0.04431166127324104,
0.03899754583835602,
-0.07325074821710587,
-0.002380647463724017,
-0.07165111601352692,
-0.012272949330508709,
0.022334342822432518,
0.20356793701648712,
-0.08029330521821976,
0.16448934376239777,
0.09239562600851059,
0.12419285625219345,
-0.14376309514045715,
-0.00019283240544609725,
-0.0762530043721199,
-0.05611240118741989,
0.07737895101308823,
-0.09433035552501678,
0.058893077075481415,
0.10901971161365509,
-0.04567738622426987,
0.08828683942556381,
0.10377411544322968,
0.008936077356338501,
0.003213887568563223,
0.10916902124881744,
-0.2667325437068939,
-0.0296600554138422,
-0.07532413303852081,
0.000883326749317348,
0.09092561900615692,
0.08562852442264557,
0.18840822577476501,
0.025361526757478714,
-0.04293036088347435,
-0.002770674182102084,
0.028597986325621605,
-0.039021048694849014,
0.051667019724845886,
0.001123449532315135,
0.01947369985282421,
-0.1530752182006836,
0.072522833943367,
0.01490565575659275,
-0.15215420722961426,
0.021316176280379295,
0.16572684049606323,
-0.11656328290700912,
-0.1283872276544571,
-0.06520111113786697,
0.08313824236392975,
-0.11755692958831787,
-0.01578943058848381,
-0.03279297426342964,
-0.13145680725574493,
0.07992171496152878,
0.12629036605358124,
0.05557859688997269,
0.0972496047616005,
-0.06061713397502899,
-0.020469192415475845,
-0.018721895292401314,
-0.014099318534135818,
-0.012384648434817791,
-0.007667020428925753,
-0.055978111922740936,
0.0590752474963665,
-0.026677248999476433,
0.1425808072090149,
-0.09221141785383224,
-0.1037059873342514,
-0.16142144799232483,
0.0374140702188015,
-0.11013076454401016,
-0.08825794607400894,
-0.08821134269237518,
-0.050188567489385605,
0.002360827289521694,
-0.019856395199894905,
-0.04037635400891304,
-0.05829505994915962,
-0.12300454825162888,
0.0338277705013752,
-0.040771447122097015,
0.024727050215005875,
-0.07512269169092178,
0.015856385231018066,
0.08507686108350754,
-0.03285100311040878,
0.15655414760112762,
0.1450488418340683,
-0.1006515845656395,
0.10741901397705078,
-0.14806775748729706,
-0.09138492494821548,
0.11116421222686768,
0.015329592861235142,
0.0449691042304039,
0.09723787009716034,
0.013362943194806576,
0.0635865181684494,
0.032776717096567154,
0.05308786407113075,
0.027619892731308937,
-0.11959987878799438,
0.06483134627342224,
-0.03626115620136261,
-0.14700546860694885,
-0.049338050186634064,
-0.05282869189977646,
0.01647452637553215,
0.013054544106125832,
0.09622690081596375,
-0.05301849544048309,
0.10698331147432327,
-0.04055701196193695,
0.0346808135509491,
0.017554637044668198,
-0.1730053424835205,
-0.03816922754049301,
-0.08538098633289337,
0.03681723028421402,
0.014741539023816586,
0.25266793370246887,
0.030072299763560295,
0.012416383251547813,
0.032671261578798294,
0.08285367488861084,
0.03899408504366875,
0.010228337720036507,
0.17482228577136993,
0.1162426546216011,
-0.06621865928173065,
-0.10445023328065872,
0.0729617029428482,
0.016332454979419708,
0.01286179106682539,
0.13617953658103943,
0.008365051820874214,
0.005795429926365614,
0.08649782836437225,
-0.016865963116288185,
0.009968153201043606,
-0.10052056610584259,
-0.13426925241947174,
-0.022176474332809448,
0.05151832848787308,
-0.04655967652797699,
0.11727844923734665,
0.1406494379043579,
-0.01806013658642769,
0.03222079202532768,
-0.021771740168333054,
-0.05699979141354561,
-0.1683429479598999,
-0.1429590880870819,
-0.06883849948644638,
-0.13416796922683716,
0.00897989235818386,
-0.11180389672517776,
0.05395037308335304,
0.06001098081469536,
0.06750501692295074,
-0.06899319589138031,
0.10220931470394135,
0.04626858979463577,
-0.11440542340278625,
0.06264589726924896,
-0.0296088308095932,
0.09430401772260666,
-0.02759445086121559,
-0.019505485892295837,
-0.09039592742919922,
0.014574515633285046,
0.011419114656746387,
0.06245238706469536,
-0.04707273095846176,
0.007463190704584122,
-0.14696238934993744,
-0.08972041308879852,
-0.0523175448179245,
0.0718572810292244,
-0.050409089773893356,
0.14282815158367157,
0.00775480642914772,
-0.0170906875282526,
0.039554283022880554,
0.22787313163280487,
-0.07476283609867096,
-0.04778539761900902,
-0.05269690603017807,
0.20717895030975342,
0.02975541539490223,
0.1171872541308403,
-0.022938819602131844,
-0.006106364540755749,
-0.0919521227478981,
0.3764844834804535,
0.30030161142349243,
-0.09031439572572708,
0.011794124729931355,
0.02137952297925949,
0.04502861574292183,
0.1316293478012085,
0.1216534823179245,
0.10318691283464432,
0.3006802201271057,
-0.07452366501092911,
-0.04653361067175865,
-0.012629742734134197,
-0.023858042433857918,
-0.09059546142816544,
0.1021224707365036,
0.04839762672781944,
-0.06382183730602264,
-0.03313443064689636,
0.0954432487487793,
-0.25862133502960205,
0.1277991235256195,
-0.12311873584985733,
-0.17578600347042084,
-0.06654827296733856,
0.009760108776390553,
0.10465722531080246,
0.015642458572983742,
0.0946015790104866,
0.007128213066607714,
-0.11252258718013763,
0.06305865943431854,
0.03397420793771744,
-0.22762253880500793,
0.0006893770187161863,
0.06642123311758041,
-0.07006710022687912,
-0.0024247700348496437,
-0.026499588042497635,
0.05657242611050606,
0.0656052976846695,
0.054629553109407425,
-0.00971333310008049,
0.03816632181406021,
0.0034184439573436975,
-0.0585215799510479,
0.016623929142951965,
0.05121519789099693,
0.02472509816288948,
-0.09763528406620026,
0.06927435845136642,
-0.1574270874261856,
0.04766253009438515,
-0.0030655991286039352,
-0.04124255105853081,
0.006064958870410919,
0.008823691867291927,
-0.06491616368293762,
0.05165379121899605,
0.07916834205389023,
-0.0016257909592241049,
-0.0062433634884655476,
-0.057178743183612823,
-0.02632102556526661,
-0.027755750343203545,
-0.09291748702526093,
-0.10495562851428986,
-0.14682936668395996,
-0.11640441417694092,
0.09368976950645447,
-0.01011267676949501,
-0.1848134547472,
0.022154374048113823,
-0.08606051653623581,
0.08319322764873505,
-0.1670055389404297,
0.08040720224380493,
0.07041648775339127,
0.013038921169936657,
-0.0031511052511632442,
-0.02002427540719509,
0.054132770746946335,
0.086809903383255,
-0.10407156497240067,
-0.07400695979595184
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-cased-finetuned-chunk
This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5180
- Precision: 0.8615
- Recall: 0.9088
- F1: 0.8845
- Accuracy: 0.8239
## Model description
More information needed
## Intended uses & limitations
More information needed
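
As a rough guide, the checkpoint can be loaded through the standard `transformers` token-classification API. The snippet below is a minimal sketch and not part of the original card: only the model id comes from this repository, while the pipeline call and example sentence are illustrative.

    # Minimal sketch: load the fine-tuned checkpoint and run it on one sentence.
    # The example sentence is illustrative; only the model id comes from this card.
    from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

    model_id = "RobW/distilbert-base-cased-finetuned-chunk"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForTokenClassification.from_pretrained(model_id)

    chunker = pipeline("token-classification", model=model, tokenizer=tokenizer)
    print(chunker("Hugging Face is based in New York City."))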
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
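
Assuming the standard `transformers` `Trainer` API, these settings map onto a `TrainingArguments` object roughly as sketched below. This is an illustration rather than the author's actual script: the output directory name is assumed, and the Adam betas and epsilon listed above are the library defaults, so they need no explicit arguments.

    # Sketch of TrainingArguments matching the hyperparameters above.
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the transformers defaults.
    from transformers import TrainingArguments

    training_args = TrainingArguments(
        output_dir="distilbert-base-cased-finetuned-chunk",  # assumed name
        learning_rate=2e-5,
        per_device_train_batch_size=16,
        per_device_eval_batch_size=16,
        seed=42,
        lr_scheduler_type="linear",
        num_train_epochs=3,
    )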
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.8391 | 1.0 | 878 | 0.5871 | 0.8453 | 0.9035 | 0.8734 | 0.8054 |
| 0.6134 | 2.0 | 1756 | 0.5447 | 0.8555 | 0.8983 | 0.8764 | 0.8142 |
| 0.5565 | 3.0 | 2634 | 0.5180 | 0.8615 | 0.9088 | 0.8845 | 0.8239 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.9.1
- Tokenizers 0.10.3
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["precision", "recall", "f1", "accuracy"], "model-index": [{"name": "distilbert-base-cased-finetuned-chunk", "results": []}]} | token-classification | RobW/distilbert-base-cased-finetuned-chunk | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"token-classification",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #distilbert #token-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| distilbert-base-cased-finetuned-chunk
=====================================
This model is a fine-tuned version of distilbert-base-cased on an unknown dataset.
It achieves the following results on the evaluation set:
* Loss: 0.5180
* Precision: 0.8615
* Recall: 0.9088
* F1: 0.8845
* Accuracy: 0.8239
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.9.1
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.9.1\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #distilbert #token-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.9.1\n* Tokenizers 0.10.3"
] | [
58,
98,
4,
25
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #token-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.9.1\n* Tokenizers 0.10.3"
] | [
-0.09748740494251251,
0.0621245875954628,
-0.002328770002350211,
0.12379291653633118,
0.17716068029403687,
0.015226769261062145,
0.10663709789514542,
0.11257807165384293,
-0.10800983756780624,
0.011717256158590317,
0.12110041826963425,
0.19606934487819672,
-0.006380980834364891,
0.10547377169132233,
-0.047442756593227386,
-0.2579883933067322,
-0.014786053448915482,
0.06342053413391113,
-0.0962677001953125,
0.1353372186422348,
0.10193578153848648,
-0.14119291305541992,
0.07844668626785278,
0.01889694109559059,
-0.2417656034231186,
0.014535319991409779,
0.019169321283698082,
-0.0673656091094017,
0.15297885239124298,
0.008514338172972202,
0.13559478521347046,
-0.006059612613171339,
0.08251513540744781,
-0.15518806874752045,
0.006574315018951893,
0.045827869325876236,
0.022073324769735336,
0.08725553005933762,
0.06290805339813232,
0.010660616680979729,
0.11064885556697845,
-0.06871771067380905,
0.05227481573820114,
0.018399188295006752,
-0.11523517221212387,
-0.2504919767379761,
-0.07652144134044647,
0.02790032885968685,
0.06623603403568268,
0.1002606675028801,
0.009216380305588245,
0.1421358734369278,
-0.09632876515388489,
0.0922989770770073,
0.2295181304216385,
-0.2743314802646637,
-0.06581666320562363,
0.041814979165792465,
0.0011878873920068145,
0.05253243073821068,
-0.11260974407196045,
-0.03721911460161209,
0.05746516212821007,
0.05112650617957115,
0.14829713106155396,
-0.04235801100730896,
-0.10739035904407501,
0.014326192438602448,
-0.1499505639076233,
-0.025712041184306145,
0.12359907478094101,
0.029366113245487213,
-0.03417195379734039,
-0.027778558433055878,
-0.05692879855632782,
-0.149076446890831,
-0.041149433702230453,
-0.021093767136335373,
0.045873768627643585,
-0.03934446722269058,
-0.06412801891565323,
0.012217622250318527,
-0.10776334255933762,
-0.07057080417871475,
-0.07254904508590698,
0.17338386178016663,
0.04879831150174141,
0.012991859577596188,
-0.02378855086863041,
0.11282605677843094,
0.01748020574450493,
-0.12684525549411774,
0.03370756283402443,
0.03174880892038345,
0.009102649055421352,
-0.05392616242170334,
-0.07147666811943054,
-0.036087486892938614,
0.0054723541252315044,
0.11906732618808746,
-0.06408781558275223,
0.0407445952296257,
0.060187630355358124,
0.04432245343923569,
-0.10965482145547867,
0.20501196384429932,
-0.031108267605304718,
0.00674709677696228,
0.008824179880321026,
0.04162497818470001,
-0.006083660759031773,
0.008416402153670788,
-0.11286289989948273,
-0.00681033730506897,
0.12434683740139008,
0.012501247227191925,
-0.07987115532159805,
0.06836157292127609,
-0.051910705864429474,
-0.03219522535800934,
0.013068354688584805,
-0.09869758784770966,
0.03250856325030327,
-0.012804185971617699,
-0.08699169009923935,
-0.0003998464089818299,
0.022582795470952988,
0.00917715672403574,
-0.01747734472155571,
0.1286112368106842,
-0.08514483273029327,
0.0456603541970253,
-0.10519740730524063,
-0.09181133657693863,
0.0050925612449646,
-0.06755887717008591,
0.03659636527299881,
-0.10766354948282242,
-0.14901795983314514,
-0.008486057631671429,
0.05945216864347458,
-0.013061518780887127,
-0.06357605010271072,
-0.027283497154712677,
-0.07135790586471558,
-0.008900861255824566,
-0.016582489013671875,
0.14482299983501434,
-0.05105309188365936,
0.11529955267906189,
0.049437522888183594,
0.0628969669342041,
-0.04232664406299591,
0.0631471648812294,
-0.108350969851017,
0.009173729456961155,
-0.20459264516830444,
0.025898264721035957,
-0.051023468375205994,
0.08224307000637054,
-0.09184054285287857,
-0.12379010021686554,
0.022240163758397102,
-0.012040656991302967,
0.06391545385122299,
0.07242803275585175,
-0.14064927399158478,
-0.08186715096235275,
0.13310618698596954,
-0.0649886205792427,
-0.10713203251361847,
0.11229240894317627,
-0.057964324951171875,
0.05197561904788017,
0.07067049294710159,
0.15034820139408112,
0.08328660577535629,
-0.07114408165216446,
0.03307603299617767,
0.0025671301409602165,
0.035744793713092804,
-0.07987884432077408,
0.05873117968440056,
0.010092292912304401,
-0.023782480508089066,
0.04037824645638466,
-0.04504793509840965,
0.07408551126718521,
-0.09741529077291489,
-0.0912560448050499,
-0.04469051957130432,
-0.1023050993680954,
0.05474859103560448,
0.08065792173147202,
0.09768074005842209,
-0.08962943404912949,
-0.05896977335214615,
0.09668931365013123,
0.0803983062505722,
-0.049600448459386826,
0.026404809206724167,
-0.05163295939564705,
0.06018028035759926,
-0.043467067182064056,
-0.028211049735546112,
-0.19485202431678772,
0.002160498406738043,
0.01362228486686945,
-0.022617347538471222,
0.02528977021574974,
0.023904958739876747,
0.0711231529712677,
0.062162838876247406,
-0.05279531702399254,
-0.025810787454247475,
-0.028656646609306335,
-0.00365451886318624,
-0.13828328251838684,
-0.1783137172460556,
-0.031389422714710236,
-0.012551365420222282,
0.09772533178329468,
-0.18358981609344482,
0.03297244384884834,
-0.03723416477441788,
0.07529839873313904,
0.0003539829922374338,
-0.00016714242519810796,
-0.056643012911081314,
0.09223711490631104,
-0.029145102947950363,
-0.04854108765721321,
0.07578284293413162,
0.00030483806040138006,
-0.07560122758150101,
-0.059211842715740204,
-0.07746478170156479,
0.2008344829082489,
0.13708281517028809,
-0.14659424126148224,
-0.08619102090597153,
-0.0039182789623737335,
-0.0658901259303093,
-0.033807676285505295,
-0.038326673209667206,
0.05011441186070442,
0.17884883284568787,
-0.020691143348813057,
0.15295074880123138,
-0.06816906481981277,
-0.055138230323791504,
0.019768429920077324,
-0.03904498368501663,
0.03926838934421539,
0.10921215265989304,
0.11960218101739883,
-0.07831807434558868,
0.1415238231420517,
0.17824603617191315,
-0.11423984169960022,
0.11174419522285461,
-0.05136987566947937,
-0.06992965191602707,
-0.012920805253088474,
-0.015302974730730057,
-0.001276148366741836,
0.08482927083969116,
-0.12264303863048553,
-0.00018489078502170742,
0.020141933113336563,
0.022428272292017937,
0.020251834765076637,
-0.2336679995059967,
-0.038412906229496,
0.030630500987172127,
-0.026925403624773026,
0.01974981278181076,
-0.00972253829240799,
0.010947010479867458,
0.09991703927516937,
0.00041320122545585036,
-0.10452339053153992,
0.04463791847229004,
0.013857416808605194,
-0.06850873678922653,
0.2163342982530594,
-0.08076361566781998,
-0.139410600066185,
-0.1239200308918953,
-0.07555034756660461,
-0.04311129078269005,
0.013670503161847591,
0.06022842973470688,
-0.09189750254154205,
-0.026114974170923233,
-0.034307826310396194,
0.024205783382058144,
-0.008759944699704647,
0.04993492737412453,
0.00538324099034071,
0.00908674392849207,
0.08585135638713837,
-0.10559753328561783,
-0.007053261622786522,
-0.05250529572367668,
-0.07179553806781769,
0.048896800726652145,
0.04354804381728172,
0.10301294177770615,
0.15769973397254944,
-0.03008957765996456,
0.0017445071134716272,
-0.023555580526590347,
0.23326490819454193,
-0.059680160135030746,
-0.035626765340566635,
0.1531476229429245,
0.002292410470545292,
0.05904151126742363,
0.0934138372540474,
0.07839331775903702,
-0.08588723838329315,
0.010564789175987244,
0.027077745646238327,
-0.0331038162112236,
-0.20969036221504211,
-0.04704653471708298,
-0.053855009377002716,
-0.03575555607676506,
0.10409669578075409,
0.029835350811481476,
0.05830590799450874,
0.07610438019037247,
0.04863475263118744,
0.10040396451950073,
-0.05558908358216286,
0.056468185037374496,
0.12607482075691223,
0.05003507435321808,
0.12203323841094971,
-0.03962644934654236,
-0.0885484367609024,
0.027742484584450722,
-0.01124604418873787,
0.22668302059173584,
0.00819235760718584,
0.10379194468259811,
0.057423174381256104,
0.19651514291763306,
-0.0010464373044669628,
0.08819577097892761,
-0.005549105815589428,
-0.047949377447366714,
-0.012551830150187016,
-0.0375552736222744,
-0.03451281785964966,
0.013294768519699574,
-0.060567714273929596,
0.06255891174077988,
-0.11192061752080917,
-0.020928995683789253,
0.050432343035936356,
0.25694695115089417,
0.010317683219909668,
-0.322571337223053,
-0.08883577585220337,
-0.010505160316824913,
-0.039538733661174774,
-0.016875242814421654,
0.022920621559023857,
0.07074668258428574,
-0.09866979718208313,
0.017213009297847748,
-0.07272075861692429,
0.09102649986743927,
-0.0318012498319149,
0.04925057664513588,
0.08186791837215424,
0.09573791921138763,
0.024677637964487076,
0.08339667320251465,
-0.3202778100967407,
0.2663644254207611,
-0.003495546290650964,
0.07203143835067749,
-0.07083406299352646,
0.0031465061474591494,
0.041226521134376526,
0.06881500780582428,
0.04834563657641411,
-0.012805365025997162,
-0.023762207478284836,
-0.2201365977525711,
-0.038626838475465775,
0.024371769279241562,
0.08571549504995346,
-0.027073530480265617,
0.08010858297348022,
-0.03398662433028221,
0.007929956540465355,
0.0785847082734108,
-0.03325904905796051,
-0.04823150858283043,
-0.07523799687623978,
-0.013112572021782398,
0.015199967660009861,
-0.04259229451417923,
-0.06458943337202072,
-0.1115223690867424,
-0.13567502796649933,
0.14690746366977692,
-0.007290101144462824,
-0.0319603867828846,
-0.11626619100570679,
0.07612589746713638,
0.0868523120880127,
-0.088581383228302,
0.05830267444252968,
0.0015015691751614213,
0.05598000809550285,
0.037263430655002594,
-0.07626165449619293,
0.104589082300663,
-0.0659913495182991,
-0.1558116227388382,
-0.05073745176196098,
0.09838775545358658,
0.028816770762205124,
0.059184134006500244,
-0.01369356457144022,
0.008081423118710518,
-0.04628303274512291,
-0.09725455194711685,
0.018324825912714005,
-0.029343044385313988,
0.07281719893217087,
0.014667041599750519,
-0.059498827904462814,
0.02198038436472416,
-0.06301196664571762,
-0.02839483506977558,
0.1715797632932663,
0.2261125147342682,
-0.09832288324832916,
0.0011852653697133064,
0.03722207248210907,
-0.057808395475149155,
-0.19046616554260254,
0.03604062274098396,
0.07025525718927383,
-0.00693690637126565,
0.04017326980829239,
-0.168450728058815,
0.15186277031898499,
0.10802844166755676,
-0.016118014231324196,
0.1032819151878357,
-0.3267216384410858,
-0.1264406144618988,
0.1321590095758438,
0.1572181135416031,
0.12639842927455902,
-0.13482727110385895,
-0.015102621167898178,
-0.01206201035529375,
-0.11758033186197281,
0.08572611212730408,
-0.05563094839453697,
0.1199432834982872,
-0.042167749255895615,
0.08985520154237747,
0.001330407103523612,
-0.06019499897956848,
0.11503507196903229,
0.029537338763475418,
0.10635513812303543,
-0.052635569125413895,
-0.036494020372629166,
0.0240364708006382,
-0.03093099035322666,
0.011396263726055622,
-0.056721605360507965,
0.03552879020571709,
-0.08001740276813507,
-0.016622738912701607,
-0.08755543828010559,
0.05880739167332649,
-0.026111561805009842,
-0.06596668064594269,
-0.04068295657634735,
0.026430413126945496,
0.032618217170238495,
-0.017846785485744476,
0.1259516477584839,
0.042082589119672775,
0.1440386176109314,
0.11406289786100388,
0.04536569491028786,
-0.06588853150606155,
-0.0907878652215004,
-0.02492492087185383,
-0.012126359157264233,
0.07032336294651031,
-0.1258334219455719,
0.028521450236439705,
0.14596058428287506,
0.022132014855742455,
0.12015242874622345,
0.08636397868394852,
-0.010166977532207966,
0.001844161655753851,
0.061527542769908905,
-0.16134800016880035,
-0.06316830962896347,
-0.008551400154829025,
-0.05036051198840141,
-0.09036967158317566,
0.06658940762281418,
0.07473237812519073,
-0.0789686068892479,
-0.014549926854670048,
-0.006177859380841255,
-0.013070537708699703,
-0.07161556929349899,
0.2132110446691513,
0.06773719936609268,
0.04638729989528656,
-0.10410190373659134,
0.06556485593318939,
0.0604637935757637,
-0.06522326916456223,
-0.02255774475634098,
0.06556647270917892,
-0.08622117340564728,
-0.039476633071899414,
0.11457822471857071,
0.1613141894340515,
-0.08209259063005447,
-0.03977438807487488,
-0.14071156084537506,
-0.12331845611333847,
0.08084224164485931,
0.14178039133548737,
0.13346488773822784,
0.015588453970849514,
-0.06280628591775894,
0.016352195292711258,
-0.12257662415504456,
0.07441724091768265,
0.040174297988414764,
0.08357774466276169,
-0.15922565758228302,
0.1736304759979248,
0.012183936312794685,
0.05786391347646713,
-0.024797653779387474,
0.025738073512911797,
-0.09683940559625626,
0.019342800602316856,
-0.10502203553915024,
-0.03862627223134041,
-0.02001609466969967,
0.00778706930577755,
0.00034062875784002244,
-0.0661587193608284,
-0.04758184030652046,
0.025690730661153793,
-0.12529583275318146,
-0.02030937373638153,
0.037857986986637115,
0.05826665833592415,
-0.10650474578142166,
-0.04743684083223343,
0.02921111136674881,
-0.05695246160030365,
0.05599816516041756,
0.050199273973703384,
0.01806068606674671,
0.0629344955086708,
-0.12687180936336517,
-0.008531888946890831,
0.08503133803606033,
0.013570336624979973,
0.07043685764074326,
-0.08934063464403152,
-0.005780115257948637,
0.004021066706627607,
0.07329340279102325,
0.010681673884391785,
0.06402569264173508,
-0.15790370106697083,
-0.014357347972691059,
-0.038292210549116135,
-0.0896049439907074,
-0.07080668210983276,
0.009841213002800941,
0.08722896873950958,
0.012546642683446407,
0.1977144479751587,
-0.0740882009267807,
0.03906017541885376,
-0.20889045298099518,
-0.0035088444128632545,
-0.022954506799578667,
-0.12129764258861542,
-0.1344294399023056,
-0.06236698478460312,
0.05442062392830849,
-0.04522605612874031,
0.12638717889785767,
0.02707398310303688,
0.04578050225973129,
0.028255663812160492,
-0.025343576446175575,
0.018823329359292984,
0.0212265495210886,
0.22058549523353577,
0.03496319428086281,
-0.027430633082985878,
0.0735759511590004,
0.05993971228599548,
0.09550993889570236,
0.10713460296392441,
0.17861057817935944,
0.16227997839450836,
-0.03963743522763252,
0.08217285573482513,
0.021735459566116333,
-0.037786293774843216,
-0.18000449240207672,
0.0305295679718256,
-0.03954079747200012,
0.08813755959272385,
-0.01926475763320923,
0.21587614715099335,
0.06257585436105728,
-0.16947749257087708,
0.05142687261104584,
-0.0445619635283947,
-0.08312008529901505,
-0.09502755850553513,
-0.02629459835588932,
-0.07744286954402924,
-0.14635303616523743,
0.0023593995720148087,
-0.09418297559022903,
0.009729543700814247,
0.11075056344270706,
0.004480066243559122,
-0.031883951276540756,
0.17278653383255005,
0.030288835987448692,
0.02754291146993637,
0.050990018993616104,
-0.003966802731156349,
-0.03412233293056488,
-0.10339794307947159,
-0.0645434632897377,
-0.03388646990060806,
-0.015408397652208805,
0.03315345570445061,
-0.07290720194578171,
-0.08581385016441345,
0.027443567290902138,
-0.021337123587727547,
-0.09079086780548096,
0.021319711580872536,
0.015909258276224136,
0.0596063956618309,
0.03236890584230423,
-0.0010407939553260803,
0.01676006242632866,
-0.02073223888874054,
0.21042722463607788,
-0.08080518245697021,
-0.08745136111974716,
-0.08380011469125748,
0.294209748506546,
0.05084957927465439,
-0.007580716162919998,
0.0349327027797699,
-0.051310330629348755,
-0.007114895619452,
0.25682297348976135,
0.17290186882019043,
-0.08580762892961502,
-0.009791755117475986,
0.0039046029560267925,
-0.017351705580949783,
-0.026999086141586304,
0.12282361090183258,
0.1439877599477768,
0.044579703360795975,
-0.10315662622451782,
-0.04072229564189911,
-0.06319583207368851,
-0.007513438351452351,
-0.0634513646364212,
0.0503317192196846,
0.04207272082567215,
0.0018866104073822498,
-0.041908491402864456,
0.049103736877441406,
-0.056899260729551315,
-0.0903431624174118,
0.07293872535228729,
-0.1789207011461258,
-0.1586916595697403,
-0.012108477763831615,
0.11479000002145767,
-0.002068763365969062,
0.06227242946624756,
-0.034433379769325256,
0.003998058382421732,
0.055176421999931335,
-0.02328285202383995,
-0.08397500216960907,
-0.08044593036174774,
0.10823686420917511,
-0.08769939094781876,
0.18895798921585083,
-0.039669331163167953,
0.08103376626968384,
0.11676274240016937,
0.07080376148223877,
-0.07216363400220871,
0.0595620796084404,
0.03618750721216202,
-0.090363509953022,
0.033114295452833176,
0.08788712322711945,
-0.020605839788913727,
0.04455513879656792,
0.023891229182481766,
-0.13002225756645203,
0.019446372985839844,
-0.06378189474344254,
-0.043255340307950974,
-0.045988183468580246,
-0.05149173364043236,
-0.05204867199063301,
0.12009231746196747,
0.2158593386411667,
-0.02376033365726471,
0.0032804280053824186,
-0.0838228166103363,
0.017463775351643562,
0.06467742472887039,
0.0033619110472500324,
-0.08754108101129532,
-0.2167998105287552,
0.009734256193041801,
0.04955880716443062,
-0.03579019010066986,
-0.1880049705505371,
-0.10135629028081894,
0.0023868680000305176,
-0.0827258825302124,
-0.09604674577713013,
0.07737613469362259,
0.05721953511238098,
0.05472111329436302,
-0.047718461602926254,
-0.08163908123970032,
-0.09735549986362457,
0.14516405761241913,
-0.1573438048362732,
-0.08978547155857086
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# deberta-base-mnli-finetuned-cola
This model is a fine-tuned version of [microsoft/deberta-base-mnli](https://huggingface.co/microsoft/deberta-base-mnli) on the CoLA task of the GLUE dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8205
- Matthews Correlation: 0.6282
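For reference, the checkpoint can be loaded for inference roughly as follows. This is a minimal sketch: the example sentence is illustrative and the label interpretation is an assumption, not taken from this card.

    from transformers import AutoTokenizer, AutoModelForSequenceClassification
    import torch

    tokenizer = AutoTokenizer.from_pretrained("Roberta55/deberta-base-mnli-finetuned-cola")
    model = AutoModelForSequenceClassification.from_pretrained("Roberta55/deberta-base-mnli-finetuned-cola")

    # CoLA is a binary acceptability task; the sentence below is illustrative.
    inputs = tokenizer("The book was written by the author.", return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    predicted_class = logits.argmax(dim=-1).item()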
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a code sketch reconstructing them follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
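As a sketch, the settings above map onto `TrainingArguments` as shown below. The output directory name is an assumption, and the Adam betas/epsilon listed above are the library defaults, so they are not passed explicitly.

    from transformers import TrainingArguments

    # Reconstruction of the hyperparameters listed above (sketch only).
    training_args = TrainingArguments(
        output_dir="deberta-base-mnli-finetuned-cola",  # assumed name
        learning_rate=2e-05,
        per_device_train_batch_size=16,
        per_device_eval_batch_size=16,
        seed=42,
        lr_scheduler_type="linear",
        num_train_epochs=5,
    )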
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.4713 | 1.0 | 535 | 0.5110 | 0.5797 |
| 0.2678 | 2.0 | 1070 | 0.6648 | 0.5154 |
| 0.1811 | 3.0 | 1605 | 0.6681 | 0.6121 |
| 0.113 | 4.0 | 2140 | 0.8205 | 0.6282 |
| 0.0831 | 5.0 | 2675 | 1.0413 | 0.6057 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.9.0+cu111
- Datasets 1.14.0
- Tokenizers 0.10.3
| {"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["matthews_correlation"], "model-index": [{"name": "deberta-base-mnli-finetuned-cola", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "cola"}, "metrics": [{"type": "matthews_correlation", "value": 0.6281691768918801, "name": "Matthews Correlation"}]}]}]} | text-classification | Roberta55/deberta-base-mnli-finetuned-cola | [
"transformers",
"pytorch",
"tensorboard",
"deberta",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:mit",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #deberta #text-classification #generated_from_trainer #dataset-glue #license-mit #model-index #autotrain_compatible #endpoints_compatible #region-us
| deberta-base-mnli-finetuned-cola
================================
This model is a fine-tuned version of microsoft/deberta-base-mnli on the CoLA task of the GLUE dataset.
It achieves the following results on the evaluation set:
* Loss: 0.8205
* Matthews Correlation: 0.6282
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.11.3
* Pytorch 1.9.0+cu111
* Datasets 1.14.0
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #deberta #text-classification #generated_from_trainer #dataset-glue #license-mit #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] | [
64,
98,
4,
34
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #deberta #text-classification #generated_from_trainer #dataset-glue #license-mit #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] | [
-0.09831836074590683,
0.0745333805680275,
-0.0019468548707664013,
0.11811316758394241,
0.18282398581504822,
0.04341888800263405,
0.1358741670846939,
0.11761824786663055,
-0.08077634871006012,
0.022195948287844658,
0.12428750842809677,
0.1781395822763443,
0.010487406514585018,
0.1138959527015686,
-0.05339307710528374,
-0.28571897745132446,
-0.018744099885225296,
0.05850134417414665,
-0.07061664015054703,
0.12503281235694885,
0.10272610187530518,
-0.1287475824356079,
0.09424647688865662,
0.0001925505930557847,
-0.21888728439807892,
0.013815918937325478,
0.010503812693059444,
-0.05163988098502159,
0.14906080067157745,
0.03850190341472626,
0.1358039528131485,
0.01342026237398386,
0.08919663727283478,
-0.18151000142097473,
0.014193251729011536,
0.04589175060391426,
-0.004782284144312143,
0.08608260005712509,
0.047499146312475204,
-0.0001441979402443394,
0.13329382240772247,
-0.09220436960458755,
0.0643293485045433,
0.015252863056957722,
-0.12764988839626312,
-0.21027833223342896,
-0.06287135183811188,
0.031445760279893875,
0.0683513656258583,
0.10461581498384476,
-0.01450082566589117,
0.1235136091709137,
-0.08513899892568588,
0.09467263519763947,
0.19524376094341278,
-0.2806968092918396,
-0.06307128816843033,
0.06012386828660965,
0.01978045329451561,
0.05325552448630333,
-0.10414233803749084,
-0.024556245654821396,
0.060814376920461655,
0.04888811707496643,
0.11990498006343842,
-0.03156920149922371,
-0.0796445906162262,
0.015210594050586224,
-0.13694240152835846,
-0.00952517706900835,
0.15541109442710876,
0.0330301970243454,
-0.025029102340340614,
-0.05758160352706909,
-0.04565590247511864,
-0.131911501288414,
-0.03072621487081051,
-0.028929637745022774,
0.04667572304606438,
-0.03475199267268181,
-0.07694379985332489,
-0.005575047805905342,
-0.11443056166172028,
-0.06395100802183151,
-0.07982868701219559,
0.13377739489078522,
0.03400937467813492,
0.0010204523568972945,
-0.040178172290325165,
0.10829780995845795,
-0.010880670510232449,
-0.1262456625699997,
0.0246768519282341,
0.024547550827264786,
0.007412073202431202,
-0.062322039157152176,
-0.061538003385066986,
-0.0766511932015419,
0.0038742159958928823,
0.11835554987192154,
-0.054061610251665115,
0.03713314235210419,
0.052926741540431976,
0.043746646493673325,
-0.07045478373765945,
0.18917614221572876,
-0.05042562261223793,
-0.024096736684441566,
0.0006246967823244631,
0.043983642011880875,
0.010392715223133564,
-0.014895642176270485,
-0.13200470805168152,
-0.0022541615180671215,
0.10016144812107086,
0.0028101294301450253,
-0.0671951025724411,
0.07882754504680634,
-0.05428537353873253,
-0.031181497499346733,
0.0020450660958886147,
-0.08473312109708786,
0.026596857234835625,
-0.0025920418556779623,
-0.07931998372077942,
-0.017712073400616646,
0.0161721333861351,
0.013974455185234547,
-0.010787433944642544,
0.13237296044826508,
-0.10030947625637054,
0.02614540606737137,
-0.09653612971305847,
-0.1295878291130066,
0.014855464920401573,
-0.10074873268604279,
0.03669019415974617,
-0.09936582297086716,
-0.17542408406734467,
-0.025666292756795883,
0.043639786541461945,
-0.029730038717389107,
-0.05404958128929138,
-0.06136539578437805,
-0.0704907774925232,
0.008260887116193771,
-0.012788524851202965,
0.1041492447257042,
-0.05972396582365036,
0.09760365635156631,
0.034392476081848145,
0.05708170309662819,
-0.06057248264551163,
0.05912093445658684,
-0.09266070276498795,
0.0032745427452027798,
-0.16841307282447815,
0.047285836189985275,
-0.04010959714651108,
0.07139284908771515,
-0.07313759624958038,
-0.10464532673358917,
0.011167136952280998,
0.01727728731930256,
0.06674925982952118,
0.09394557029008865,
-0.18540410697460175,
-0.09248721599578857,
0.14918167889118195,
-0.073256716132164,
-0.1190381869673729,
0.11831291764974594,
-0.07312742620706558,
0.07098239660263062,
0.07891193777322769,
0.18374191224575043,
0.0724988728761673,
-0.08319684118032455,
-0.0009543810156174004,
0.014014121145009995,
0.04335625469684601,
-0.06415369361639023,
0.060656048357486725,
0.008521389216184616,
0.03211122378706932,
0.02657855860888958,
-0.020320964977145195,
0.05846383422613144,
-0.10001871734857559,
-0.08727055788040161,
-0.022835111245512962,
-0.08597094565629959,
0.04706409201025963,
0.08519723266363144,
0.07784302532672882,
-0.09946262836456299,
-0.08067724853754044,
0.09427744895219803,
0.0793711468577385,
-0.06503663212060928,
0.01491212286055088,
-0.0566786490380764,
0.06573612987995148,
-0.03228631615638733,
-0.02858172543346882,
-0.17400851845741272,
-0.047696553170681,
-0.002830210840329528,
0.02251434326171875,
0.0261716116219759,
0.046764373779296875,
0.06829749792814255,
0.058128006756305695,
-0.058623239398002625,
-0.009617914445698261,
-0.03111172467470169,
0.006863822229206562,
-0.13242429494857788,
-0.20664221048355103,
-0.023843519389629364,
-0.026693085208535194,
0.15128815174102783,
-0.22835654020309448,
0.04904574155807495,
-0.00541048776358366,
0.07832508534193039,
0.015973005443811417,
-0.005124777089804411,
-0.04992052912712097,
0.08058245480060577,
-0.04189362749457359,
-0.049081627279520035,
0.0721953734755516,
0.009669997729361057,
-0.0949767455458641,
-0.051234614104032516,
-0.11893045157194138,
0.17628826200962067,
0.13693767786026,
-0.12744706869125366,
-0.0955742821097374,
-0.012529593892395496,
-0.05537141114473343,
-0.029576819390058517,
-0.044899601489305496,
0.013801917433738708,
0.18216153979301453,
-0.013189427554607391,
0.15082718431949615,
-0.06296005100011826,
-0.04611099883913994,
0.017124786972999573,
-0.031347040086984634,
0.010596364736557007,
0.1251574456691742,
0.12287209928035736,
-0.08841241896152496,
0.15760648250579834,
0.1301731914281845,
-0.08963077515363693,
0.16266463696956635,
-0.02681565470993519,
-0.07052195072174072,
-0.024514788761734962,
-0.039380140602588654,
-0.011174967512488365,
0.11411765217781067,
-0.14714528620243073,
-0.010697774589061737,
0.014874630607664585,
0.012090766802430153,
0.030134078115224838,
-0.21706680953502655,
-0.04726112633943558,
0.0373261533677578,
-0.03092573583126068,
-0.019664401188492775,
-0.00674098264425993,
0.0028707582969218493,
0.10753762722015381,
0.011647479608654976,
-0.08331722021102905,
0.0385982021689415,
0.004929439164698124,
-0.08481082320213318,
0.22623273730278015,
-0.07124137133359909,
-0.15662717819213867,
-0.11871713399887085,
-0.06055368855595589,
-0.04209316894412041,
0.011229414492845535,
0.061397891491651535,
-0.09605064988136292,
-0.026589153334498405,
-0.05656132847070694,
0.04237309843301773,
-0.013314279727637768,
0.02245662733912468,
-0.005268138367682695,
0.0061923423781991005,
0.050053685903549194,
-0.11135336756706238,
-0.013125193305313587,
-0.06849803775548935,
-0.049250658601522446,
0.044618215411901474,
0.023289989680051804,
0.1154731810092926,
0.15654586255550385,
-0.022760499268770218,
0.008570635691285133,
-0.04107411950826645,
0.2415260225534439,
-0.07456948608160019,
-0.02897130325436592,
0.12754298746585846,
-0.0134602515026927,
0.03794620931148529,
0.1212843805551529,
0.0769951194524765,
-0.08610657602548599,
0.00680325785651803,
0.04427667707204819,
-0.033246904611587524,
-0.2231389433145523,
-0.0515185222029686,
-0.04983598738908768,
0.004733922891318798,
0.08285710960626602,
0.023145833984017372,
0.03807085379958153,
0.06969893723726273,
0.04296955093741417,
0.07981221377849579,
-0.031159384176135063,
0.05783533677458763,
0.13718944787979126,
0.03164568915963173,
0.13412220776081085,
-0.044420212507247925,
-0.08228697627782822,
0.03710951656103134,
-0.014501956291496754,
0.20898082852363586,
0.008356520906090736,
0.12280793488025665,
0.05574849620461464,
0.1303512006998062,
-0.006091390736401081,
0.06641500443220139,
-0.0028146514669060707,
-0.0487511120736599,
-0.014216562733054161,
-0.03281346336007118,
-0.022567318752408028,
0.023646896705031395,
-0.04751097783446312,
0.04074771702289581,
-0.13638103008270264,
0.0002694575523491949,
0.05782865360379219,
0.20031648874282837,
0.05288703739643097,
-0.33586758375167847,
-0.09744369983673096,
0.004575724247843027,
-0.018097300082445145,
-0.019490478560328484,
0.01676524430513382,
0.09451159834861755,
-0.08633538335561752,
0.03562898561358452,
-0.06445219367742538,
0.0902571827173233,
-0.07324381917715073,
0.05747359246015549,
0.0846761167049408,
0.09728770703077316,
-0.005682291463017464,
0.08481139689683914,
-0.3075083792209625,
0.2643335461616516,
0.0072878506034612656,
0.06927860528230667,
-0.07318570464849472,
-0.00763305276632309,
0.029458582401275635,
0.0713030993938446,
0.07063056528568268,
-0.02028956077992916,
-0.012401631101965904,
-0.20137494802474976,
-0.04239880293607712,
0.029792632907629013,
0.08635558933019638,
-0.03254791721701622,
0.09035289287567139,
-0.0225304514169693,
0.018733272328972816,
0.08614721149206161,
-0.00822510477155447,
-0.05934687703847885,
-0.10266255587339401,
-0.004759819246828556,
0.012927685864269733,
-0.058321740478277206,
-0.05171646922826767,
-0.12483152002096176,
-0.12313004583120346,
0.15716056525707245,
-0.035308487713336945,
-0.03899216279387474,
-0.10892747342586517,
0.10554449260234833,
0.05169150233268738,
-0.08976014703512192,
0.030072925612330437,
0.017091955989599228,
0.07174312323331833,
0.0238626878708601,
-0.07106798887252808,
0.11492475867271423,
-0.06137226149439812,
-0.14540992677211761,
-0.06556358933448792,
0.09140390157699585,
0.04402763396501541,
0.05957525223493576,
-0.007532901130616665,
0.005599104333668947,
-0.038059622049331665,
-0.08503369987010956,
0.023401090875267982,
-0.013710787519812584,
0.0702681839466095,
0.020580027252435684,
-0.07150189578533173,
0.01949799247086048,
-0.06082100793719292,
-0.02612455002963543,
0.1962059885263443,
0.23261092603206635,
-0.09659883379936218,
-0.00016250631597358733,
0.030576055869460106,
-0.0729188546538353,
-0.20894524455070496,
0.06130542233586311,
0.050392698496580124,
0.018162330612540245,
0.04243959113955498,
-0.18363068997859955,
0.12482032179832458,
0.10767465829849243,
-0.0070218974724411964,
0.1086713895201683,
-0.30505263805389404,
-0.13407857716083527,
0.1389763504266739,
0.13948766887187958,
0.12073540687561035,
-0.1406734734773636,
-0.0150979682803154,
-0.02970057725906372,
-0.12592382729053497,
0.1283644586801529,
-0.10273588448762894,
0.12182413041591644,
-0.02682649716734886,
0.0836588516831398,
0.010400506667792797,
-0.05546623095870018,
0.11532680690288544,
0.03232602775096893,
0.10142478346824646,
-0.06384405493736267,
-0.053622301667928696,
0.043615177273750305,
-0.03430231660604477,
0.026003051549196243,
-0.07744529098272324,
0.019302116706967354,
-0.09949538111686707,
-0.037966351956129074,
-0.0723746120929718,
0.044758763164281845,
-0.045636676251888275,
-0.07869488000869751,
-0.03877754881978035,
0.03595864772796631,
0.031987328082323074,
-0.017788512632250786,
0.13133002817630768,
0.008167886175215244,
0.15223713219165802,
0.08573273569345474,
0.08743929862976074,
-0.056901365518569946,
-0.059985119849443436,
-0.009247704409062862,
-0.018035374581813812,
0.054932333528995514,
-0.132851704955101,
0.018920980393886566,
0.15230068564414978,
0.018870549276471138,
0.1535940021276474,
0.09413124620914459,
-0.028561072424054146,
0.002475755289196968,
0.0683705061674118,
-0.15493351221084595,
-0.09888232499361038,
-0.01498396322131157,
-0.10521384328603745,
-0.11366540193557739,
0.0589388869702816,
0.10250002145767212,
-0.06750539690256119,
-0.00836600549519062,
-0.0055329161696136,
-0.0020461909007281065,
-0.05905354395508766,
0.19458919763565063,
0.07506593316793442,
0.03862927854061127,
-0.09449799358844757,
0.0644536092877388,
0.043859317898750305,
-0.07544375211000443,
0.010101661086082458,
0.06872367113828659,
-0.07518847286701202,
-0.05125100165605545,
0.056864626705646515,
0.2101065218448639,
-0.08022661507129669,
-0.040200814604759216,
-0.1510055661201477,
-0.12636785209178925,
0.07352110743522644,
0.15101763606071472,
0.11802973598241806,
0.01825861819088459,
-0.058985043317079544,
0.015098455362021923,
-0.13306143879890442,
0.09423814713954926,
0.043382517993450165,
0.06937398761510849,
-0.1484927535057068,
0.16946932673454285,
0.003658625530079007,
0.0437813475728035,
-0.02771926485002041,
0.030107568949460983,
-0.12415869534015656,
0.0048272195272147655,
-0.1027492880821228,
-0.030102264136075974,
-0.03315448388457298,
0.008643011562526226,
0.001205315813422203,
-0.04647447168827057,
-0.05432994291186333,
0.002743721706792712,
-0.10624080151319504,
-0.0189503226429224,
0.04228844493627548,
0.06335259228944778,
-0.11071540415287018,
-0.04020817577838898,
0.009866987355053425,
-0.056762635707855225,
0.07977277785539627,
0.035757750272750854,
0.024764221161603928,
0.05807563662528992,
-0.1404813975095749,
0.02723328210413456,
0.06914918124675751,
0.012955078855156898,
0.07027621567249298,
-0.07850315421819687,
-0.005024655722081661,
-0.011504781432449818,
0.058263130486011505,
0.027816206216812134,
0.07041201740503311,
-0.1288212537765503,
0.008612778969109058,
-0.025386376306414604,
-0.08670078963041306,
-0.07068871706724167,
0.04098910093307495,
0.07351307570934296,
0.023931453004479408,
0.1941685676574707,
-0.08149249851703644,
0.041586983948946,
-0.21218115091323853,
0.0050720470026135445,
-0.004039945546537638,
-0.10491462051868439,
-0.08410712331533432,
-0.07129407674074173,
0.06833059340715408,
-0.06240154430270195,
0.15346097946166992,
0.05307622626423836,
0.03016989305615425,
0.0324445478618145,
-0.013660570606589317,
0.01921425200998783,
0.019882122054696083,
0.20209574699401855,
0.02346903085708618,
-0.04093461111187935,
0.03466091677546501,
0.06380860507488251,
0.10528084635734558,
0.1242525652050972,
0.20352371037006378,
0.14191670715808868,
-0.03088534064590931,
0.08696859329938889,
0.055081285536289215,
-0.06116291508078575,
-0.15190225839614868,
0.03862141817808151,
-0.02743898518383503,
0.08908165991306305,
-0.022434934973716736,
0.19180098176002502,
0.07317378371953964,
-0.1679425984621048,
0.03924943506717682,
-0.0629807785153389,
-0.09767285734415054,
-0.11938324570655823,
-0.03600331395864487,
-0.08109116554260254,
-0.1284555196762085,
0.0035981987603008747,
-0.1227990910410881,
-0.007360442075878382,
0.11061938107013702,
0.00826315674930811,
-0.03689949959516525,
0.15504467487335205,
0.018752533942461014,
0.024217737838625908,
0.0624917633831501,
0.011695836670696735,
-0.025571122765541077,
-0.12471040338277817,
-0.05068664252758026,
-0.02451586164534092,
-0.009069088846445084,
0.02205030433833599,
-0.07228643447160721,
-0.05500096455216408,
0.029397813603281975,
-0.020202672109007835,
-0.09762889891862869,
0.01130878645926714,
0.032112207263708115,
0.058444589376449585,
0.05033528059720993,
0.0024067105259746313,
0.005356614477932453,
-0.009427796117961407,
0.20416614413261414,
-0.06602322310209274,
-0.06401281803846359,
-0.09894431382417679,
0.2580423057079315,
0.04855429008603096,
0.0001799154415493831,
0.030495161190629005,
-0.08104229718446732,
0.009766782633960247,
0.23080286383628845,
0.23436188697814941,
-0.09682992100715637,
0.0001176772711914964,
0.015287910588085651,
-0.00412550987675786,
-0.013091846369206905,
0.1052141785621643,
0.11689259111881256,
0.07174423336982727,
-0.09033243358135223,
-0.036425866186618805,
-0.05083870515227318,
-0.007178311236202717,
-0.021029651165008545,
0.06594768166542053,
0.06711789965629578,
0.010538051836192608,
-0.039564330130815506,
0.06412743777036667,
-0.07422538101673126,
-0.09088671207427979,
0.06782707571983337,
-0.22261878848075867,
-0.16115215420722961,
-0.017698848620057106,
0.08757283538579941,
-0.0066207293421030045,
0.07742860168218613,
-0.03264506533741951,
-0.004671839997172356,
0.06810488551855087,
-0.017245950177311897,
-0.10196615010499954,
-0.05971462279558182,
0.0874791294336319,
-0.08400937914848328,
0.19749924540519714,
-0.05592946335673332,
0.07139667868614197,
0.13322634994983673,
0.06987307965755463,
-0.057747095823287964,
0.06776653975248337,
0.03640521690249443,
-0.04697293043136597,
0.033775653690099716,
0.07486297935247421,
-0.04305264353752136,
0.05559656396508217,
0.051458001136779785,
-0.16074395179748535,
0.027444686740636826,
-0.06791695952415466,
-0.07646162807941437,
-0.05395076796412468,
-0.02141639031469822,
-0.06138094514608383,
0.12843497097492218,
0.22253815829753876,
-0.019248925149440765,
-0.0023968161549419165,
-0.062022604048252106,
0.005892243701964617,
0.07064711302518845,
0.039419859647750854,
-0.06456545740365982,
-0.21971464157104492,
0.012907097116112709,
0.05590832605957985,
-0.01891474239528179,
-0.2675029933452606,
-0.08466469496488571,
-0.000601159583311528,
-0.07069454342126846,
-0.09086751192808151,
0.08384028822183609,
0.09060352295637131,
0.05345035716891289,
-0.06311272084712982,
-0.04870813712477684,
-0.07953064143657684,
0.1533832550048828,
-0.14993037283420563,
-0.10364595800638199
] |
null | null | transformers |
# Mikoto Jinba DialoGPT Model | {"tags": ["conversational"]} | text-generation | RobinMari/DialoGPT-small-mikoto | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Mikoto Jinba DialoGPT Model | [
"# Mikoto Jinba DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Mikoto Jinba DialoGPT Model"
] | [
51,
10
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Mikoto Jinba DialoGPT Model"
] | [
0.005013268440961838,
0.02107866480946541,
-0.005706805735826492,
0.0016309097409248352,
0.1669316440820694,
-0.004758101888000965,
0.19027763605117798,
0.12647640705108643,
-0.012222811579704285,
-0.023154379799962044,
0.07364284992218018,
0.16258615255355835,
0.05070917680859566,
0.12349360436201096,
-0.06637296080589294,
-0.35943934321403503,
0.08126898854970932,
0.06867870688438416,
0.03306363895535469,
0.11012745648622513,
0.11334601789712906,
-0.04103482514619827,
0.0852557122707367,
0.013047032058238983,
-0.14844292402267456,
0.017163272947072983,
-0.006911692675203085,
-0.16200001537799835,
0.0716022476553917,
0.06602219492197037,
0.012759053148329258,
0.010110906325280666,
-0.05462947115302086,
-0.12108143419027328,
0.03438880294561386,
-0.03453927859663963,
-0.04717906191945076,
0.02699855901300907,
0.033269383013248444,
-0.056761838495731354,
0.15468543767929077,
0.11077631264925003,
-0.05308498442173004,
0.04385431110858917,
-0.15122410655021667,
0.04595212638378143,
0.01442202739417553,
0.08620515465736389,
0.12181445211172104,
0.10458134114742279,
-0.055415574461221695,
0.0705302283167839,
-0.07502732425928116,
0.11470529437065125,
0.0962543711066246,
-0.328421413898468,
-0.0342511385679245,
0.13791660964488983,
0.04038163274526596,
0.058624524623155594,
-0.05199890956282616,
0.06703319400548935,
0.029912961646914482,
-0.008752239868044853,
-0.07653086632490158,
-0.0931219607591629,
-0.041163209825754166,
0.007840985432267189,
-0.07302449643611908,
0.02779744751751423,
0.2761317491531372,
-0.04212510585784912,
0.05877428129315376,
-0.06510836631059647,
-0.07017000019550323,
0.03419638052582741,
-0.052295755594968796,
-0.02807849645614624,
-0.10884745419025421,
0.06473226845264435,
0.011672040447592735,
-0.070258229970932,
-0.09418696910142899,
-0.034376807510852814,
-0.16052678227424622,
0.22107161581516266,
0.05230170488357544,
0.016236837953329086,
-0.23482905328273773,
0.06090162694454193,
0.0038461717776954174,
-0.11131474375724792,
0.0017847655108198524,
-0.09190847724676132,
0.050584252923727036,
0.02534627541899681,
0.0017690450185909867,
-0.0791969820857048,
0.0679844319820404,
0.051677439361810684,
-0.0021948039066046476,
0.019207341596484184,
0.016962949186563492,
0.05549333989620209,
0.07124358415603638,
0.04045167937874794,
-0.03194349631667137,
-0.04705903306603432,
0.017598481848835945,
-0.07766634225845337,
-0.014426251873373985,
-0.06746254861354828,
-0.1696263700723648,
-0.04351397976279259,
0.01770072430372238,
0.049149248749017715,
-0.003566539380699396,
0.1353033035993576,
0.015034079551696777,
-0.05554747208952904,
0.018849942833185196,
-0.05081301927566528,
-0.021696429699659348,
-0.008306840434670448,
-0.03429114818572998,
0.13364985585212708,
-0.025767063722014427,
0.04043005779385567,
-0.12924149632453918,
0.012237386777997017,
-0.03686048462986946,
0.016887575387954712,
-0.03677963465452194,
-0.0304730087518692,
0.0009813320357352495,
-0.018978243693709373,
0.027655862271785736,
-0.1621260792016983,
-0.1455567628145218,
-0.00034957562456838787,
-0.02621842920780182,
-0.07032456994056702,
-0.09126126766204834,
-0.09772983938455582,
-0.03637942299246788,
0.033899251371622086,
-0.06815729290246964,
-0.012723399326205254,
-0.06065298616886139,
0.08353497087955475,
0.0019362197490409017,
0.09369658678770065,
-0.034619592130184174,
0.06946422904729843,
-0.07871175557374954,
-0.02481130138039589,
-0.08444646745920181,
0.08986041694879532,
0.006391784641891718,
0.04951650649309158,
-0.030383726581931114,
0.0053721945732831955,
-0.05825023353099823,
0.06791584938764572,
-0.026048703119158745,
0.2403467446565628,
-0.05978955328464508,
-0.10498322546482086,
0.31137439608573914,
-0.07273392379283905,
-0.1165609285235405,
0.1062953993678093,
0.014534679241478443,
0.13100409507751465,
0.09899035096168518,
0.17225657403469086,
-0.056701306253671646,
0.023008476942777634,
0.06214311346411705,
0.06779705733060837,
-0.07537709176540375,
0.0018054379615932703,
0.03683869168162346,
0.01582568883895874,
-0.10627809911966324,
0.055591415613889694,
0.12005209922790527,
0.06485167890787125,
-0.061677463352680206,
-0.029771672561764717,
0.014631930738687515,
0.002712133340537548,
0.05978195741772652,
-0.031123464927077293,
0.14695967733860016,
-0.018769562244415283,
-0.021428067237138748,
-0.012515285983681679,
0.03811362385749817,
-0.053006600588560104,
0.01549413800239563,
-0.09009435027837753,
0.06299317628145218,
-0.0398278646171093,
0.07857891172170639,
-0.09930133074522018,
0.01418168842792511,
-0.01489313691854477,
0.1177854835987091,
0.033802784979343414,
0.049114882946014404,
0.04262214154005051,
-0.0266434196382761,
-0.026557790115475655,
0.036890316754579544,
0.13939335942268372,
-0.016617564484477043,
-0.0676809698343277,
-0.11069217324256897,
0.10095637291669846,
-0.04427587613463402,
0.05037637799978256,
-0.035550788044929504,
0.013586902990937233,
-0.017051836475729942,
0.12184308469295502,
-0.031458113342523575,
0.05360521376132965,
0.06321989744901657,
0.015392037108540535,
-0.07313334196805954,
0.04407015070319176,
0.07805608212947845,
0.002861805260181427,
-0.08812637627124786,
0.28856223821640015,
-0.17889469861984253,
0.07766999304294586,
0.16737669706344604,
-0.18376116454601288,
0.009547318331897259,
-0.11779991537332535,
-0.027088984847068787,
-0.016513921320438385,
0.060306087136268616,
0.008934082463383675,
0.1702859103679657,
-0.04268188402056694,
0.18562349677085876,
-0.06105346977710724,
-0.008495676331222057,
0.008556602522730827,
-0.08316468447446823,
0.0003848693158943206,
0.10847064852714539,
0.13328447937965393,
-0.1873675435781479,
0.1810930371284485,
0.042231444269418716,
0.029883265495300293,
0.24073882400989532,
0.03521324321627617,
-0.01638229563832283,
0.03865445405244827,
0.01435778010636568,
-0.03883030638098717,
-0.025020238012075424,
-0.29704031348228455,
-0.029788566753268242,
0.07397293299436569,
0.040164366364479065,
0.10874972492456436,
-0.05860208719968796,
-0.03714229539036751,
-0.0288563035428524,
-0.015511056408286095,
0.049146998673677444,
0.1274583637714386,
0.020450297743082047,
0.13018347322940826,
-0.02644275315105915,
-0.03085833415389061,
0.04371395707130432,
0.02297096885740757,
-0.07318566739559174,
0.16626586019992828,
-0.10596992820501328,
-0.32314491271972656,
-0.08020485937595367,
-0.1999388039112091,
-0.0902317613363266,
0.03332099691033363,
0.09395179897546768,
-0.14156131446361542,
-0.02001631259918213,
-0.0038269483484327793,
0.0895465686917305,
-0.08170764893293381,
-0.003442430403083563,
-0.020174231380224228,
0.030336063355207443,
-0.15462948381900787,
-0.06507683545351028,
-0.050018079578876495,
-0.0373244434595108,
-0.04584198817610741,
0.14524799585342407,
-0.16173216700553894,
0.05165569484233856,
0.20536531507968903,
0.06830661743879318,
0.05562204867601395,
-0.044035159051418304,
0.18840853869915009,
-0.13431844115257263,
0.024343518540263176,
0.1959291398525238,
-0.015105176717042923,
0.04544652998447418,
0.14865967631340027,
-0.024401914328336716,
-0.06203281134366989,
0.04361450672149658,
-0.026067592203617096,
-0.06497839093208313,
-0.21120807528495789,
-0.09503999352455139,
-0.13348743319511414,
0.11808883398771286,
-0.005282657220959663,
0.03648269176483154,
0.16739827394485474,
0.08897780627012253,
-0.043741703033447266,
0.03168845921754837,
0.05930609628558159,
0.09780370444059372,
0.14099164307117462,
-0.04424985498189926,
0.12928283214569092,
-0.0025606031995266676,
-0.15890657901763916,
0.07366343587636948,
0.0604984387755394,
0.09267658740282059,
0.036897316575050354,
0.04545780271291733,
0.03783227503299713,
0.07856830954551697,
0.1539045125246048,
0.04568659886717796,
-0.001128842355683446,
-0.025836676359176636,
-0.03997133672237396,
-0.05423770844936371,
-0.004737595561891794,
0.06388511508703232,
0.01820952445268631,
-0.12862873077392578,
-0.048373542726039886,
-0.014307205565273762,
0.07452692836523056,
0.07784419506788254,
0.07610300183296204,
-0.12824903428554535,
-0.03324189409613609,
0.06282175332307816,
-0.018867939710617065,
-0.11916571855545044,
0.08251864463090897,
0.0495583713054657,
-0.16009783744812012,
0.04179001599550247,
-0.006315981969237328,
0.12242073565721512,
-0.031051605939865112,
0.05847432464361191,
-0.11855262517929077,
-0.05066421627998352,
-0.013278061524033546,
0.11884501576423645,
-0.3225339353084564,
0.2128235399723053,
-0.00001931654696818441,
-0.013352777808904648,
-0.12024910748004913,
-0.006681376602500677,
0.03837849572300911,
0.07330767065286636,
0.10364345461130142,
-0.004481344949454069,
-0.002101205289363861,
-0.03140442818403244,
-0.06916403770446777,
0.034490834921598434,
0.11974062025547028,
-0.028473615646362305,
0.0010030484991148114,
-0.017104240134358406,
0.0001534606417408213,
-0.05363815277814865,
-0.07669465243816376,
-0.04873781278729439,
-0.16310349106788635,
0.09850642830133438,
0.09654099494218826,
0.08947011083364487,
0.05394697189331055,
-0.02646235190331936,
-0.03128713369369507,
0.21561461687088013,
0.01691918633878231,
-0.1015307754278183,
-0.09612950682640076,
-0.03226657211780548,
0.05612003430724144,
-0.08323324471712112,
-0.007612396962940693,
-0.08720581978559494,
0.04645345360040665,
-0.03993736580014229,
-0.1675255447626114,
0.07796388119459152,
-0.09788317233324051,
-0.06186770647764206,
-0.013338427059352398,
0.16792429983615875,
-0.021881630644202232,
-0.0007297380943782628,
0.04612085595726967,
-0.027687331661581993,
-0.08688494563102722,
-0.08345885574817657,
-0.009795294143259525,
0.04352322593331337,
0.04100121557712555,
0.03836610168218613,
-0.08409780263900757,
-0.0494852289557457,
-0.1226331889629364,
-0.08713990449905396,
0.2600093483924866,
0.1516132652759552,
-0.0034262544941157103,
0.15139761567115784,
0.1469014286994934,
-0.05793773755431175,
-0.2524647116661072,
-0.15738633275032043,
-0.08008581399917603,
0.0016095406608656049,
-0.10966762155294418,
-0.17068415880203247,
0.052712250500917435,
-0.03665781766176224,
-0.018479246646165848,
0.07183938473463058,
-0.28609129786491394,
-0.10826360434293747,
0.11883596330881119,
-0.0005746126407757401,
0.419332891702652,
-0.1142549067735672,
-0.07038495689630508,
-0.025716321542859077,
-0.1501908004283905,
0.08410056680440903,
0.019737185910344124,
0.13639678061008453,
-0.015523290261626244,
0.16195344924926758,
0.047384023666381836,
0.0057066031731665134,
0.10086551308631897,
0.011720603331923485,
-0.06607107073068619,
-0.11769457161426544,
-0.09127718210220337,
-0.03174596279859543,
0.017512623220682144,
0.034011632204055786,
-0.05481958016753197,
0.016362037509679794,
-0.1511979103088379,
-0.06917540729045868,
-0.07300826907157898,
0.014336757361888885,
0.026482058688998222,
-0.09187116473913193,
-0.006213055457919836,
-0.005438349209725857,
0.004192968364804983,
0.02064923383295536,
0.15525168180465698,
-0.0876566469669342,
0.12979644536972046,
0.07176273316144943,
0.08915156871080399,
-0.10965225100517273,
0.04844316467642784,
-0.06675674766302109,
-0.05055015906691551,
0.09209320694208145,
-0.09545359760522842,
0.02025431953370571,
0.07773809134960175,
-0.04279448837041855,
0.09627597779035568,
0.08187835663557053,
-0.027143318206071854,
0.037316471338272095,
0.0922837108373642,
-0.23056311905384064,
-0.037369903177022934,
-0.08278379589319229,
0.04013180360198021,
0.12176385521888733,
0.08718632161617279,
0.19091075658798218,
-0.04663906246423721,
-0.04541885480284691,
-0.0023075309582054615,
0.03441651538014412,
-0.03771081939339638,
0.05505135655403137,
-0.039050858467817307,
0.011893522925674915,
-0.13427768647670746,
0.06618428230285645,
0.01624605618417263,
-0.07717563956975937,
0.054268497973680496,
0.14775188267230988,
-0.11310018599033356,
-0.11072434484958649,
-0.042561180889606476,
0.14464010298252106,
-0.08121422678232193,
-0.024639852344989777,
-0.061411477625370026,
-0.13546940684318542,
0.050292398780584335,
0.0888979360461235,
0.036118898540735245,
0.025597160682082176,
-0.05244122073054314,
-0.018409887328743935,
-0.070687435567379,
-0.0008760899654589593,
0.07784391194581985,
-0.03018234483897686,
-0.05674010142683983,
0.06408814340829849,
-0.005472853314131498,
0.15851467847824097,
-0.09405231475830078,
-0.11223967373371124,
-0.17311573028564453,
0.039581626653671265,
-0.06803598254919052,
-0.06761027872562408,
-0.15593640506267548,
-0.08262655884027481,
-0.0306436475366354,
-0.04030270129442215,
-0.04710077866911888,
-0.04202417656779289,
-0.09772731363773346,
0.04345831274986267,
-0.05663294345140457,
0.04249737411737442,
-0.028861911967396736,
0.019000964239239693,
0.05236645042896271,
-0.03634175285696983,
0.13299287855625153,
0.17243631184101105,
-0.1143503412604332,
0.068991519510746,
-0.12083922326564789,
-0.02423243783414364,
0.08943117409944534,
0.011381404474377632,
0.06132740154862404,
0.05262153595685959,
0.013398216105997562,
0.05406324565410614,
0.06486406922340393,
0.04109349846839905,
0.03483957052230835,
-0.08927183598279953,
0.026791399344801903,
-0.07376155257225037,
-0.13091941177845,
-0.05856728553771973,
0.008389926515519619,
0.025402778759598732,
0.03217913955450058,
0.056969158351421356,
-0.0765686184167862,
0.10508241504430771,
-0.02132199890911579,
0.03399970009922981,
0.01672958955168724,
-0.16287285089492798,
0.005757811479270458,
-0.10759187489748001,
0.04526769369840622,
-0.0004792088584508747,
0.18147248029708862,
0.05874921381473541,
-0.04947318509221077,
0.008515020832419395,
-0.05296362191438675,
0.051759786903858185,
0.007240360602736473,
0.19667948782444,
0.11695869266986847,
-0.033809904009103775,
-0.07499296963214874,
0.09043209999799728,
0.012529995292425156,
0.0569637306034565,
0.07606323063373566,
0.010105753317475319,
0.028994513675570488,
0.10077185928821564,
-0.02470310777425766,
0.008981043472886086,
-0.08815009891986847,
-0.1604979783296585,
-0.0828692615032196,
0.05595887079834938,
-0.036772821098566055,
0.12045522034168243,
0.18878671526908875,
-0.0266268327832222,
0.0343402624130249,
-0.015198001638054848,
-0.07699563354253769,
-0.17516382038593292,
-0.17663879692554474,
-0.08926830440759659,
-0.13611292839050293,
0.024979326874017715,
-0.1360032558441162,
0.03847423568367958,
0.02074059098958969,
0.08750762790441513,
-0.0667174831032753,
0.058180008083581924,
0.08482557535171509,
-0.10854148119688034,
0.10551771521568298,
-0.03338636830449104,
0.07710976898670197,
-0.04525567963719368,
-0.010646085254848003,
-0.08631300926208496,
0.02365477941930294,
0.004755814094096422,
0.05705665424466133,
-0.07245107740163803,
0.016968630254268646,
-0.10768704861402512,
-0.09633703529834747,
-0.033572662621736526,
0.03677740693092346,
0.034373003989458084,
0.14101965725421906,
0.026536429300904274,
-0.03610721603035927,
0.013878059573471546,
0.24505525827407837,
-0.03162747249007225,
-0.06775873154401779,
-0.08731240034103394,
0.17600467801094055,
0.02344752661883831,
0.07251334190368652,
-0.030049456283450127,
0.006292673293501139,
-0.10759357362985611,
0.3690715730190277,
0.2499813735485077,
-0.08217545598745346,
-0.004188341088593006,
-0.004051709547638893,
0.04758559539914131,
0.10996618866920471,
0.1169026792049408,
0.10677609592676163,
0.3045928180217743,
-0.05365293100476265,
0.011290269903838634,
-0.038185447454452515,
-0.036881424486637115,
-0.09136564284563065,
0.07107996195554733,
0.0837133526802063,
-0.09588208794593811,
-0.02836696431040764,
0.10496829450130463,
-0.2720545828342438,
0.12901151180267334,
-0.14945398271083832,
-0.16542355716228485,
-0.09080757945775986,
0.002764798467978835,
0.08649516105651855,
0.061829861253499985,
0.0886174812912941,
-0.022380996495485306,
-0.034435730427503586,
0.06352412700653076,
0.053631436079740524,
-0.14019325375556946,
0.0033238043542951345,
0.07075759768486023,
-0.0389041006565094,
-0.02495095506310463,
-0.0206721443682909,
0.05072277784347534,
0.06876052170991898,
0.03759034350514412,
-0.006262303330004215,
0.04570862278342247,
0.000735999085009098,
-0.041493505239486694,
0.03711915388703346,
0.08199596405029297,
0.010740414261817932,
-0.11921966820955276,
0.0829244926571846,
-0.1322340965270996,
0.03782595321536064,
-0.009942223317921162,
-0.016497502103447914,
-0.0221017487347126,
0.07685984671115875,
-0.07162453979253769,
0.06898321211338043,
0.12777602672576904,
-0.026572169736027718,
-0.025341784581542015,
-0.024866485968232155,
0.02624279260635376,
-0.04369059205055237,
-0.10594789683818817,
-0.10036662966012955,
-0.1637468934059143,
-0.12657195329666138,
0.043707214295864105,
0.024467013776302338,
-0.199724018573761,
0.025598421692848206,
-0.14481323957443237,
0.04917251691222191,
-0.14991331100463867,
0.09918449074029922,
0.08368255198001862,
0.011040206998586655,
0.0018352132756263018,
-0.10528965294361115,
0.04714902490377426,
0.07380139082670212,
-0.120976023375988,
-0.10280036926269531
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Rocketknight1/bert-base-cased-finetuned-swag
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.8709
- Train Accuracy: 0.6465
- Validation Loss: 0.6167
- Validation Accuracy: 0.7590
- Epoch: 0
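A minimal inference sketch for this checkpoint is shown below; the prompt and candidate endings are illustrative assumptions, not examples from the training data.

    import tensorflow as tf
    from transformers import AutoTokenizer, TFAutoModelForMultipleChoice

    model_id = "Rocketknight1/bert-base-cased-finetuned-swag"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = TFAutoModelForMultipleChoice.from_pretrained(model_id)

    prompt = "A man sits down at the piano."  # illustrative SWAG-style context
    endings = ["He begins to play a melody.", "He begins to eat the piano."]

    # Multiple-choice models expect inputs of shape (batch, num_choices, seq_len),
    # so encode the prompt against every candidate ending, then add a batch axis.
    enc = tokenizer([prompt] * len(endings), endings, padding=True, return_tensors="tf")
    batch = {k: tf.expand_dims(v, 0) for k, v in enc.items()}
    logits = model(batch).logits  # shape (1, num_choices)
    best_ending = int(tf.argmax(logits, axis=-1)[0])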
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (the optimizer is reconstructed as a code sketch after this list):
- optimizer: {'name': 'Adam', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5e-05, 'decay_steps': 9192, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
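The optimizer configuration above can be rebuilt in Keras roughly as follows; this sketch uses only the values listed in this card.

    import tensorflow as tf

    # Polynomial decay from 5e-05 to 0.0 over 9192 steps, as configured above.
    lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
        initial_learning_rate=5e-05,
        decay_steps=9192,
        end_learning_rate=0.0,
        power=1.0,
        cycle=False,
    )
    optimizer = tf.keras.optimizers.Adam(
        learning_rate=lr_schedule,
        beta_1=0.9,
        beta_2=0.999,
        epsilon=1e-08,
        amsgrad=False,
    )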
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.8709 | 0.6465 | 0.6167 | 0.7590 | 0 |
### Framework versions
- Transformers 4.21.0.dev0
- TensorFlow 2.9.1
- Datasets 2.3.3.dev0
- Tokenizers 0.11.0
| {"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "Rocketknight1/bert-base-cased-finetuned-swag", "results": []}]} | multiple-choice | Rocketknight1/bert-base-cased-finetuned-swag | [
"transformers",
"tf",
"tensorboard",
"bert",
"multiple-choice",
"generated_from_keras_callback",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #tensorboard #bert #multiple-choice #generated_from_keras_callback #license-apache-2.0 #endpoints_compatible #region-us
| Rocketknight1/bert-base-cased-finetuned-swag
============================================
This model is a fine-tuned version of bert-base-cased on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 0.8709
* Train Accuracy: 0.6465
* Validation Loss: 0.6167
* Validation Accuracy: 0.7590
* Epoch: 0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'Adam', 'learning\_rate': {'class\_name': 'PolynomialDecay', 'config': {'initial\_learning\_rate': 5e-05, 'decay\_steps': 9192, 'end\_learning\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.21.0.dev0
* TensorFlow 2.9.1
* Datasets 2.3.3.dev0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 5e-05, 'decay\\_steps': 9192, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.3.3.dev0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #tf #tensorboard #bert #multiple-choice #generated_from_keras_callback #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 5e-05, 'decay\\_steps': 9192, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.3.3.dev0\n* Tokenizers 0.11.0"
] | [
51,
178,
4,
37
] | [
"passage: TAGS\n#transformers #tf #tensorboard #bert #multiple-choice #generated_from_keras_callback #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 5e-05, 'decay\\_steps': 9192, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.3.3.dev0\n* Tokenizers 0.11.0"
] | [
-0.06934039294719696,
0.022483794018626213,
-0.005754392594099045,
0.060452722012996674,
0.14990651607513428,
0.04995405673980713,
0.14502853155136108,
0.10870028287172318,
-0.08420946449041367,
0.11679313331842422,
0.1460534930229187,
0.15087106823921204,
0.06468415260314941,
0.08851107954978943,
-0.06868099421262741,
-0.16307315230369568,
0.07143863290548325,
-0.029102278873324394,
-0.10204991698265076,
0.0792197659611702,
0.0774196907877922,
-0.0767492726445198,
0.07389916479587555,
-0.017166703939437866,
-0.09981285780668259,
0.017440050840377808,
0.07818080484867096,
-0.0627184808254242,
0.11935026198625565,
0.06926622241735458,
0.09545600414276123,
-0.0042683701030910015,
0.010577014647424221,
-0.20251701772212982,
0.013315883465111256,
0.12213883548974991,
0.00020553040667437017,
0.06454639136791229,
0.007023711688816547,
-0.0013719516573473811,
0.11425886303186417,
-0.08503466844558716,
0.06270825862884521,
0.036790668964385986,
-0.12948578596115112,
-0.28348156809806824,
-0.10929983854293823,
0.018946001306176186,
0.06798125803470612,
0.0777585357427597,
-0.0025205169804394245,
0.16193664073944092,
-0.04088116064667702,
0.0895179808139801,
0.1655164659023285,
-0.28073248267173767,
-0.05789833143353462,
0.028142103925347328,
0.0284461360424757,
0.04321395978331566,
-0.059332266449928284,
0.01735524833202362,
0.03222045302391052,
0.04700274020433426,
0.0221080519258976,
-0.00918868463486433,
0.0047650267370045185,
-0.03486970439553261,
-0.07921623438596725,
-0.059751685708761215,
0.12618808448314667,
0.037247758358716965,
-0.06800953298807144,
-0.05850225314497948,
-0.02657175250351429,
-0.13656236231327057,
0.005970348138362169,
-0.02270497940480709,
0.0025860662572085857,
0.002242223359644413,
-0.04256194084882736,
-0.023547708988189697,
-0.06771674752235413,
-0.05712272226810455,
0.00680864742025733,
0.1534755825996399,
0.035649269819259644,
0.050283268094062805,
-0.013250849209725857,
0.06579486280679703,
-0.000030279816201073118,
-0.11882182955741882,
-0.012779943645000458,
-0.004738916177302599,
-0.061801280826330185,
-0.017734818160533905,
-0.084601491689682,
-0.035554856061935425,
0.06905748695135117,
0.12848339974880219,
-0.04610271006822586,
0.1180080771446228,
-0.015472394414246082,
0.026453351601958275,
-0.10854886472225189,
0.10722638666629791,
-0.03564261272549629,
0.05498155578970909,
-0.006401051301509142,
0.0877780020236969,
0.048945002257823944,
-0.04512016475200653,
-0.047680098563432693,
0.04761175438761711,
0.0924241840839386,
0.041180532425642014,
-0.04765276610851288,
0.07601936161518097,
-0.07154436409473419,
-0.007186048664152622,
-0.04367538169026375,
-0.10515549033880234,
0.037538137286901474,
0.04587343707680702,
-0.081380695104599,
0.04306729882955551,
0.07112841308116913,
-0.011536099947988987,
-0.05538863688707352,
0.03657066076993942,
-0.06461986899375916,
-0.028969276696443558,
-0.10197946429252625,
-0.13807937502861023,
0.037931278347969055,
-0.064743272960186,
-0.016129666939377785,
-0.06774316728115082,
-0.13934935629367828,
-0.034162163734436035,
0.08634459227323532,
-0.03693407401442528,
-0.017534123733639717,
-0.06040014326572418,
-0.17316783964633942,
0.04231058433651924,
0.008731026202440262,
0.13426944613456726,
-0.040040869265794754,
0.08436450362205505,
-0.006232211831957102,
0.04650997370481491,
-0.02878682129085064,
0.026651963591575623,
-0.021917186677455902,
0.04403895512223244,
-0.16236960887908936,
0.07432062178850174,
-0.08718140423297882,
0.03891916945576668,
-0.14103785157203674,
-0.09240412712097168,
0.05131585896015167,
0.01890632137656212,
0.11833834648132324,
0.09309910237789154,
-0.14821581542491913,
-0.06646478176116943,
0.09543488174676895,
-0.0898868590593338,
-0.0933600440621376,
0.07963186502456665,
-0.05546914041042328,
0.025792153552174568,
0.07633624970912933,
0.04936997964978218,
0.050355181097984314,
-0.1216476783156395,
0.024826429784297943,
-0.04813947528600693,
0.01718727871775627,
0.05221598595380783,
0.025314874947071075,
-0.02391115203499794,
-0.10062278807163239,
0.016495009884238243,
-0.028405429795384407,
0.017231913283467293,
-0.07550438493490219,
-0.059930212795734406,
-0.029211796820163727,
-0.06277411431074142,
0.03228636085987091,
0.03691747039556503,
0.031282566487789154,
-0.10101045668125153,
-0.15050707757472992,
0.07611441612243652,
0.05309710651636124,
-0.07629206031560898,
0.02110426314175129,
-0.07531697303056717,
0.03824460133910179,
0.03304838016629219,
0.018901625648140907,
-0.15923888981342316,
-0.0815121978521347,
0.023471562191843987,
-0.020500293001532555,
0.0030621022451668978,
-0.004404348321259022,
0.07282917201519012,
0.030905047431588173,
-0.047352343797683716,
0.0034433845430612564,
-0.037971131503582,
0.017875662073493004,
-0.057210057973861694,
-0.24100112915039062,
-0.036871060729026794,
-0.02160816080868244,
0.08569826930761337,
-0.27900975942611694,
0.011852359399199486,
0.07289595156908035,
0.11161468923091888,
0.017466243356466293,
-0.025036996230483055,
-0.04089457169175148,
0.05831724405288696,
-0.020902184769511223,
-0.06297057867050171,
0.03207101672887802,
0.027062812820076942,
-0.11718857288360596,
-0.054498638957738876,
-0.17122861742973328,
0.09474213421344757,
0.1255151331424713,
-0.09906837344169617,
-0.13504378497600555,
0.07872094213962555,
-0.03144506737589836,
-0.029231615364551544,
-0.007428601384162903,
-0.012962122447788715,
0.13501809537410736,
0.04524736851453781,
0.12010695040225983,
-0.046936459839344025,
-0.020981263369321823,
0.046744443476200104,
-0.0155284209176898,
-0.028136087581515312,
0.1311732679605484,
-0.018150337040424347,
-0.07595229148864746,
0.08011966198682785,
0.07730380445718765,
-0.11068765074014664,
0.08296984434127808,
-0.059376198798418045,
-0.050835251808166504,
-0.08374158293008804,
0.07316556572914124,
0.06624515354633331,
0.10261055082082748,
-0.1076425313949585,
0.011536727659404278,
0.006527731195092201,
0.004031019750982523,
-0.026428651064634323,
-0.2066642940044403,
0.0006572315469384193,
0.004781605675816536,
-0.05291493237018585,
0.01349664106965065,
-0.0006509305676445365,
0.015090350061655045,
0.11177865415811539,
0.026307683438062668,
-0.02674386277794838,
0.07175326347351074,
-0.03438040614128113,
-0.08646674454212189,
0.2354399710893631,
-0.126138836145401,
-0.12613306939601898,
-0.13020071387290955,
-0.022258782759308815,
-0.05345899239182472,
-0.009410343132913113,
-0.00041570141911506653,
-0.08679128438234329,
-0.05918291211128235,
-0.05688698589801788,
0.0014479635283350945,
-0.03178752213716507,
0.03408973664045334,
0.03582744300365448,
-0.0004009305266663432,
0.14508399367332458,
-0.10717781633138657,
-0.026349054649472237,
-0.012507332488894463,
-0.07196290791034698,
0.010615408420562744,
0.026760423555970192,
-0.0028844864573329687,
0.11368072032928467,
0.00379032245837152,
0.02324552647769451,
-0.03381174057722092,
0.22880755364894867,
-0.04452243819832802,
-0.012909064069390297,
0.133269265294075,
-0.03036295250058174,
0.07014492154121399,
0.10977872461080551,
0.06250496208667755,
-0.11139683425426483,
0.04406687617301941,
0.07469785958528519,
-0.024828242138028145,
-0.24693959951400757,
0.004534127656370401,
-0.03354378417134285,
-0.10086750239133835,
0.059384219348430634,
0.037503838539123535,
0.12675631046295166,
0.02464757114648819,
0.000527360534761101,
0.09582687169313431,
0.041496507823467255,
0.06753560155630112,
0.17005622386932373,
0.04796764254570007,
0.08907444030046463,
-0.039606042206287384,
-0.004548320081084967,
0.020718708634376526,
-0.023893948644399643,
0.21328440308570862,
0.014415420591831207,
0.07659044116735458,
0.08634798973798752,
0.07424254715442657,
-0.0428321473300457,
0.025409026071429253,
0.003172861644998193,
-0.005844432860612869,
0.018897607922554016,
-0.06705719232559204,
-0.04109140485525131,
0.04461520537734032,
-0.018058480694890022,
0.08125623315572739,
-0.0973462462425232,
0.0012420706916600466,
0.05415372550487518,
0.24922555685043335,
0.09934601187705994,
-0.2874661386013031,
-0.12486428022384644,
0.0031257399823516607,
-0.023770730942487717,
-0.048795513808727264,
-0.010848181322216988,
0.09351131319999695,
-0.07661715149879456,
0.07663121819496155,
-0.07658788561820984,
0.04775969311594963,
-0.042827557772397995,
0.04563242569565773,
0.12141598016023636,
0.1167164072394371,
0.011876992881298065,
0.02832762897014618,
-0.3301880657672882,
0.27977484464645386,
0.025617515668272972,
0.14526930451393127,
-0.08084633201360703,
0.04540591314435005,
0.0298045352101326,
-0.05490746721625328,
0.060513049364089966,
-0.01170679833739996,
-0.12004043161869049,
-0.20815235376358032,
-0.03563566505908966,
0.0028254373464733362,
0.14124636352062225,
0.016435656696558,
0.1144610270857811,
-0.04336610808968544,
0.01449583750218153,
0.0740446075797081,
-0.01861627772450447,
-0.15305443108081818,
-0.0648699402809143,
0.05406613275408745,
0.01834401860833168,
-0.04462532326579094,
-0.09044408798217773,
-0.09640702605247498,
-0.06832848489284515,
0.16625504195690155,
-0.15893997251987457,
-0.05053413659334183,
-0.12336495518684387,
0.08383321762084961,
0.10326901823282242,
-0.05633489787578583,
0.02596757560968399,
-0.013347757048904896,
0.0782209038734436,
0.033155154436826706,
-0.08211741596460342,
0.12040726095438004,
-0.02259577065706253,
-0.23106637597084045,
-0.05337798595428467,
0.11428500711917877,
0.049783818423748016,
0.032926205545663834,
-0.024266794323921204,
0.06265965104103088,
0.03966405987739563,
-0.11293672025203705,
0.08561282604932785,
0.010333603248000145,
0.01889389008283615,
0.04582320153713226,
-0.019465168938040733,
0.02732164040207863,
-0.048471689224243164,
0.01372662652283907,
0.08012840896844864,
0.30890339612960815,
-0.06859194487333298,
0.013139765709638596,
0.028014246374368668,
-0.07981417328119278,
-0.1986064463853836,
0.06090829521417618,
0.09119697660207748,
0.00925920158624649,
-0.057124581187963486,
-0.2012026309967041,
0.05425753444433212,
0.07875175774097443,
-0.016937216743826866,
0.09520076960325241,
-0.30486273765563965,
-0.14380216598510742,
0.07818299531936646,
0.1498357355594635,
0.12208016961812973,
-0.1787579357624054,
-0.05321129038929939,
-0.0517713688313961,
-0.030256176367402077,
0.14934933185577393,
-0.07101363688707352,
0.10633182525634766,
0.030106190592050552,
0.02971489727497101,
0.011641941033303738,
-0.042585909366607666,
0.15700596570968628,
-0.0501137301325798,
0.09648510813713074,
-0.03981378301978111,
-0.05417494475841522,
0.12587210536003113,
-0.07456705719232559,
0.01874019391834736,
-0.052642516791820526,
0.019039040431380272,
-0.09686914086341858,
-0.004250194877386093,
-0.09805497527122498,
0.04018633812665939,
-0.07097218930721283,
-0.004934784024953842,
-0.01451411098241806,
0.054838743060827255,
0.06910490244626999,
-0.014097734354436398,
0.10837415605783463,
0.002027581911534071,
0.1683172583580017,
0.1525193452835083,
0.07122558355331421,
0.04787793383002281,
-0.040293071419000626,
0.07617903500795364,
-0.027116263285279274,
0.07874304056167603,
-0.14884960651397705,
0.039126940071582794,
0.12778395414352417,
0.007166460622102022,
0.1471896916627884,
0.07428953051567078,
-0.09545308351516724,
0.02464955300092697,
0.05193600058555603,
-0.12980473041534424,
-0.13272474706172943,
0.021237853914499283,
0.026512056589126587,
-0.09319069981575012,
0.030708249658346176,
0.14620879292488098,
-0.04395613446831703,
0.02785538136959076,
0.004716777242720127,
0.03808487951755524,
-0.07235130667686462,
0.13573040068149567,
0.024031296372413635,
0.07767089456319809,
-0.07946057617664337,
0.11513115465641022,
0.053191788494586945,
-0.11551598459482193,
0.10142095386981964,
0.02030877023935318,
-0.048515625298023224,
-0.006723132915794849,
0.037031013518571854,
0.09351969510316849,
0.0577615462243557,
-0.06257268041372299,
-0.1327303647994995,
-0.18342435359954834,
0.07349806278944016,
0.2224860042333603,
0.04268667846918106,
0.07864411175251007,
-0.04253774136304855,
0.012905243784189224,
-0.09399592876434326,
0.06816480308771133,
0.05360276252031326,
0.03234872594475746,
-0.1434684842824936,
0.17165511846542358,
0.013135883957147598,
-0.0012206180253997445,
-0.007033341098576784,
-0.0032996363006532192,
-0.1994268298149109,
0.005422833375632763,
-0.157767653465271,
0.017240257933735847,
0.011190168559551239,
-0.013622041791677475,
0.04741433262825012,
-0.054372455924749374,
-0.06280400604009628,
0.04363870620727539,
-0.09574418514966965,
-0.045612312853336334,
0.0539686381816864,
0.05494438484311104,
-0.13122569024562836,
-0.07611193507909775,
0.029772238805890083,
-0.12477824091911316,
0.039085812866687775,
0.05564790591597557,
-0.004879990126937628,
0.0419081412255764,
-0.10040514171123505,
0.02822016179561615,
0.04680035263299942,
-0.000609269947744906,
0.03550192713737488,
-0.18274186551570892,
0.01035156100988388,
-0.04091866686940193,
0.039743177592754364,
0.029363909736275673,
0.08637016266584396,
-0.08764082193374634,
-0.04608390852808952,
-0.002444756682962179,
-0.03754869103431702,
-0.04333629831671715,
0.0486130490899086,
0.1313147395849228,
-0.01421775296330452,
0.17081080377101898,
-0.12552842497825623,
0.02512562833726406,
-0.18856662511825562,
-0.0013776244595646858,
0.009369010105729103,
-0.09371260553598404,
-0.08502788841724396,
-0.014874275773763657,
0.10725170373916626,
-0.0859588235616684,
0.08664829283952713,
-0.0631578117609024,
0.07934767007827759,
0.03966396674513817,
-0.07581689208745956,
-0.08309027552604675,
0.08991879224777222,
0.17922824621200562,
0.039414674043655396,
-0.013822066597640514,
0.029017210006713867,
-0.0365305133163929,
0.0750625729560852,
0.08731848746538162,
0.2045195996761322,
0.130188450217247,
0.04452182725071907,
0.11200354993343353,
0.08380245417356491,
-0.06534116715192795,
-0.06343849003314972,
0.14810585975646973,
-0.0760326087474823,
0.1629747897386551,
-0.05141494423151016,
0.09645134210586548,
0.04155530035495758,
-0.17294296622276306,
0.022900594398379326,
-0.08863318711519241,
-0.10364680737257004,
-0.11903077363967896,
-0.10887978225946426,
-0.0716618075966835,
-0.09787076711654663,
0.004714681766927242,
-0.09341849386692047,
0.021260535344481468,
0.09241436421871185,
0.03642969951033592,
0.0020577111281454563,
0.09793314337730408,
-0.03081444837152958,
0.040960025042295456,
0.10073639452457428,
-0.0025204604025930166,
-0.021848751232028008,
-0.028129244223237038,
-0.07969192415475845,
0.05910371243953705,
-0.015505542047321796,
0.03663354367017746,
0.01179439015686512,
-0.011697331443428993,
0.05397331714630127,
-0.007625412195920944,
-0.10361064225435257,
0.06509119272232056,
0.012480318546295166,
0.022392287850379944,
0.07078518718481064,
0.04632589966058731,
-0.005813294090330601,
-0.027328146621584892,
0.10835498571395874,
-0.09235243499279022,
-0.03226448595523834,
-0.1522633284330368,
0.27869129180908203,
-0.04005507379770279,
0.03321097791194916,
0.017964566126465797,
-0.058100923895835876,
-0.0491904653608799,
0.1902429163455963,
0.13188892602920532,
-0.05585431680083275,
-0.022810237482190132,
0.07025044411420822,
-0.01682652160525322,
-0.05032225698232651,
0.13084566593170166,
0.06664630770683289,
-0.033444151282310486,
-0.05314461514353752,
-0.038459040224552155,
-0.014691749587655067,
-0.02833615615963936,
-0.04166875407099724,
0.06275171041488647,
0.022986382246017456,
-0.008950361981987953,
-0.027935251593589783,
0.06122361868619919,
-0.07018478959798813,
-0.17170701920986176,
0.13020449876785278,
-0.17965242266654968,
-0.15934725105762482,
-0.0164350438863039,
0.025548195466399193,
0.0009353391942568123,
0.06889359652996063,
-0.02468891814351082,
-0.011458606459200382,
0.14116251468658447,
-0.03664097562432289,
-0.010768046602606773,
-0.11210283637046814,
0.07224111258983612,
-0.05940655991435051,
0.1714792102575302,
-0.02283560112118721,
0.055439431220293045,
0.13437704741954803,
0.011455376632511616,
-0.0749516487121582,
0.03280709683895111,
0.08187247812747955,
-0.12020552903413773,
-0.020304391160607338,
0.09789939969778061,
-0.027869965881109238,
0.13906964659690857,
0.055996108800172806,
-0.0960981696844101,
0.031832028180360794,
-0.06179073825478554,
-0.09005337953567505,
-0.02783445455133915,
-0.047890618443489075,
-0.07412514835596085,
0.11897081136703491,
0.2499912828207016,
-0.038426853716373444,
0.010634941048920155,
-0.03154546767473221,
-0.0012748048175126314,
0.06080436706542969,
0.035360027104616165,
-0.056899767369031906,
-0.23192155361175537,
0.06928516179323196,
0.028920330107212067,
0.05093175172805786,
-0.18429024517536163,
-0.09526121616363525,
0.03591776266694069,
-0.02003507874906063,
-0.08365403115749359,
0.09060628712177277,
0.030580250546336174,
0.059567101299762726,
-0.0596616268157959,
-0.12504032254219055,
-0.05637330934405327,
0.1862836629152298,
-0.08982795476913452,
-0.08118247240781784
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Rocketknight1/bert-base-cased-finetuned-wikitext2
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results during training and evaluation:
- Train Loss: 6.3982
- Validation Loss: 6.2664
- Epoch: 1
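Since the objective is masked language modeling, the validation loss corresponds to a perplexity of roughly exp(6.2664) ≈ 527, assuming the reported loss is the usual per-token cross-entropy.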
## Model description
More information needed
## Intended uses & limitations
More information needed
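Pending fuller documentation, a minimal inference sketch for the checkpoint's fill-mask task (the example sentence is illustrative, not from the training data):

    from transformers import pipeline

    # Fill-mask inference with the fine-tuned checkpoint; bert-base-cased
    # tokenizers use the literal [MASK] token.
    unmasker = pipeline(
        "fill-mask",
        model="Rocketknight1/bert-base-cased-finetuned-wikitext2",
    )
    print(unmasker("The capital of France is [MASK]."))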
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
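The 'AdamWeightDecay' entry maps onto the TensorFlow optimizer shipped with transformers; a sketch of the equivalent construction (the exclude_from_weight_decay list is a common convention and an assumption here, not recorded on the card):

    from transformers import AdamWeightDecay  # TF optimizer; requires TensorFlow

    # Rebuild the optimizer from the hyperparameters above (sketch only).
    optimizer = AdamWeightDecay(
        learning_rate=2e-5,
        weight_decay_rate=0.01,
        beta_1=0.9,
        beta_2=0.999,
        epsilon=1e-7,
        amsgrad=False,
        # Assumed, not recorded on the card: skip decay for norms/biases.
        exclude_from_weight_decay=["LayerNorm", "layer_norm", "bias"],
    )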
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.0679 | 6.4768 | 0 |
| 6.3982 | 6.2664 | 1 |
### Framework versions
- Transformers 4.21.0.dev0
- TensorFlow 2.9.1
- Datasets 2.3.3.dev0
- Tokenizers 0.11.0
| {"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "Rocketknight1/bert-base-cased-finetuned-wikitext2", "results": []}]} | fill-mask | Rocketknight1/bert-base-cased-finetuned-wikitext2 | [
"transformers",
"tf",
"tensorboard",
"bert",
"fill-mask",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #tensorboard #bert #fill-mask #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| Rocketknight1/bert-base-cased-finetuned-wikitext2
=================================================
This model is a fine-tuned version of bert-base-cased on an unknown dataset.
It achieves the following results during training and evaluation:
* Train Loss: 6.3982
* Validation Loss: 6.2664
* Epoch: 1
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'AdamWeightDecay', 'learning\_rate': 2e-05, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\_decay\_rate': 0.01}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.21.0.dev0
* TensorFlow 2.9.1
* Datasets 2.3.3.dev0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': 2e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.3.3.dev0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #tf #tensorboard #bert #fill-mask #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': 2e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.3.3.dev0\n* Tokenizers 0.11.0"
] | [
58,
118,
4,
37
] | [
"passage: TAGS\n#transformers #tf #tensorboard #bert #fill-mask #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': 2e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.3.3.dev0\n* Tokenizers 0.11.0"
] | [
-0.06221121549606323,
0.06061618775129318,
-0.0037965967785567045,
0.09442359209060669,
0.1361091583967209,
0.04179688170552254,
0.14105869829654694,
0.14693419635295868,
-0.09545431286096573,
0.07705992460250854,
0.14540231227874756,
0.15201760828495026,
0.034328944981098175,
0.13513192534446716,
-0.08727990090847015,
-0.17171229422092438,
0.037790682166814804,
0.009312917478382587,
-0.10518446564674377,
0.08221855014562607,
0.07969046384096146,
-0.08363683521747589,
0.10612686723470688,
0.005192311014980078,
-0.1756276935338974,
0.04868483543395996,
0.10352151840925217,
-0.08792011439800262,
0.1217375248670578,
0.06921663880348206,
0.10375639796257019,
0.0001674895902397111,
0.03249279037117958,
-0.15389224886894226,
0.01138625293970108,
0.11125573515892029,
-0.02262907288968563,
0.08459235727787018,
0.009948666207492352,
0.010427827946841717,
0.09192865341901779,
-0.12195683270692825,
0.05072147399187088,
0.022268913686275482,
-0.11376886814832687,
-0.24849388003349304,
-0.10694199800491333,
0.013616169802844524,
0.03364041447639465,
0.07023783773183823,
0.01298382505774498,
0.20633117854595184,
-0.011070998385548592,
0.09458362311124802,
0.17813533544540405,
-0.3504534065723419,
-0.05520302429795265,
0.041916582733392715,
0.006977349054068327,
0.02056030184030533,
-0.0638672262430191,
0.04864775016903877,
0.052743855863809586,
0.04727715626358986,
0.10626884549856186,
-0.036896731704473495,
-0.04985116049647331,
-0.011546841822564602,
-0.10331477969884872,
-0.021627919748425484,
0.11204715818166733,
0.031972736120224,
-0.06229721009731293,
-0.03854382038116455,
-0.07906533032655716,
-0.14184454083442688,
-0.013086550869047642,
-0.08276256918907166,
0.030662288889288902,
0.012901692651212215,
-0.08265385776758194,
-0.056071966886520386,
-0.08941362798213959,
-0.06490925699472427,
-0.0982503667473793,
0.15822595357894897,
0.015788370743393898,
0.05440888926386833,
-0.02643420547246933,
0.06137046962976456,
-0.07248374819755554,
-0.13674397766590118,
0.006106208078563213,
0.02338082157075405,
-0.060457903891801834,
-0.04013453423976898,
-0.08624865114688873,
-0.14196020364761353,
0.06140882521867752,
0.14087343215942383,
-0.05334896221756935,
0.07213729619979858,
-0.07962426543235779,
0.02908506616950035,
-0.11597316712141037,
0.13907861709594727,
-0.03857390210032463,
-0.024307118728756905,
0.045446157455444336,
0.025864137336611748,
0.07003980875015259,
-0.03325670585036278,
-0.08452034741640091,
-0.0035029032733291388,
0.10926038771867752,
-0.005181712098419666,
-0.040073931217193604,
0.0975487157702446,
-0.05355028435587883,
-0.0008229311788454652,
0.0017297150334343314,
-0.09187967330217361,
0.03333820775151253,
-0.017510758712887764,
-0.06128598004579544,
0.0066132377833127975,
0.09476754069328308,
-0.005766783840954304,
-0.027647484093904495,
0.06394589692354202,
-0.08533336967229843,
-0.032590292394161224,
-0.08866579085588455,
-0.12861260771751404,
0.04530642181634903,
-0.07666710764169693,
-0.0206385999917984,
-0.09151912480592728,
-0.19338862597942352,
0.008773235604166985,
0.07199353724718094,
-0.04523281753063202,
0.000703299418091774,
-0.025828350335359573,
-0.1405191719532013,
0.051298294216394424,
-0.0031818810384720564,
0.13811221718788147,
-0.042095329612493515,
0.0750056579709053,
0.012519854120910168,
0.07805900275707245,
-0.10601487010717392,
0.03212493285536766,
-0.05069934204220772,
0.010377716273069382,
-0.2191329151391983,
0.0440058596432209,
-0.06296385079622269,
0.012966248206794262,
-0.1361650973558426,
-0.05668952688574791,
-0.03357705473899841,
0.017584873363375664,
0.10754650831222534,
0.0977889820933342,
-0.18951499462127686,
-0.061841897666454315,
0.12297145277261734,
-0.1087842583656311,
-0.09914357960224152,
0.12696826457977295,
-0.04958118870854378,
0.037867214530706406,
0.0720536857843399,
0.08563999831676483,
0.003045877441763878,
-0.12544973194599152,
0.005729699041694403,
-0.02211901545524597,
-0.03166862204670906,
-0.005798959173262119,
0.025047844275832176,
-0.025606656447052956,
-0.03486912325024605,
0.005711440462619066,
-0.012430321425199509,
0.02945329062640667,
-0.05881933867931366,
-0.067450612783432,
-0.06606163829565048,
-0.056355737149715424,
0.07631858438253403,
0.0244708564132452,
0.06628503650426865,
-0.08124798536300659,
-0.14357097446918488,
0.08494876325130463,
0.03549034893512726,
-0.04126932844519615,
0.04556968808174133,
-0.09386087954044342,
0.04614682495594025,
-0.05968039855360985,
0.001213599811308086,
-0.1810927391052246,
-0.07073882222175598,
0.024023544043302536,
0.01835748180747032,
0.03009226731956005,
0.027233703061938286,
0.09936153143644333,
0.015733817592263222,
-0.080038882791996,
0.012277520261704922,
-0.019728941842913628,
0.01658742129802704,
-0.08935831487178802,
-0.25084465742111206,
0.00026836918550543487,
-0.0407252162694931,
0.06087931990623474,
-0.2086356282234192,
0.007507141679525375,
0.056150440126657486,
0.12034913152456284,
0.05993395298719406,
-0.0031083151698112488,
-0.05637848377227783,
0.037340499460697174,
-0.02841406688094139,
-0.06859306246042252,
0.023580048233270645,
0.027639906853437424,
-0.13403922319412231,
0.006180816795676947,
-0.15771472454071045,
0.14511635899543762,
0.15298259258270264,
-0.08842191100120544,
-0.11865189671516418,
0.06295750290155411,
-0.028750145807862282,
-0.04300427436828613,
-0.014662940986454487,
-0.019926855340600014,
0.13415443897247314,
0.01715916395187378,
0.12318164855241776,
-0.054333556443452835,
-0.04381801187992096,
0.04327917471528053,
-0.010963041335344315,
-0.032161466777324677,
0.03591161593794823,
0.041156280785799026,
-0.1374398022890091,
0.11566553264856339,
0.14379242062568665,
-0.12725327908992767,
0.133484348654747,
-0.038930706679821014,
-0.07630153745412827,
-0.06117841973900795,
0.025291195139288902,
0.04216929152607918,
0.11188165843486786,
-0.09718602895736694,
0.0192610714584589,
0.023883063346147537,
0.012461988255381584,
-0.008679021149873734,
-0.19641907513141632,
0.003048873273655772,
-0.006993644405156374,
-0.06210692971944809,
0.03127641975879669,
0.04213248938322067,
0.014128084294497967,
0.1189875528216362,
0.035881105810403824,
-0.04893777146935463,
0.09407069534063339,
-0.0058996002189815044,
-0.07442547380924225,
0.19607627391815186,
-0.1241249367594719,
-0.14164415001869202,
-0.13202418386936188,
-0.06225411221385002,
-0.06157239153981209,
0.010592029429972172,
0.017876701429486275,
-0.08127693086862564,
-0.0654420480132103,
-0.05462553724646568,
-0.0008096402161754668,
0.015112444758415222,
0.08260496705770493,
0.050722938030958176,
-0.029443284496665,
0.13502101600170135,
-0.09790477901697159,
-0.038499049842357635,
-0.03150425851345062,
-0.04879439249634743,
0.025504477322101593,
0.042985934764146805,
0.04940995201468468,
0.10251636058092117,
-0.030517134815454483,
0.021949198096990585,
-0.04825862497091293,
0.19951412081718445,
-0.05377357825636864,
0.021828653290867805,
0.14284254610538483,
-0.04873344674706459,
0.06097463145852089,
0.1174459382891655,
0.04776572808623314,
-0.09928105771541595,
0.038100454956293106,
0.0849677249789238,
-0.03166092187166214,
-0.223215252161026,
-0.01401481218636036,
-0.0381719134747982,
-0.08988700062036514,
0.04301755875349045,
0.0547754243016243,
0.15060292184352875,
0.02493719570338726,
0.03578874096274376,
0.12226375937461853,
-0.011547526344656944,
0.06500866264104843,
0.16647295653820038,
0.057888008654117584,
0.11397187411785126,
-0.042872149497270584,
-0.031088562682271004,
0.06487351655960083,
-0.007835937663912773,
0.1927814930677414,
0.028571128845214844,
0.08016498386859894,
0.07571002095937729,
0.07429717481136322,
-0.023453472182154655,
0.03525057062506676,
0.0004233546496834606,
-0.037701211869716644,
-0.011343502439558506,
-0.07323502749204636,
-0.018200580030679703,
0.05253654718399048,
-0.06217328459024429,
0.07573932409286499,
-0.06308753043413162,
0.06014494597911835,
0.06671237200498581,
0.26257145404815674,
0.06551463901996613,
-0.32843083143234253,
-0.09613130986690521,
0.0003417861007619649,
-0.02737277001142502,
-0.03183598443865776,
-0.030534056946635246,
0.10057023167610168,
-0.045246902853250504,
0.10345928370952606,
-0.10278920084238052,
0.04121166095137596,
0.012288733385503292,
0.04542705789208412,
0.09350652247667313,
0.13558657467365265,
0.002186404075473547,
0.001850454369559884,
-0.3322782814502716,
0.2786402702331543,
0.049319684505462646,
0.12745971977710724,
-0.08962169289588928,
0.048787813633680344,
0.054208654910326004,
0.016994474455714226,
0.09049513936042786,
-0.03331485763192177,
-0.0923599898815155,
-0.11505507677793503,
-0.055874209851026535,
0.016233615577220917,
0.11431154608726501,
0.055913668125867844,
0.08853784203529358,
-0.04805739223957062,
0.003882765304297209,
0.11000902205705643,
0.026065874844789505,
-0.13210541009902954,
-0.04680788889527321,
0.0426906980574131,
0.06087400019168854,
-0.09760645031929016,
-0.0707627609372139,
-0.10545875132083893,
-0.09430018067359924,
0.20580953359603882,
-0.06001032516360283,
-0.02222844772040844,
-0.1374807506799698,
0.1332259178161621,
0.09046149253845215,
-0.06711095571517944,
0.059726208448410034,
-0.0010928995907306671,
0.06777673959732056,
0.04874081909656525,
-0.12718261778354645,
0.13606232404708862,
-0.034718938171863556,
-0.1718684881925583,
-0.06105842813849449,
0.07426783442497253,
0.017819030210375786,
0.04811646044254303,
-0.007486782036721706,
0.04973994567990303,
0.044138386845588684,
-0.08465632051229477,
0.08916270732879639,
-0.006588292308151722,
0.045950647443532944,
0.005018073599785566,
-0.032033395022153854,
-0.04444334656000137,
-0.01246164832264185,
0.017556369304656982,
0.12635557353496552,
0.24794569611549377,
-0.08841918408870697,
0.028134668245911598,
0.012085248716175556,
-0.06608879566192627,
-0.22823446989059448,
0.08898276090621948,
0.06405122578144073,
0.011620416305959225,
-0.004680836573243141,
-0.142100989818573,
0.10076890140771866,
0.07112814486026764,
-0.021784614771604538,
0.09937053173780441,
-0.2671316862106323,
-0.15646959841251373,
0.09387184679508209,
0.12913082540035248,
0.1580362766981125,
-0.1535436064004898,
-0.04477318003773689,
-0.04075385630130768,
-0.0337805338203907,
0.1403800994157791,
-0.14975738525390625,
0.09406682848930359,
0.01674184575676918,
0.052366457879543304,
-0.003079063491895795,
-0.04961574077606201,
0.09310655295848846,
-0.04258089140057564,
0.10518015921115875,
-0.049542803317308426,
-0.03053782507777214,
0.12295329570770264,
-0.05730942636728287,
0.03296918049454689,
-0.06437107175588608,
0.01749938726425171,
-0.046114400029182434,
0.007107391022145748,
-0.07895149290561676,
0.07140929251909256,
-0.03555535525083542,
-0.049726374447345734,
-0.006446225568652153,
0.045915279537439346,
0.051248837262392044,
-0.04889242723584175,
0.13005994260311127,
0.012114346027374268,
0.15232141315937042,
0.17008084058761597,
0.07721494883298874,
-0.025330234318971634,
0.010947419330477715,
0.07014812529087067,
-0.052276451140642166,
0.08282562345266342,
-0.16516168415546417,
0.05248456075787544,
0.09375576674938202,
-0.009275155141949654,
0.13709355890750885,
0.07013044506311417,
-0.06720597296953201,
0.012169302441179752,
0.08666782826185226,
-0.16065925359725952,
-0.0960652083158493,
0.028012637048959732,
-0.07735560089349747,
-0.09377986937761307,
0.05279906094074249,
0.15333996713161469,
-0.03368092328310013,
0.03646198287606239,
0.034310370683670044,
0.008771764114499092,
-0.0861232653260231,
0.15530645847320557,
0.02088475041091442,
0.02140447497367859,
-0.0766923800110817,
0.11086118221282959,
0.01445271261036396,
-0.0947415828704834,
0.08777809143066406,
0.01990274339914322,
-0.06755278259515762,
-0.012449387460947037,
0.058748479932546616,
0.16462796926498413,
-0.0008233428234234452,
-0.05518927052617073,
-0.13225071132183075,
-0.10947714000940323,
0.06238352879881859,
0.25962305068969727,
0.0622045136988163,
0.04545426368713379,
-0.0562659427523613,
0.021604038774967194,
-0.08067552000284195,
0.07826691120862961,
0.04823053628206253,
0.05896233767271042,
-0.15059858560562134,
0.1713845133781433,
-0.014925410971045494,
0.01335956808179617,
-0.045180682092905045,
0.05412089079618454,
-0.1415582150220871,
-0.023520145565271378,
-0.13458943367004395,
0.0005525349406525493,
-0.004908763337880373,
-0.02280457317829132,
0.02701413445174694,
-0.07782257348299026,
-0.09600555151700974,
0.037966933101415634,
-0.09378671646118164,
-0.03616713359951973,
0.07195279002189636,
0.019056526944041252,
-0.1339217722415924,
-0.07154577970504761,
0.008556023240089417,
-0.07321315258741379,
0.0648529902100563,
0.04992814362049103,
0.003234652103856206,
0.06292181462049484,
-0.11701279133558273,
0.014747089706361294,
0.10077846795320511,
-0.004744081757962704,
0.06628422439098358,
-0.10492495447397232,
0.0038969956804066896,
-0.018201759085059166,
0.06105223670601845,
0.022982997819781303,
0.13583295047283173,
-0.0920100137591362,
-0.018390802666544914,
-0.05285775288939476,
-0.04001302644610405,
-0.044366054236888885,
0.03643687441945076,
0.1507761925458908,
0.002165089128538966,
0.18926696479320526,
-0.12116669863462448,
-0.01050128135830164,
-0.1887015402317047,
0.031854450702667236,
-0.0012263762764632702,
-0.0971020981669426,
-0.10388487577438354,
-0.008858051151037216,
0.08449719846248627,
-0.08478938788175583,
0.10337597131729126,
-0.031137049198150635,
0.05813764035701752,
0.058200426399707794,
-0.062151409685611725,
-0.06185607239603996,
0.033034179359674454,
0.1931987702846527,
0.03171740472316742,
-0.02241019904613495,
0.026624027639627457,
0.007480140309780836,
0.09910345822572708,
0.06356252729892731,
0.23147322237491608,
0.12976184487342834,
0.0012731883907690644,
0.14734433591365814,
0.07825001329183578,
-0.04565631225705147,
-0.12843629717826843,
0.13090412318706512,
-0.10392068326473236,
0.14652366936206818,
-0.03488608077168465,
0.10027395188808441,
0.08219919353723526,
-0.15829701721668243,
0.01820460334420204,
-0.03648100420832634,
-0.06816250830888748,
-0.14319683611392975,
-0.09927018731832504,
-0.0989837497472763,
-0.11779379844665527,
0.007825520820915699,
-0.09163561463356018,
0.042273953557014465,
0.026027435436844826,
0.03000161424279213,
-0.021135689690709114,
0.15996132791042328,
-0.06348040699958801,
0.024952305480837822,
0.07864232361316681,
0.003944178111851215,
-0.02210240811109543,
-0.04196026176214218,
-0.07382192462682724,
0.0201571024954319,
0.03425229340791702,
0.01858515664935112,
-0.024427758529782295,
0.009200921282172203,
0.055497247725725174,
-0.03666931390762329,
-0.09856605529785156,
0.036080215126276016,
0.040711142122745514,
0.018839156255126,
0.050870008766651154,
0.0435105636715889,
0.008318976499140263,
-0.02369452826678753,
0.12521322071552277,
-0.10746310651302338,
-0.048029348254203796,
-0.17573857307434082,
0.2309306561946869,
-0.00371074047870934,
0.031202755868434906,
0.0015685961116105318,
-0.08276497572660446,
-0.050087086856365204,
0.19592328369617462,
0.1673073023557663,
-0.052909430116415024,
0.0013745357282459736,
0.06258359551429749,
-0.007171420846134424,
-0.06959415227174759,
0.12201850861310959,
0.07004072517156601,
-0.011546541005373001,
-0.05207113176584244,
-0.08838839083909988,
-0.031118419021368027,
-0.011750408448278904,
-0.030641088262200356,
0.053617771714925766,
0.02584736980497837,
-0.006840271409600973,
-0.012417774647474289,
0.0395827442407608,
-0.021155547350645065,
-0.15347982943058014,
0.06607349216938019,
-0.21270795166492462,
-0.1549614518880844,
0.00829789973795414,
0.01626546122133732,
-0.02837451919913292,
0.05543231964111328,
-0.025897126644849777,
-0.0010722394799813628,
0.0895053967833519,
-0.03218303248286247,
-0.05236310884356499,
-0.06899438798427582,
0.06742610782384872,
-0.06333775073289871,
0.18071076273918152,
-0.019004138186573982,
0.04364081472158432,
0.1373521089553833,
0.05444996431469917,
-0.08550010621547699,
0.04210316762328148,
0.03511124476790428,
-0.11293445527553558,
-0.006050398573279381,
0.09838966280221939,
-0.03425014764070511,
0.11131203174591064,
0.05996061488986015,
-0.07922032475471497,
0.020562876015901566,
-0.10452285408973694,
-0.09928513318300247,
-0.04582188278436661,
-0.03775740787386894,
-0.09963922202587128,
0.1232241541147232,
0.21983295679092407,
-0.004848747979849577,
0.03902226686477661,
-0.06169332563877106,
0.012202535755932331,
0.07536538690328598,
0.004628750495612621,
-0.08218322694301605,
-0.21534965932369232,
0.07707931846380234,
0.054729584604501724,
0.007256114389747381,
-0.21320830285549164,
-0.09990549832582474,
0.012784409336745739,
-0.028421062976121902,
-0.11497684568166733,
0.08169647306203842,
0.07061704248189926,
0.05809919536113739,
-0.043143730610609055,
-0.136624276638031,
-0.0359191969037056,
0.13899263739585876,
-0.14687016606330872,
-0.07792889326810837
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Rocketknight1/bert-base-uncased-finetuned-swag
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results during training and evaluation:
- Train Loss: 0.8360
- Train Accuracy: 0.6631
- Validation Loss: 0.5885
- Validation Accuracy: 0.7706
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
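Pending fuller documentation, a minimal inference sketch for the multiple-choice head (the prompt and choices below are illustrative, in the style of SWAG):

    import tensorflow as tf
    from transformers import AutoTokenizer, TFAutoModelForMultipleChoice

    ckpt = "Rocketknight1/bert-base-uncased-finetuned-swag"
    tokenizer = AutoTokenizer.from_pretrained(ckpt)
    model = TFAutoModelForMultipleChoice.from_pretrained(ckpt)

    prompt = "A man is sitting at a piano."
    choices = ["He plays a song.", "He eats a sandwich."]

    # Encode one (prompt, choice) pair per candidate, then add a batch
    # axis so inputs have shape (batch, num_choices, seq_len).
    enc = tokenizer([prompt] * len(choices), choices, padding=True, return_tensors="tf")
    inputs = {k: tf.expand_dims(v, 0) for k, v in enc.items()}
    logits = model(inputs).logits  # shape (1, num_choices)
    print(choices[int(tf.argmax(logits, axis=-1)[0])])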
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5e-05, 'decay_steps': 9192, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
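The schedule above is a linear decay to zero over 9192 steps (PolynomialDecay with power=1.0); a sketch of the equivalent Keras objects:

    import tensorflow as tf

    # Linear decay from 5e-5 to 0 over 9192 steps (power=1.0 makes
    # PolynomialDecay linear); decay_steps normally equals the total
    # number of training steps.
    schedule = tf.keras.optimizers.schedules.PolynomialDecay(
        initial_learning_rate=5e-5,
        decay_steps=9192,
        end_learning_rate=0.0,
        power=1.0,
    )
    optimizer = tf.keras.optimizers.Adam(
        learning_rate=schedule,
        beta_1=0.9,
        beta_2=0.999,
        epsilon=1e-8,
        amsgrad=False,
    )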
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.8360 | 0.6631 | 0.5885 | 0.7706 | 0 |
### Framework versions
- Transformers 4.18.0.dev0
- TensorFlow 2.8.0-rc0
- Datasets 2.0.1.dev0
- Tokenizers 0.11.0
| {"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "Rocketknight1/bert-base-uncased-finetuned-swag", "results": []}]} | multiple-choice | Rocketknight1/bert-base-uncased-finetuned-swag | [
"transformers",
"tf",
"tensorboard",
"bert",
"multiple-choice",
"generated_from_keras_callback",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #tensorboard #bert #multiple-choice #generated_from_keras_callback #license-apache-2.0 #endpoints_compatible #region-us
| Rocketknight1/bert-base-uncased-finetuned-swag
==============================================
This model is a fine-tuned version of bert-base-uncased on an unknown dataset.
It achieves the following results during training and evaluation:
* Train Loss: 0.8360
* Train Accuracy: 0.6631
* Validation Loss: 0.5885
* Validation Accuracy: 0.7706
* Epoch: 0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'Adam', 'learning\_rate': {'class\_name': 'PolynomialDecay', 'config': {'initial\_learning\_rate': 5e-05, 'decay\_steps': 9192, 'end\_learning\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.18.0.dev0
* TensorFlow 2.8.0-rc0
* Datasets 2.0.1.dev0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 5e-05, 'decay\\_steps': 9192, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.18.0.dev0\n* TensorFlow 2.8.0-rc0\n* Datasets 2.0.1.dev0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #tf #tensorboard #bert #multiple-choice #generated_from_keras_callback #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 5e-05, 'decay\\_steps': 9192, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.18.0.dev0\n* TensorFlow 2.8.0-rc0\n* Datasets 2.0.1.dev0\n* Tokenizers 0.11.0"
] | [
51,
178,
4,
42
] | [
"passage: TAGS\n#transformers #tf #tensorboard #bert #multiple-choice #generated_from_keras_callback #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 5e-05, 'decay\\_steps': 9192, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.18.0.dev0\n* TensorFlow 2.8.0-rc0\n* Datasets 2.0.1.dev0\n* Tokenizers 0.11.0"
] | [
-0.07737009972333908,
0.04477966949343681,
-0.005890772677958012,
0.06149232015013695,
0.13767419755458832,
0.04242536425590515,
0.14136946201324463,
0.12337469309568405,
-0.09849175810813904,
0.12930643558502197,
0.14982958137989044,
0.15140539407730103,
0.06334895640611649,
0.10641932487487793,
-0.07580602169036865,
-0.15991489589214325,
0.08226543664932251,
-0.019971288740634918,
-0.06881903856992722,
0.07956364750862122,
0.07120173424482346,
-0.08241880685091019,
0.07981013506650925,
-0.0031294692307710648,
-0.10966959595680237,
0.00958289671689272,
0.07700540125370026,
-0.06748086214065552,
0.1095084547996521,
0.06456629186868668,
0.09006568789482117,
-0.001116265426389873,
0.0064765168353915215,
-0.20160692930221558,
0.01264727208763361,
0.12356805801391602,
-0.00023690103262197226,
0.07657143473625183,
0.009409602731466293,
-0.006319232285022736,
0.1216660812497139,
-0.09379690885543823,
0.0626334398984909,
0.03373291343450546,
-0.12573672831058502,
-0.2814175486564636,
-0.11481967568397522,
0.03215271607041359,
0.06220316141843796,
0.06811071932315826,
0.003268368076533079,
0.1863342970609665,
-0.02451326884329319,
0.09441296011209488,
0.1855853945016861,
-0.29218989610671997,
-0.049791280180215836,
0.024367650970816612,
0.02824166789650917,
0.0358215868473053,
-0.059958845376968384,
0.012257558293640614,
0.02945788763463497,
0.04222547635436058,
0.03212938457727432,
-0.009896676056087017,
0.013006600551307201,
-0.04162734001874924,
-0.07935493439435959,
-0.072006456553936,
0.11705700308084488,
0.03430310636758804,
-0.06883303076028824,
-0.06367133557796478,
-0.038592662662267685,
-0.1518383026123047,
0.008460860699415207,
-0.02092420682311058,
-0.0013057858450338244,
-0.0011507360031828284,
-0.053484875708818436,
-0.019446007907390594,
-0.057016290724277496,
-0.04466669261455536,
-0.003928591497242451,
0.14888989925384521,
0.03995189070701599,
0.056589413434267044,
-0.015381661243736744,
0.059861794114112854,
-0.003965620417147875,
-0.12060144543647766,
-0.025698333978652954,
-0.0050387331284582615,
-0.062093254178762436,
-0.024059049785137177,
-0.06876453012228012,
-0.043171241879463196,
0.06918781250715256,
0.14012324810028076,
-0.06243985891342163,
0.11807101964950562,
-0.035785265266895294,
0.02406204119324684,
-0.10232941806316376,
0.10396439582109451,
-0.03266982361674309,
0.06929484754800797,
-0.00047988301957957447,
0.08154379576444626,
0.053631365299224854,
-0.04208255559206009,
-0.04422002285718918,
0.04537680000066757,
0.10211114585399628,
0.04294288158416748,
-0.047550395131111145,
0.07802953571081161,
-0.06790166348218918,
-0.004907831083983183,
-0.0211909431964159,
-0.10667124390602112,
0.043721262365579605,
0.04209465533494949,
-0.08200421929359436,
0.04241203889250755,
0.0632692351937294,
-0.017794711515307426,
-0.0658676028251648,
0.040589991956949234,
-0.06155838817358017,
-0.026562293991446495,
-0.09929894655942917,
-0.13939715921878815,
0.041630566120147705,
-0.0657285824418068,
-0.02392878569662571,
-0.06473761796951294,
-0.13140662014484406,
-0.03284861147403717,
0.08244208991527557,
-0.036957036703825,
-0.006960224825888872,
-0.06070627272129059,
-0.170899897813797,
0.04959943890571594,
0.004414911847561598,
0.11016444116830826,
-0.04141531512141228,
0.07527539879083633,
-0.006984385196119547,
0.05190825089812279,
-0.024822643026709557,
0.022199038416147232,
-0.02210347354412079,
0.04948553442955017,
-0.179275244474411,
0.07710699737071991,
-0.08209450542926788,
0.03249799460172653,
-0.14929606020450592,
-0.0908982902765274,
0.057760678231716156,
0.006205539684742689,
0.12477537244558334,
0.09394153207540512,
-0.1447223573923111,
-0.05860133096575737,
0.11131345480680466,
-0.09517629444599152,
-0.08767690509557724,
0.08689186722040176,
-0.059239454567432404,
0.0039910003542900085,
0.07850588113069534,
0.06602544337511063,
0.03859385848045349,
-0.12670758366584778,
0.007738671265542507,
-0.0559801310300827,
0.005665957927703857,
0.035075243562459946,
0.01980516128242016,
-0.024929694831371307,
-0.07220781594514847,
0.010833117179572582,
-0.010617095977067947,
0.009064778685569763,
-0.07368592172861099,
-0.05386830121278763,
-0.03947532922029495,
-0.04854441061615944,
0.011520511470735073,
0.03139670565724373,
0.0314938984811306,
-0.10859836637973785,
-0.14591988921165466,
0.07181309163570404,
0.05360952764749527,
-0.08295095711946487,
0.031047269701957703,
-0.08720734715461731,
0.04827798530459404,
0.016503358259797096,
0.016557466238737106,
-0.1631968915462494,
-0.07625295966863632,
0.03201274201273918,
-0.029945315793156624,
-0.006800176575779915,
-0.01880037784576416,
0.07152417302131653,
0.03196747973561287,
-0.041324447840452194,
0.001990768825635314,
-0.021075217053294182,
0.014398323372006416,
-0.05997925624251366,
-0.23314040899276733,
-0.03576112166047096,
-0.026561282575130463,
0.040274478495121,
-0.2627037763595581,
0.012957993894815445,
0.09170859307050705,
0.12522204220294952,
0.026846664026379585,
-0.034108854830265045,
-0.02834222838282585,
0.054430004209280014,
-0.019454827532172203,
-0.06906956434249878,
0.029409419745206833,
0.028991824015975,
-0.11522915214300156,
-0.03513457998633385,
-0.18181262910366058,
0.0940510556101799,
0.1226191595196724,
-0.06859539449214935,
-0.1299724578857422,
0.06575721502304077,
-0.03216780349612236,
-0.024991774931550026,
-0.018293125554919243,
-0.0012269753497093916,
0.1418621987104416,
0.04944543167948723,
0.12371480464935303,
-0.05090082064270973,
-0.020369939506053925,
0.05166026949882507,
-0.013462768867611885,
-0.029114844277501106,
0.1316683441400528,
-0.028662709519267082,
-0.09098291397094727,
0.0785621777176857,
0.06369542330503464,
-0.09520243853330612,
0.07917273044586182,
-0.06648276746273041,
-0.052787721157073975,
-0.08622559905052185,
0.07615473121404648,
0.07101266831159592,
0.09846595674753189,
-0.08253531157970428,
0.007004275918006897,
0.012411944568157196,
0.014211255125701427,
-0.03371249884366989,
-0.19564077258110046,
0.01206059381365776,
0.0026502422988414764,
-0.06227796897292137,
0.014240598306059837,
0.006838320288807154,
0.015367617830634117,
0.10878414660692215,
0.021039633080363274,
-0.02395062893629074,
0.07257045805454254,
-0.0295074712485075,
-0.08244728296995163,
0.22870919108390808,
-0.12586140632629395,
-0.11253023892641068,
-0.12267306447029114,
-0.029046330600976944,
-0.050480183213949203,
-0.015202009119093418,
0.002325052861124277,
-0.08362669497728348,
-0.06661447137594223,
-0.066585972905159,
-0.017376979812979698,
-0.03342485427856445,
0.037326179444789886,
0.04318290203809738,
-0.007993647828698158,
0.1414763480424881,
-0.1112075224518776,
-0.033696249127388,
-0.009777056984603405,
-0.057282645255327225,
0.0156355332583189,
0.02180335856974125,
-0.0047143129631876945,
0.0978248119354248,
-0.004595668520778418,
0.0367724783718586,
-0.036953605711460114,
0.22774580121040344,
-0.0380057618021965,
-0.012799839489161968,
0.11734136939048767,
-0.030313480645418167,
0.08145712316036224,
0.11167578399181366,
0.05837390199303627,
-0.10988878458738327,
0.034307338297367096,
0.06572981923818588,
-0.028659839183092117,
-0.24973757565021515,
0.004452328197658062,
-0.029576126486063004,
-0.0960746556520462,
0.06686839461326599,
0.04145229235291481,
0.1180621087551117,
0.019245514646172523,
-0.0035219076089560986,
0.08652251958847046,
0.03089158423244953,
0.07920580357313156,
0.16870851814746857,
0.061816632747650146,
0.08642623573541641,
-0.04586276412010193,
0.004201551433652639,
0.022238658741116524,
-0.020146401599049568,
0.22594629228115082,
0.005475497804582119,
0.10072008520364761,
0.0769391804933548,
0.07575920969247818,
-0.026920966804027557,
0.026662271469831467,
-0.005695647560060024,
0.0019664489664137363,
0.020636579021811485,
-0.06610523909330368,
-0.03815966472029686,
0.03698360547423363,
-0.023290781304240227,
0.07595307379961014,
-0.0969000980257988,
0.018399378284811974,
0.05443256348371506,
0.2536074221134186,
0.09381229430437088,
-0.29870864748954773,
-0.1194259524345398,
-0.0049772802740335464,
-0.030889052897691727,
-0.04908745735883713,
-0.014045669697225094,
0.09842706471681595,
-0.07832925766706467,
0.07736366987228394,
-0.08209112286567688,
0.05220743641257286,
-0.06912322342395782,
0.040985073894262314,
0.10039213299751282,
0.12255752086639404,
0.0017465917626395822,
0.017652001231908798,
-0.3286757469177246,
0.2879196107387543,
0.035844068974256516,
0.14869391918182373,
-0.08124149590730667,
0.040364790707826614,
0.029070833697915077,
-0.048633500933647156,
0.06664944440126419,
-0.008922280743718147,
-0.1649247258901596,
-0.19364115595817566,
-0.046629201620817184,
-0.0018518595024943352,
0.1375826597213745,
0.04159316048026085,
0.11330930143594742,
-0.03430253639817238,
0.010281724855303764,
0.06707170605659485,
-0.0292828306555748,
-0.1443454772233963,
-0.06144534796476364,
0.04856694117188454,
0.040822047740221024,
-0.04363866150379181,
-0.0854438915848732,
-0.09570551663637161,
-0.05481194704771042,
0.16395743191242218,
-0.13232001662254333,
-0.05359315499663353,
-0.1285238415002823,
0.07528909295797348,
0.10099576413631439,
-0.06132562458515167,
0.026903286576271057,
-0.018891362473368645,
0.07913549244403839,
0.030107498168945312,
-0.08405660837888718,
0.11941082030534744,
-0.020740291103720665,
-0.2245514839887619,
-0.05445603281259537,
0.11000753939151764,
0.051458291709423065,
0.0299303587526083,
-0.02011517807841301,
0.06886317580938339,
0.04375382885336876,
-0.109723299741745,
0.08503854274749756,
0.01754673384130001,
0.03350841999053955,
0.0349847637116909,
-0.014071046374738216,
0.02133817970752716,
-0.04095255583524704,
0.01609007455408573,
0.0851176306605339,
0.2994690537452698,
-0.0717824175953865,
0.02767217345535755,
0.027842361479997635,
-0.07893788069486618,
-0.1828567087650299,
0.05530796945095062,
0.08969226479530334,
0.009335359558463097,
-0.06874572485685349,
-0.18911471962928772,
0.04103458300232887,
0.08001627027988434,
-0.014046200551092625,
0.08251776546239853,
-0.2985441982746124,
-0.13847778737545013,
0.06899580359458923,
0.14213183522224426,
0.12310157716274261,
-0.17336052656173706,
-0.05300627648830414,
-0.04674272611737251,
-0.03346167132258415,
0.14952795207500458,
-0.06463810056447983,
0.1062929779291153,
0.01988816261291504,
0.03462366387248039,
0.011103975586593151,
-0.05119439586997032,
0.1377553641796112,
-0.031116867437958717,
0.09223433583974838,
-0.04478297755122185,
-0.039697352796792984,
0.13165995478630066,
-0.08582383394241333,
0.02763688750565052,
-0.06053818017244339,
0.023202311247587204,
-0.11394801735877991,
0.0010669201146811247,
-0.0914120003581047,
0.03878200054168701,
-0.06333326548337936,
0.001611610408872366,
-0.02159547060728073,
0.047640495002269745,
0.08559209853410721,
-0.018776709213852882,
0.11719553917646408,
0.008241171017289162,
0.16225361824035645,
0.16572538018226624,
0.0683349147439003,
0.031046412885189056,
-0.04760510101914406,
0.08236069232225418,
-0.029431790113449097,
0.0774121806025505,
-0.15985259413719177,
0.039042215794324875,
0.13130278885364532,
0.0013554218458011746,
0.1451139897108078,
0.0654408410191536,
-0.0984354093670845,
0.030774105340242386,
0.04388757050037384,
-0.13127531111240387,
-0.12356510758399963,
0.03631145507097244,
0.037600964307785034,
-0.08580230176448822,
0.03873291611671448,
0.14205725491046906,
-0.044214315712451935,
0.026346400380134583,
0.005740644410252571,
0.0363069586455822,
-0.06079895421862602,
0.13566121459007263,
0.021816745400428772,
0.07873444259166718,
-0.08097462356090546,
0.11605240404605865,
0.05080379918217659,
-0.12297724932432175,
0.1164364442229271,
0.026532355695962906,
-0.06374263018369675,
-0.0008274347637780011,
0.04421118646860123,
0.1046951487660408,
0.05240625888109207,
-0.06590194255113602,
-0.13229027390480042,
-0.16407011449337006,
0.08271656185388565,
0.2306426614522934,
0.04366621747612953,
0.0753573402762413,
-0.040989965200424194,
-0.002267161151394248,
-0.09734830260276794,
0.06601272523403168,
0.05355174094438553,
0.03370111808180809,
-0.1302710920572281,
0.16262617707252502,
0.008891436271369457,
-0.005644901655614376,
-0.0035393869038671255,
0.00018287560669705272,
-0.19862079620361328,
0.005672350060194731,
-0.1509983390569687,
0.02688748762011528,
-0.0038907118141651154,
-0.016327334567904472,
0.042215168476104736,
-0.054037101566791534,
-0.05850825086236,
0.046567462384700775,
-0.09425085783004761,
-0.04488682001829147,
0.05603399872779846,
0.047747887670993805,
-0.13368694484233856,
-0.07245312631130219,
0.019505059346556664,
-0.12009071558713913,
0.03846176713705063,
0.05252661928534508,
-0.015540961176156998,
0.02662375196814537,
-0.08554401993751526,
0.02280116081237793,
0.044264715164899826,
-0.006648879498243332,
0.03943117335438728,
-0.18112097680568695,
0.014392392709851265,
-0.027095064520835876,
0.03353786841034889,
0.03353903442621231,
0.0966443419456482,
-0.08328940719366074,
-0.0438249446451664,
-0.0035786288790404797,
-0.028499526903033257,
-0.0502779595553875,
0.05698639154434204,
0.13376230001449585,
-0.014852004125714302,
0.1722135692834854,
-0.12047513574361801,
0.021131617948412895,
-0.1784747838973999,
-0.004848987329751253,
0.018571000546216965,
-0.10114973038434982,
-0.07536053657531738,
-0.008442128077149391,
0.10778048634529114,
-0.09158078581094742,
0.08497249335050583,
-0.06691211462020874,
0.06379418075084686,
0.04611358419060707,
-0.08284373581409454,
-0.1079583466053009,
0.07592964917421341,
0.17267023026943207,
0.03521835803985596,
-0.020180633291602135,
0.04646706208586693,
-0.044714126735925674,
0.07252179831266403,
0.09780894219875336,
0.2021951973438263,
0.1296238899230957,
0.05321117863059044,
0.1124306470155716,
0.07621242105960846,
-0.0603717565536499,
-0.07499953359365463,
0.1353406459093094,
-0.08068234473466873,
0.1734362095594406,
-0.039945073425769806,
0.09418938308954239,
0.05099094286561012,
-0.16679692268371582,
0.025776373222470284,
-0.07169949263334274,
-0.10328143835067749,
-0.12048859149217606,
-0.12315371632575989,
-0.07648078352212906,
-0.09871304780244827,
0.012239856645464897,
-0.09330233186483383,
0.036664336919784546,
0.08364954590797424,
0.04055769741535187,
0.005354376044124365,
0.09529053419828415,
-0.0227549709379673,
0.04118955880403519,
0.09878228604793549,
-0.003332690102979541,
-0.023964975029230118,
-0.017411114647984505,
-0.07951599359512329,
0.05765310302376747,
-0.019416406750679016,
0.04589667543768883,
0.021702786907553673,
0.012362216599285603,
0.05525631085038185,
-0.014344941824674606,
-0.10669007152318954,
0.06015126779675484,
0.018863223493099213,
0.02482248842716217,
0.06980180740356445,
0.04955040290951729,
-0.00951903685927391,
-0.03084832802414894,
0.0972832590341568,
-0.08825614303350449,
-0.026876935735344887,
-0.15740011632442474,
0.2722901403903961,
-0.027328016236424446,
0.02115110121667385,
0.01750376634299755,
-0.05604534596204758,
-0.04683388024568558,
0.17459110915660858,
0.13060101866722107,
-0.0362732969224453,
-0.024414142593741417,
0.05587243661284447,
-0.020506370812654495,
-0.042047169059515,
0.12767185270786285,
0.06632119417190552,
-0.0486963652074337,
-0.046932172030210495,
-0.04311205819249153,
-0.020323703065514565,
-0.028430957347154617,
-0.04513648524880409,
0.05723137408494949,
0.013314111158251762,
-0.013105092570185661,
-0.028977155685424805,
0.05566597357392311,
-0.06764397770166397,
-0.16633319854736328,
0.12595166265964508,
-0.18007174134254456,
-0.15317177772521973,
-0.003534841351211071,
0.024673018604516983,
0.0036053149960935116,
0.06319624185562134,
-0.021265286952257156,
-0.007044874597340822,
0.13987094163894653,
-0.0378311462700367,
-0.0226525217294693,
-0.1120598167181015,
0.06270171701908112,
-0.07085839658975601,
0.16272830963134766,
-0.012056714855134487,
0.056909289211034775,
0.1414610892534256,
0.004334364552050829,
-0.0863841101527214,
0.03156442195177078,
0.07678909599781036,
-0.11501888930797577,
-0.029570186510682106,
0.10532811284065247,
-0.024072468280792236,
0.1391838639974594,
0.05067518725991249,
-0.08849756419658661,
0.024678753688931465,
-0.07147829234600067,
-0.07259093970060349,
-0.033406320959329605,
-0.05282951518893242,
-0.06314859539270401,
0.12302595376968384,
0.24281415343284607,
-0.03316352888941765,
0.015249122865498066,
-0.03165243938565254,
-0.0017558012623339891,
0.060205020010471344,
0.01374245248734951,
-0.05411089211702347,
-0.23399774730205536,
0.073174387216568,
0.042505573481321335,
0.05191603675484657,
-0.17908115684986115,
-0.10037203133106232,
0.03151295334100723,
-0.024684002622961998,
-0.08937804400920868,
0.09127834439277649,
0.025730609893798828,
0.06261789053678513,
-0.05975380912423134,
-0.10069122910499573,
-0.04674253985285759,
0.17725704610347748,
-0.08843405544757843,
-0.07548534870147705
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Rocketknight1/distilbert-base-uncased-finetuned-cola
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results during training and evaluation:
- Train Loss: 0.3182
- Validation Loss: 0.4914
- Train Matthews Correlation: 0.5056
- Epoch: 1
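The Matthews correlation above is the standard binary MCC; as a sketch of how such a figure is computed (the labels and predictions below are illustrative, not the actual evaluation data):

    from sklearn.metrics import matthews_corrcoef

    # Toy labels/predictions: TP=3, TN=3, FP=1, FN=1 gives
    # MCC = (3*3 - 1*1) / sqrt(4*4*4*4) = 0.5.
    labels = [1, 1, 0, 1, 0, 0, 1, 0]
    preds  = [1, 0, 0, 1, 0, 1, 1, 0]
    print(matthews_corrcoef(labels, preds))  # 0.5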
## Model description
More information needed
## Intended uses & limitations
More information needed
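Pending fuller documentation, a minimal inference sketch (the checkpoint appears to be a CoLA-style acceptability classifier, judging by the reported Matthews correlation; the sentence below is illustrative):

    from transformers import pipeline

    # Binary acceptability classification; label names depend on the
    # checkpoint's config and may be the generic LABEL_0 / LABEL_1.
    classifier = pipeline(
        "text-classification",
        model="Rocketknight1/distilbert-base-uncased-finetuned-cola",
    )
    print(classifier("The book was written by the author."))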
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 1602, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
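As a readability aid, the optimizer dictionary above corresponds roughly to the following Keras objects. This is a sketch of an equivalent setup reconstructed from the reported config, not the original training script.

```python
import tensorflow as tf

# Linear (power=1.0) decay from 2e-5 to 0 over 1602 steps, as reported above.
schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=1602,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

# Adam with the reported betas/epsilon and no AMSGrad.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
)
```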
### Training results
| Train Loss | Validation Loss | Train Matthews Correlation | Epoch |
|:----------:|:---------------:|:--------------------------:|:-----:|
| 0.5126 | 0.4638 | 0.4555 | 0 |
| 0.3182 | 0.4914 | 0.5056 | 1 |
### Framework versions
- Transformers 4.22.0.dev0
- TensorFlow 2.9.1
- Datasets 2.4.1.dev0
- Tokenizers 0.11.0
| {"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "Rocketknight1/distilbert-base-uncased-finetuned-cola", "results": []}]} | text-classification | Rocketknight1/distilbert-base-uncased-finetuned-cola | [
"transformers",
"tf",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #tensorboard #distilbert #text-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| Rocketknight1/distilbert-base-uncased-finetuned-cola
====================================================
This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 0.3182
* Validation Loss: 0.4914
* Train Matthews Correlation: 0.5056
* Epoch: 1
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'Adam', 'learning\_rate': {'class\_name': 'PolynomialDecay', 'config': {'initial\_learning\_rate': 2e-05, 'decay\_steps': 1602, 'end\_learning\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.22.0.dev0
* TensorFlow 2.9.1
* Datasets 2.4.1.dev0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 1602, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.22.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.4.1.dev0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #tf #tensorboard #distilbert #text-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 1602, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.22.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.4.1.dev0\n* Tokenizers 0.11.0"
] | [
60,
178,
4,
37
] | [
"passage: TAGS\n#transformers #tf #tensorboard #distilbert #text-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 1602, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.22.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.4.1.dev0\n* Tokenizers 0.11.0"
] | [
…(768-dimensional sentence-embedding values omitted)…
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Rocketknight1/distilbert-base-uncased-finetuned-ner
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.2026
- Validation Loss: 0.0726
- Train Precision: 0.8945
- Train Recall: 0.9220
- Train F1: 0.9081
- Train Accuracy: 0.9793
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
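Since no usage details are documented yet, here is a minimal inference sketch, assuming the checkpoint works with the standard `transformers` token-classification pipeline; the example text is illustrative only.

```python
from transformers import pipeline

# aggregation_strategy="simple" merges word-piece tokens into entity spans.
ner = pipeline(
    "token-classification",
    model="Rocketknight1/distilbert-base-uncased-finetuned-ner",
    aggregation_strategy="simple",
)

# Hypothetical example input; output is a list of entity dicts
# with entity_group, score, word, start and end fields.
print(ner("Hugging Face was founded in New York City."))
```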
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an equivalent setup is sketched after the list):
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 2631, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
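As a readability aid, the optimizer dictionary above corresponds roughly to the following objects, using the TensorFlow `AdamWeightDecay` optimizer shipped with `transformers`. This is a sketch of an equivalent setup reconstructed from the reported config, not the original training script.

```python
import tensorflow as tf
from transformers import AdamWeightDecay

# Linear decay from 2e-5 to 0 over 2631 steps, as reported above.
schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=2631,
    end_learning_rate=0.0,
    power=1.0,
)

# Adam with decoupled weight decay at the reported rate of 0.01.
optimizer = AdamWeightDecay(
    learning_rate=schedule,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
)
```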
### Training results
| Train Loss | Validation Loss | Train Precision | Train Recall | Train F1 | Train Accuracy | Epoch |
|:----------:|:---------------:|:---------------:|:------------:|:--------:|:--------------:|:-----:|
| 0.2026 | 0.0726 | 0.8945 | 0.9220 | 0.9081 | 0.9793 | 0 |
### Framework versions
- Transformers 4.21.0.dev0
- TensorFlow 2.9.1
- Datasets 2.3.3.dev0
- Tokenizers 0.11.0
| {"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "Rocketknight1/distilbert-base-uncased-finetuned-ner", "results": []}]} | token-classification | Rocketknight1/distilbert-base-uncased-finetuned-ner | [
"transformers",
"tf",
"tensorboard",
"distilbert",
"token-classification",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #tensorboard #distilbert #token-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| Rocketknight1/distilbert-base-uncased-finetuned-ner
===================================================
This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 0.2026
* Validation Loss: 0.0726
* Train Precision: 0.8945
* Train Recall: 0.9220
* Train F1: 0.9081
* Train Accuracy: 0.9793
* Epoch: 0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'AdamWeightDecay', 'learning\_rate': {'class\_name': 'PolynomialDecay', 'config': {'initial\_learning\_rate': 2e-05, 'decay\_steps': 2631, 'end\_learning\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight\_decay\_rate': 0.01}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.21.0.dev0
* TensorFlow 2.9.1
* Datasets 2.3.3.dev0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 2631, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.3.3.dev0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #tf #tensorboard #distilbert #token-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 2631, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.3.3.dev0\n* Tokenizers 0.11.0"
] | [
61,
197,
4,
37
] | [
"passage: TAGS\n#transformers #tf #tensorboard #distilbert #token-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 2631, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.3.3.dev0\n* Tokenizers 0.11.0"
] | [
…(768-dimensional sentence-embedding values omitted)…
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Rocketknight1/distilbert-base-uncased-finetuned-squad
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 1.5124
- Train End Logits Accuracy: 0.6041
- Train Start Logits Accuracy: 0.5680
- Validation Loss: 1.1534
- Validation End Logits Accuracy: 0.6849
- Validation Start Logits Accuracy: 0.6443
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
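Since no usage details are documented yet, here is a minimal inference sketch, assuming the checkpoint works with the standard `transformers` question-answering pipeline; the question and context are illustrative only.

```python
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="Rocketknight1/distilbert-base-uncased-finetuned-squad",
)

# Hypothetical example; output contains answer, score, start and end.
print(qa(
    question="What base model was fine-tuned?",
    context="This model is a fine-tuned version of distilbert-base-uncased.",
))
```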
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an equivalent Keras setup is sketched after the list):
- optimizer: {'name': 'Adam', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 11064, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
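As a readability aid, the optimizer dictionary above corresponds roughly to the following Keras objects. This is a sketch of an equivalent setup reconstructed from the reported config, not the original training script.

```python
import tensorflow as tf

# Linear decay from 2e-5 to 0 over 11064 steps, as reported above.
schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=11064,
    end_learning_rate=0.0,
    power=1.0,
)

optimizer = tf.keras.optimizers.Adam(
    learning_rate=schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
)
```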
### Training results
| Train Loss | Train End Logits Accuracy | Train Start Logits Accuracy | Validation Loss | Validation End Logits Accuracy | Validation Start Logits Accuracy | Epoch |
|:----------:|:-------------------------:|:---------------------------:|:---------------:|:------------------------------:|:--------------------------------:|:-----:|
| 1.5124 | 0.6041 | 0.5680 | 1.1534 | 0.6849 | 0.6443 | 0 |
### Framework versions
- Transformers 4.21.0.dev0
- TensorFlow 2.9.1
- Datasets 2.3.3.dev0
- Tokenizers 0.11.0
| {"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "Rocketknight1/distilbert-base-uncased-finetuned-squad", "results": []}]} | question-answering | Rocketknight1/distilbert-base-uncased-finetuned-squad | [
"transformers",
"tf",
"tensorboard",
"distilbert",
"question-answering",
"generated_from_keras_callback",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #tensorboard #distilbert #question-answering #generated_from_keras_callback #license-apache-2.0 #endpoints_compatible #region-us
| Rocketknight1/distilbert-base-uncased-finetuned-squad
=====================================================
This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 1.5124
* Train End Logits Accuracy: 0.6041
* Train Start Logits Accuracy: 0.5680
* Validation Loss: 1.1534
* Validation End Logits Accuracy: 0.6849
* Validation Start Logits Accuracy: 0.6443
* Epoch: 0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'Adam', 'learning\_rate': {'class\_name': 'PolynomialDecay', 'config': {'initial\_learning\_rate': 2e-05, 'decay\_steps': 11064, 'end\_learning\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.21.0.dev0
* TensorFlow 2.9.1
* Datasets 2.3.3.dev0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 11064, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.3.3.dev0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #tf #tensorboard #distilbert #question-answering #generated_from_keras_callback #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 11064, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.3.3.dev0\n* Tokenizers 0.11.0"
] | [
53,
178,
4,
37
] | [
"passage: TAGS\n#transformers #tf #tensorboard #distilbert #question-answering #generated_from_keras_callback #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 11064, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.3.3.dev0\n* Tokenizers 0.11.0"
] | [
…(768-dimensional sentence-embedding values omitted)…
0.06884795427322388,
-0.07857571542263031,
0.034727875143289566,
0.04099201783537865,
-0.12713587284088135,
-0.1326286643743515,
0.030622370541095734,
0.0369698703289032,
-0.08955516666173935,
0.03926759958267212,
0.1475076526403427,
-0.026328399777412415,
0.026139870285987854,
0.01354885846376419,
0.04102932661771774,
-0.06017245724797249,
0.14392586052417755,
0.026419522240757942,
0.07596199214458466,
-0.07203077524900436,
0.1229541003704071,
0.052057698369026184,
-0.13276569545269012,
0.10828889161348343,
0.026078471913933754,
-0.05878610163927078,
-0.008431138470768929,
0.017034253105521202,
0.07684077322483063,
0.04803450033068657,
-0.05975461006164551,
-0.1311621069908142,
-0.1702113151550293,
0.0717104971408844,
0.19993092119693756,
0.035638462752103806,
0.07317575812339783,
-0.039400532841682434,
0.016558881849050522,
-0.09157104790210724,
0.07076798379421234,
0.05733105540275574,
0.03942430764436722,
-0.1269574761390686,
0.17208118736743927,
0.0014689177041873336,
-0.002372168703004718,
-0.006740460637956858,
0.0005977828986942768,
-0.1948746293783188,
0.012530558742582798,
-0.17507341504096985,
0.0186903178691864,
0.006467324215918779,
-0.021925028413534164,
0.03958529978990555,
-0.059377383440732956,
-0.05623286962509155,
0.04272480681538582,
-0.09671227633953094,
-0.04602755606174469,
0.05589843913912773,
0.04274260625243187,
-0.134555846452713,
-0.07862668484449387,
0.03441368043422699,
-0.12079361081123352,
0.05378143861889839,
0.05796736106276512,
-0.010557910427451134,
0.02985498309135437,
-0.0917099341750145,
0.026418136432766914,
0.03749404475092888,
-0.0019212423358112574,
0.040945641696453094,
-0.1745527982711792,
0.0028564147651195526,
-0.03676508739590645,
0.030108438804745674,
0.0351695753633976,
0.08207396417856216,
-0.08412659168243408,
-0.028788479045033455,
-0.016192642971873283,
-0.03208579868078232,
-0.05384085327386856,
0.049867894500494,
0.1350010335445404,
-0.008116237819194794,
0.1664683073759079,
-0.12761910259723663,
0.029776712879538536,
-0.1877145767211914,
-0.000014300927432486787,
0.01784239336848259,
-0.08209660649299622,
-0.07809951156377792,
-0.010114082135260105,
0.11033084243535995,
-0.09811995923519135,
0.08092210441827774,
-0.08781981468200684,
0.08779043704271317,
0.04202260449528694,
-0.10173383355140686,
-0.07353470474481583,
0.07453913241624832,
0.1730833798646927,
0.04366358369588852,
-0.015922842547297478,
0.03520878404378891,
-0.03318624943494797,
0.07318483293056488,
0.09724432975053787,
0.2123374044895172,
0.1365472376346588,
0.058711934834718704,
0.11086641997098923,
0.06501162797212601,
-0.06284938752651215,
-0.0838693156838417,
0.1310729682445526,
-0.0750962346792221,
0.14805810153484344,
-0.04984014853835106,
0.11540906876325607,
0.05924969166517258,
-0.1775241196155548,
0.028279326856136322,
-0.08511444926261902,
-0.10808959603309631,
-0.11774180084466934,
-0.11424952745437622,
-0.07165760546922684,
-0.10045183449983597,
0.004250218626111746,
-0.0970088467001915,
0.024467965587973595,
0.06829502433538437,
0.04394890367984772,
0.0025276076048612595,
0.10421453416347504,
-0.03471320867538452,
0.040193215012550354,
0.10140931606292725,
-0.006405135616660118,
-0.012267292477190495,
-0.01250217854976654,
-0.07993965595960617,
0.057703591883182526,
-0.00454131793230772,
0.054380450397729874,
0.015856927260756493,
-0.016595508903265,
0.04198915511369705,
-0.018977804109454155,
-0.0969691351056099,
0.05089978501200676,
0.030036933720111847,
0.008819729089736938,
0.0720163881778717,
0.052341528236866,
-0.006536267232149839,
-0.022460801526904106,
0.11794297397136688,
-0.09463265538215637,
-0.03124532476067543,
-0.1584480255842209,
0.26021480560302734,
-0.021726595237851143,
0.0356796570122242,
0.011715838685631752,
-0.07376883178949356,
-0.04313033074140549,
0.1721019297838211,
0.1316196173429489,
-0.0584213025867939,
-0.021414246410131454,
0.06777788698673248,
-0.011930339969694614,
-0.05118483304977417,
0.11728128045797348,
0.0729660764336586,
-0.012883850373327732,
-0.047933731228113174,
-0.04884948581457138,
-0.00800660066306591,
-0.03166014701128006,
-0.04805062338709831,
0.0657927542924881,
0.010139994323253632,
-0.006492790766060352,
-0.027880404144525528,
0.06091688573360443,
-0.08063456416130066,
-0.16240781545639038,
0.11152331531047821,
-0.18864600360393524,
-0.1538650393486023,
-0.0015749207232147455,
0.01702793687582016,
-0.0016367011703550816,
0.06971600651741028,
-0.012357933446764946,
-0.006757660303264856,
0.1253652572631836,
-0.035147957503795624,
-0.0012417573016136885,
-0.09886524081230164,
0.07364991307258606,
-0.04583027586340904,
0.1764746606349945,
-0.01886238530278206,
0.060792289674282074,
0.14030446112155914,
0.01908196322619915,
-0.08575104176998138,
0.03605574741959572,
0.08797191828489304,
-0.11511478573083878,
-0.02176014892756939,
0.09706874191761017,
-0.02399490773677826,
0.13187581300735474,
0.07267118990421295,
-0.0899617150425911,
0.025638030841946602,
-0.0587882362306118,
-0.07722944021224976,
-0.050573986023664474,
-0.04981809854507446,
-0.07369058579206467,
0.12048584967851639,
0.24282993376255035,
-0.04069806635379791,
0.016649173572659492,
-0.03252663090825081,
-0.0011303633218631148,
0.05191408097743988,
0.02616526186466217,
-0.06078106164932251,
-0.23210211098194122,
0.08149485290050507,
0.040538106113672256,
0.05126770958304405,
-0.16463886201381683,
-0.10280391573905945,
0.03453347459435463,
-0.02089993841946125,
-0.07997020334005356,
0.09222666174173355,
0.039505768567323685,
0.06349576264619827,
-0.05736057087779045,
-0.12915362417697906,
-0.048091620206832886,
0.1927548199892044,
-0.09484467655420303,
-0.08057915419340134
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Rocketknight1/distilgpt2-finetuned-wikitext2
This model is a fine-tuned version of [distilgpt2](https://huggingface.co/distilgpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 3.8577
- Validation Loss: 3.6752
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
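
Pending proper documentation, a minimal usage sketch is given below. Only the repo id comes from this card; standard `transformers` text-generation usage and the availability of TF weights (implied by the `tf` tag) are assumptions:

    from transformers import AutoTokenizer, TFAutoModelForCausalLM

    # Load the fine-tuned checkpoint from the Hub.
    tokenizer = AutoTokenizer.from_pretrained("Rocketknight1/distilgpt2-finetuned-wikitext2")
    model = TFAutoModelForCausalLM.from_pretrained("Rocketknight1/distilgpt2-finetuned-wikitext2")

    # Generate a short continuation for an arbitrary prompt.
    inputs = tokenizer("The history of natural language processing", return_tensors="tf")
    outputs = model.generate(inputs["input_ids"], max_length=40)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))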
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the sketch after this list):
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
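
For reference, the serialized optimizer config above corresponds to roughly the following construction. This is a sketch rather than the original training script (which is not part of this card); every value is copied from the config dict above:

    from transformers import AdamWeightDecay

    # Rebuild the logged optimizer; all arguments come from the config dict.
    optimizer = AdamWeightDecay(
        learning_rate=2e-05,
        beta_1=0.9,
        beta_2=0.999,
        epsilon=1e-07,
        amsgrad=False,
        weight_decay_rate=0.01,
    )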
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 3.8577 | 3.6752 | 0 |
### Framework versions
- Transformers 4.16.0.dev0
- TensorFlow 2.8.0-rc0
- Datasets 1.17.0
- Tokenizers 0.11.0
| {"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "Rocketknight1/distilgpt2-finetuned-wikitext2", "results": []}]} | text-generation | Rocketknight1/distilgpt2-finetuned-wikitext2 | [
"transformers",
"tf",
"tensorboard",
"gpt2",
"text-generation",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #tensorboard #gpt2 #text-generation #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| Rocketknight1/distilgpt2-finetuned-wikitext2
============================================
This model is a fine-tuned version of distilgpt2 on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 3.8577
* Validation Loss: 3.6752
* Epoch: 0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'AdamWeightDecay', 'learning\_rate': 2e-05, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\_decay\_rate': 0.01}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.16.0.dev0
* TensorFlow 2.8.0-rc0
* Datasets 1.17.0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': 2e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* TensorFlow 2.8.0-rc0\n* Datasets 1.17.0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #tf #tensorboard #gpt2 #text-generation #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': 2e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* TensorFlow 2.8.0-rc0\n* Datasets 1.17.0\n* Tokenizers 0.11.0"
] | [
69,
118,
4,
40
] | [
"passage: TAGS\n#transformers #tf #tensorboard #gpt2 #text-generation #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': 2e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* TensorFlow 2.8.0-rc0\n* Datasets 1.17.0\n* Tokenizers 0.11.0"
] | [
-0.0687645673751831,
0.11079514771699905,
-0.003718047868460417,
0.0726165920495987,
0.09706392884254456,
0.012029004283249378,
0.1452808976173401,
0.1685742884874344,
-0.12008501589298248,
0.10107062011957169,
0.14764603972434998,
0.16331999003887177,
0.062154147773981094,
0.1467285454273224,
-0.11465314775705338,
-0.1678931564092636,
0.057950831949710846,
0.019895030185580254,
-0.020540079101920128,
0.09406101703643799,
0.0762241780757904,
-0.09220580756664276,
0.10852422565221786,
0.004908388946205378,
-0.18713383376598358,
0.011645002290606499,
0.07343491911888123,
-0.08345136046409607,
0.08516424894332886,
0.05971401557326317,
0.07053390145301819,
0.04116137698292732,
0.018588591367006302,
-0.15697327256202698,
0.0111598065122962,
0.11071392893791199,
-0.019236240535974503,
0.10990551859140396,
0.022649405524134636,
-0.04855642467737198,
0.1184929832816124,
-0.10452207922935486,
0.03226621448993683,
0.040007542818784714,
-0.12021704763174057,
-0.2503228485584259,
-0.1435442864894867,
0.050077855587005615,
0.04682891070842743,
0.06415782123804092,
0.00425871554762125,
0.2146584689617157,
0.020406045019626617,
0.10427837818861008,
0.22155313193798065,
-0.35604918003082275,
-0.03851892426609993,
0.055101823061704636,
0.03334570676088333,
0.053645696491003036,
-0.05686459690332413,
0.04720775783061981,
0.05341097712516785,
0.04253628849983215,
0.08834388852119446,
-0.03143803030252457,
-0.07845698297023773,
-0.030918970704078674,
-0.09706556797027588,
-0.04369202256202698,
0.15220794081687927,
0.021872492507100105,
-0.06487398594617844,
-0.04395332187414169,
-0.07701703161001205,
-0.18223783373832703,
-0.02700546197593212,
-0.03139704093337059,
0.03121039643883705,
0.0006092808325774968,
-0.07737861573696136,
-0.07056935131549835,
-0.06550564616918564,
-0.046274084597826004,
-0.09440246969461441,
0.1303878128528595,
0.023411143571138382,
0.06761683523654938,
-0.040076497942209244,
0.06052191182971001,
-0.09531204402446747,
-0.1361517459154129,
-0.04719613492488861,
0.0012172369752079248,
-0.02145691029727459,
-0.04204116389155388,
-0.05565914139151573,
-0.13781864941120148,
0.067225381731987,
0.1452943980693817,
-0.12161470949649811,
0.09522446990013123,
-0.12321873009204865,
0.036584798246622086,
-0.09224198758602142,
0.1271960735321045,
-0.041052766144275665,
0.03391236066818237,
0.05442490428686142,
0.010164654813706875,
0.09167369455099106,
-0.04100295528769493,
-0.09772324562072754,
0.011336199007928371,
0.08043564856052399,
0.025696197524666786,
-0.02108924835920334,
0.08559402078390121,
-0.0434027723968029,
-0.012967842631042004,
0.022922465577721596,
-0.0953151062130928,
0.03441537544131279,
-0.0019716667011380196,
-0.07351542264223099,
0.01210466492921114,
0.06792572885751724,
-0.0033307650592178106,
-0.06542402505874634,
0.01818329654633999,
-0.07039284706115723,
-0.009557320736348629,
-0.07764681428670883,
-0.13406087458133698,
0.05426953360438347,
-0.14568857848644257,
-0.033821407705545425,
-0.06625429540872574,
-0.17551110684871674,
-0.00609903410077095,
0.05870421230792999,
-0.0691855400800705,
0.015983929857611656,
-0.046598099172115326,
-0.14823628962039948,
0.059014398604631424,
-0.01114210020750761,
0.09243664145469666,
-0.060340262949466705,
0.05324287712574005,
0.0015029973583295941,
0.08360638469457626,
-0.10389910638332367,
0.03269357234239578,
-0.05797673761844635,
0.033252302557229996,
-0.21470728516578674,
0.10038360953330994,
-0.07121936976909637,
0.02166585624217987,
-0.15220999717712402,
-0.055902332067489624,
0.00012827303726226091,
-0.02370554208755493,
0.11524032056331635,
0.11069224029779434,
-0.19549836218357086,
-0.04759607091546059,
0.18682684004306793,
-0.08986714482307434,
-0.11678355932235718,
0.11049909889698029,
-0.04953785985708237,
-0.0065941717475652695,
0.07844699174165726,
0.1624651402235031,
-0.01473636832088232,
-0.09872809797525406,
-0.041689734905958176,
-0.020928263664245605,
-0.012234542518854141,
-0.03153738006949425,
0.019156599417328835,
-0.024996811524033546,
0.03029823675751686,
0.005283278878778219,
0.004577749874442816,
-0.0001549839653307572,
-0.06688857078552246,
-0.05118095874786377,
-0.08123590052127838,
-0.0448780432343483,
0.02727130986750126,
0.01811191625893116,
0.06391651183366776,
-0.11141139268875122,
-0.13262279331684113,
0.05385764315724373,
0.03248071298003197,
-0.0698828175663948,
0.057001564651727676,
-0.11086653918027878,
0.08393271267414093,
-0.05794557183980942,
0.019962579011917114,
-0.18150515854358673,
-0.06321711093187332,
0.025256820023059845,
0.03590798377990723,
0.026025943458080292,
-0.04908326640725136,
0.07691028714179993,
0.014080967754125595,
-0.05317746475338936,
0.024398362264037132,
0.0165752824395895,
0.0072399163618683815,
-0.09783148020505905,
-0.25218477845191956,
-0.00441178772598505,
-0.045917149633169174,
0.04984408989548683,
-0.196633979678154,
0.029538527131080627,
0.11267916113138199,
0.15680909156799316,
0.055487260222435,
-0.02977539598941803,
-0.005117111373692751,
0.024340514093637466,
-0.054767683148384094,
-0.08316896110773087,
0.024891452863812447,
0.031274791806936264,
-0.12985281646251678,
0.056300338357686996,
-0.1728580892086029,
0.1143006980419159,
0.16291292011737823,
-0.025537695735692978,
-0.09361588209867477,
0.03784232214093208,
-0.025305412709712982,
-0.012110532261431217,
-0.019846653565764427,
0.020219020545482635,
0.12889426946640015,
0.028040746226906776,
0.14364029467105865,
-0.08216611295938492,
-0.051555335521698,
0.061508357524871826,
-0.03468817472457886,
-0.024725548923015594,
0.07661239802837372,
-0.020416149869561195,
-0.14822684228420258,
0.12489957362413406,
0.09979427605867386,
-0.07547207921743393,
0.12191415578126907,
-0.051427312195301056,
-0.06797405332326889,
-0.055483732372522354,
0.03264842927455902,
0.053118012845516205,
0.09330230206251144,
-0.10165473073720932,
-0.0019723623991012573,
0.03520454838871956,
0.020698556676506996,
-0.002741388510912657,
-0.1755814552307129,
0.023814132437109947,
-0.014055929146707058,
-0.07990938425064087,
0.004419680684804916,
0.015006829984486103,
0.017995662987232208,
0.1295996606349945,
0.028499815613031387,
-0.020601900294423103,
0.07667999714612961,
0.008083386346697807,
-0.08419116586446762,
0.20337854325771332,
-0.12256384640932083,
-0.11837594956159592,
-0.10494885593652725,
-0.08913836628198624,
-0.08774352073669434,
0.00017683261830825359,
0.026018261909484863,
-0.09657744318246841,
-0.06660868972539902,
-0.09930817782878876,
-0.02695319801568985,
-0.005772498436272144,
0.049892351031303406,
0.04661767929792404,
-0.014317939057946205,
0.09806468337774277,
-0.10523386299610138,
-0.04334893450140953,
-0.03896665200591087,
-0.02055082470178604,
0.027321025729179382,
0.012827972881495953,
0.03254913166165352,
0.1237667128443718,
-0.050125692039728165,
0.05707374960184097,
-0.04450967162847519,
0.1820300668478012,
-0.054571058601140976,
0.011842429637908936,
0.11569627374410629,
-0.023237397894263268,
0.0685393288731575,
0.11147156357765198,
0.047138798981904984,
-0.11013849079608917,
0.007067848462611437,
0.06292668730020523,
-0.06256888061761856,
-0.2799864411354065,
-0.017035117372870445,
-0.045524463057518005,
-0.043821629136800766,
0.048489272594451904,
0.05211431905627251,
0.1322920024394989,
0.04433974623680115,
0.004909656010568142,
0.0994672179222107,
0.00007735876715742052,
0.09403388947248459,
0.1701151579618454,
0.06364955753087997,
0.1233639195561409,
-0.07973933964967728,
0.018746457993984222,
0.06211724504828453,
0.0189210195094347,
0.20967388153076172,
0.006151811219751835,
0.1484013944864273,
0.07617542147636414,
0.09137114882469177,
0.006657721009105444,
0.012837535701692104,
-0.0010083775268867612,
-0.020681479945778847,
0.0086502805352211,
-0.07495364546775818,
-0.012418631464242935,
0.03898727521300316,
-0.08199387043714523,
0.03691673278808594,
-0.06025461107492447,
0.05434903874993324,
0.07632522284984589,
0.28338417410850525,
0.07651142030954361,
-0.37898755073547363,
-0.1081031933426857,
0.012507176026701927,
0.0013672683853656054,
-0.046047110110521317,
-0.023159712553024292,
0.08528358489274979,
-0.043115731328725815,
0.1291326880455017,
-0.08607830107212067,
0.07552659511566162,
-0.04271792992949486,
0.019854212179780006,
0.04543735086917877,
0.1289135366678238,
-0.0167822428047657,
0.008956622332334518,
-0.29970526695251465,
0.2758231461048126,
0.06392062455415726,
0.1180124506354332,
-0.0845726951956749,
0.032915789633989334,
0.02874227799475193,
0.036127232015132904,
0.1134924441576004,
-0.016732502728700638,
-0.15504322946071625,
-0.08884620666503906,
-0.08671146631240845,
0.0014033580664545298,
0.11793023347854614,
0.07144859433174133,
0.09756472706794739,
-0.013601729646325111,
-0.003208072856068611,
0.06486064195632935,
-0.05488666146993637,
-0.10810427367687225,
-0.07359147071838379,
0.03235776349902153,
0.08408515155315399,
-0.0849958136677742,
-0.04792751744389534,
-0.0878266841173172,
-0.06852631270885468,
0.23626747727394104,
-0.06368982046842575,
-0.06525702029466629,
-0.1336362212896347,
0.10936877876520157,
0.0773921012878418,
-0.057848215103149414,
0.03803550824522972,
-0.026472153142094612,
0.0962676852941513,
0.029627487063407898,
-0.13613931834697723,
0.13972754776477814,
-0.0302896685898304,
-0.1797744631767273,
-0.04228880628943443,
0.11196895688772202,
0.031013749539852142,
0.03899051994085312,
0.007245123852044344,
0.07184768468141556,
0.038550201803445816,
-0.09142046421766281,
0.09975767880678177,
0.02512485720217228,
0.07357822358608246,
-0.0026010985020548105,
0.009802032262086868,
-0.04352489858865738,
-0.028031988069415092,
0.01803375408053398,
0.15199582278728485,
0.22746014595031738,
-0.10177808254957199,
0.07567128539085388,
0.004313461482524872,
-0.08821021020412445,
-0.20554770529270172,
0.10315268486738205,
0.04976828396320343,
-0.004834349732846022,
-0.05597158893942833,
-0.15797650814056396,
0.06326402723789215,
0.08954494446516037,
-0.02066281996667385,
0.10637643188238144,
-0.2893471121788025,
-0.1395660787820816,
0.07786517590284348,
0.10729309171438217,
0.12919506430625916,
-0.15681670606136322,
-0.0730988085269928,
-0.048645008355379105,
-0.08237447589635849,
0.1730605959892273,
-0.1908079981803894,
0.1061205342411995,
-0.0034837748389691114,
0.04871304705739021,
0.005857622250914574,
-0.055412858724594116,
0.09632506221532822,
-0.00867945235222578,
0.08422087132930756,
-0.06173583120107651,
0.016230547800660133,
0.16934984922409058,
-0.08446752279996872,
0.062384821474552155,
-0.07114417105913162,
0.03588919714093208,
-0.061958760023117065,
0.002039925428107381,
-0.06271757185459137,
0.05981671065092087,
-0.029948709532618523,
-0.0225293580442667,
-0.046909183263778687,
0.005557205993682146,
0.06551393121480942,
-0.038180161267519,
0.20116859674453735,
-0.0013223221758380532,
0.16506776213645935,
0.22324305772781372,
0.1033438891172409,
-0.07784286141395569,
0.035538822412490845,
0.07414361834526062,
-0.055256348103284836,
0.05429380014538765,
-0.22734977304935455,
0.05063425377011299,
0.11339426785707474,
0.0029289559461176395,
0.11247175186872482,
0.06428481638431549,
-0.08357750624418259,
0.047938086092472076,
0.05763797089457512,
-0.1539752185344696,
-0.13267427682876587,
0.04221365228295326,
-0.037675585597753525,
-0.07789971679449081,
0.10645589232444763,
0.1646192967891693,
-0.04385308176279068,
0.014722897671163082,
0.021837076172232628,
0.025005033239722252,
-0.0597090870141983,
0.13915254175662994,
0.014543863013386726,
0.03760331869125366,
-0.10977743566036224,
0.13686209917068481,
0.026637068018317223,
-0.10628830641508102,
0.11927228420972824,
0.07383587211370468,
-0.09751834720373154,
-0.01804792508482933,
0.023803045973181725,
0.13248266279697418,
-0.04289478436112404,
-0.05413354933261871,
-0.13466355204582214,
-0.1314699947834015,
0.08659448474645615,
0.2791101932525635,
0.03905455023050308,
0.05006696283817291,
-0.03661229833960533,
-0.020920438691973686,
-0.1147264763712883,
0.06639021635055542,
0.02933412604033947,
0.0660923421382904,
-0.13056102395057678,
0.12318719178438187,
-0.005733813159167767,
0.020674992352724075,
-0.034588802605867386,
0.026558296754956245,
-0.13787350058555603,
-0.00441382173448801,
-0.16649173200130463,
0.023804545402526855,
-0.04216673970222473,
-0.02787790447473526,
-0.008387120440602303,
-0.052478235214948654,
-0.08880431950092316,
0.047188106924295425,
-0.0978521779179573,
-0.03862221911549568,
0.02947467379271984,
0.011656290851533413,
-0.14702193439006805,
-0.03551657125353813,
-0.023913603276014328,
-0.07678879052400589,
0.07900558412075043,
0.0710233524441719,
-0.01797451451420784,
0.02764088846743107,
-0.10208680480718613,
0.0019860335160046816,
0.08148137480020523,
-0.009051370434463024,
0.07984734326601028,
-0.0958961769938469,
0.0036344188265502453,
0.013826913200318813,
0.027260074391961098,
0.038038235157728195,
0.14088734984397888,
-0.08014620095491409,
-0.021948695182800293,
-0.04917202889919281,
-0.001649401499889791,
-0.05717676132917404,
0.09604230523109436,
0.1532284915447235,
0.014434009790420532,
0.14350679516792297,
-0.10684190690517426,
-0.012097722850739956,
-0.17476658523082733,
0.009379303082823753,
0.008024168200790882,
-0.1323067843914032,
-0.08230450004339218,
0.009160537272691727,
0.09356565773487091,
-0.09256519377231598,
0.13806849718093872,
-0.016301991418004036,
0.02801646664738655,
0.06604336202144623,
-0.043210387229919434,
-0.08899130672216415,
0.03741209954023361,
0.18359732627868652,
0.026582861319184303,
-0.04258214682340622,
0.046079304069280624,
0.0041171289049088955,
0.09150270372629166,
0.06814953684806824,
0.21992921829223633,
0.11585432291030884,
0.03563690558075905,
0.14026588201522827,
0.07729412615299225,
-0.04898526519536972,
-0.13409967720508575,
0.1517835110425949,
-0.09073701500892639,
0.17358073592185974,
-0.024558590725064278,
0.07660487294197083,
0.12528330087661743,
-0.14983239769935608,
0.0200883187353611,
-0.03877827897667885,
-0.08591361343860626,
-0.14377644658088684,
-0.1245039775967598,
-0.10855232179164886,
-0.14529718458652496,
0.005511907860636711,
-0.12182862311601639,
0.07024882733821869,
0.046095043420791626,
0.041175976395606995,
0.0010959056671708822,
0.1190832182765007,
-0.017789585515856743,
0.005840339232236147,
0.08703180402517319,
-0.002395358169451356,
-0.03517936170101166,
-0.03255584090948105,
-0.06924205273389816,
0.06406071037054062,
0.003786475397646427,
0.048750314861536026,
0.027861347422003746,
0.06214486435055733,
0.06630900502204895,
-0.053250763565301895,
-0.10811139643192291,
0.0363333597779274,
0.0647146999835968,
0.04160905256867409,
0.04045502096414566,
0.0480496883392334,
-0.02566484361886978,
-0.020414244383573532,
0.15351058542728424,
-0.09683269262313843,
-0.015220260247588158,
-0.1604272574186325,
0.23702308535575867,
0.00847043376415968,
-0.007184334099292755,
0.01722380705177784,
-0.08089788258075714,
-0.0259057879447937,
0.16251760721206665,
0.18011391162872314,
-0.01041252538561821,
-0.018853140994906425,
0.010799191892147064,
-0.012311476282775402,
-0.038415856659412384,
0.10584685206413269,
0.08918383717536926,
-0.011275715194642544,
-0.052634406834840775,
-0.0750095397233963,
-0.020613091066479683,
-0.013652200810611248,
-0.04052950069308281,
0.0810738131403923,
0.009265699423849583,
-0.018363414332270622,
-0.009434659034013748,
0.05617431923747063,
-0.07256300002336502,
-0.10823485255241394,
0.06578895449638367,
-0.2173217087984085,
-0.14496482908725739,
0.013252928853034973,
-0.013117523863911629,
-0.0157224889844656,
0.043926533311605453,
-0.011037658900022507,
-0.005289626307785511,
0.10019440948963165,
-0.032719794660806656,
-0.08175808936357498,
-0.07574587315320969,
0.05121583119034767,
-0.13660858571529388,
0.18619295954704285,
-0.027074605226516724,
0.03614639490842819,
0.15155382454395294,
0.02977726422250271,
-0.11794856190681458,
0.03794239088892937,
0.030818235129117966,
-0.06693340092897415,
-0.006057815160602331,
0.10822512954473495,
-0.02888805978000164,
0.10629099607467651,
0.06496061384677887,
-0.08911947906017303,
0.001174233271740377,
-0.09776710718870163,
-0.06741048395633698,
-0.05218357965350151,
-0.059611719101667404,
-0.0813525915145874,
0.11444127559661865,
0.19680607318878174,
-0.030384669080376625,
0.026165567338466644,
-0.05216258391737938,
-0.0007637570379301906,
0.07035256177186966,
-0.018711388111114502,
-0.043335817754268646,
-0.242079958319664,
0.06442870199680328,
0.12505914270877838,
0.021830499172210693,
-0.2422434240579605,
-0.07067961245775223,
0.000925306580029428,
-0.01931043155491352,
-0.10439646989107132,
0.07056055963039398,
0.10342506319284439,
0.06232834607362747,
-0.06412545591592789,
-0.08172289282083511,
-0.01997201517224312,
0.14802885055541992,
-0.10899393260478973,
-0.06515857577323914
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# distilroberta-base-finetuned-wikitext2
This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
## Model description
More information needed
## Intended uses & limitations
More information needed
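
In lieu of recorded usage details, a minimal fill-mask sketch follows. Only the repo id comes from this card; TF weights on the Hub and the standard RoBERTa `<mask>` token are assumptions:

    import tensorflow as tf
    from transformers import AutoTokenizer, TFAutoModelForMaskedLM

    tokenizer = AutoTokenizer.from_pretrained("Rocketknight1/distilroberta-base-finetuned-wikitext2")
    model = TFAutoModelForMaskedLM.from_pretrained("Rocketknight1/distilroberta-base-finetuned-wikitext2")

    # Predict the token behind the <mask> placeholder.
    inputs = tokenizer("The capital of France is <mask>.", return_tensors="tf")
    logits = model(**inputs).logits
    mask_index = int(tf.where(inputs["input_ids"] == tokenizer.mask_token_id)[0, 1])
    predicted_id = int(tf.argmax(logits[0, mask_index]))
    print(tokenizer.decode([predicted_id]))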
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the sketch after this list):
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
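
As a complementary sketch, the `transformers` convenience helper below builds an `AdamWeightDecay` with matching betas, epsilon, and weight-decay rate. Two caveats: `create_optimizer` attaches a linear-decay schedule, whereas the config above logs a constant 2e-05 rate, and `num_train_steps` is a placeholder because the true step count is not recorded in this card:

    from transformers import create_optimizer

    # Placeholder step count -- the real number of training steps is unknown.
    optimizer, lr_schedule = create_optimizer(
        init_lr=2e-05,
        num_train_steps=1000,  # hypothetical, not from the card
        num_warmup_steps=0,
        adam_beta1=0.9,
        adam_beta2=0.999,
        adam_epsilon=1e-07,
        weight_decay_rate=0.01,
    )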
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- TensorFlow 2.8.0-rc0
- Datasets 1.17.0
- Tokenizers 0.11.0
| {"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "distilroberta-base-finetuned-wikitext2", "results": []}]} | fill-mask | Rocketknight1/distilroberta-base-finetuned-wikitext2 | [
"transformers",
"tf",
"roberta",
"fill-mask",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #roberta #fill-mask #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# distilroberta-base-finetuned-wikitext2
This model is a fine-tuned version of distilroberta-base on an unknown dataset.
It achieves the following results on the evaluation set:
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- TensorFlow 2.8.0-rc0
- Datasets 1.17.0
- Tokenizers 0.11.0
| [
"# distilroberta-base-finetuned-wikitext2\n\nThis model is a fine-tuned version of distilroberta-base on an unknown dataset.\nIt achieves the following results on the evaluation set:",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}\n- training_precision: float32",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- TensorFlow 2.8.0-rc0\n- Datasets 1.17.0\n- Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #tf #roberta #fill-mask #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# distilroberta-base-finetuned-wikitext2\n\nThis model is a fine-tuned version of distilroberta-base on an unknown dataset.\nIt achieves the following results on the evaluation set:",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}\n- training_precision: float32",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- TensorFlow 2.8.0-rc0\n- Datasets 1.17.0\n- Tokenizers 0.11.0"
] | [
55,
50,
6,
12,
8,
3,
112,
4,
40
] | [
"passage: TAGS\n#transformers #tf #roberta #fill-mask #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n# distilroberta-base-finetuned-wikitext2\n\nThis model is a fine-tuned version of distilroberta-base on an unknown dataset.\nIt achieves the following results on the evaluation set:## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}\n- training_precision: float32### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- TensorFlow 2.8.0-rc0\n- Datasets 1.17.0\n- Tokenizers 0.11.0"
] | [
-0.06249793618917465,
0.1035178005695343,
-0.004307166673243046,
0.06621947139501572,
0.12157366424798965,
0.043647583574056625,
0.11055131256580353,
0.13672032952308655,
-0.10202264785766602,
0.08954265713691711,
0.09547101706266403,
0.09428723901510239,
0.06448762863874435,
0.13723157346248627,
-0.08196716010570526,
-0.1171647161245346,
0.04318345710635185,
-0.028155269101262093,
-0.037537652999162674,
0.07418899983167648,
0.10414264351129532,
-0.06931180506944656,
0.08464264869689941,
-0.0020159746054559946,
-0.16355441510677338,
0.045585937798023224,
0.05703836679458618,
-0.06780832260847092,
0.07717085629701614,
0.050362180918455124,
0.09600744396448135,
0.006210848223417997,
0.054800454527139664,
-0.179362952709198,
0.008651504293084145,
0.11837644129991531,
0.011654922738671303,
0.08748691529035568,
0.010002786293625832,
-0.024290520697832108,
0.10446678102016449,
-0.15001152455806732,
0.08162897080183029,
0.030319582670927048,
-0.10593349486589432,
-0.18403205275535583,
-0.1322363317012787,
0.06898798793554306,
0.06133405491709709,
0.10023505985736847,
0.005932143423706293,
0.18202617764472961,
-0.049051038920879364,
0.07429632544517517,
0.13909728825092316,
-0.26007285714149475,
-0.041864439845085144,
0.06649905443191528,
0.01353814359754324,
0.0271834097802639,
-0.07051171362400055,
0.030143482610583305,
0.05036173760890961,
0.03219294920563698,
0.0539240837097168,
-0.034209564328193665,
-0.08645999431610107,
-0.04439637064933777,
-0.06964067369699478,
-0.034449126571416855,
0.150611013174057,
0.03125200420618057,
-0.08348480612039566,
-0.0818873941898346,
-0.039217811077833176,
-0.08074290305376053,
0.01109505258500576,
-0.04584583640098572,
0.017480574548244476,
-0.006283151917159557,
-0.0439753383398056,
-0.07284032553434372,
-0.07167989760637283,
-0.02522449567914009,
-0.0452527217566967,
0.1050078272819519,
0.012224249541759491,
0.05425521731376648,
-0.025099221616983414,
0.06860765814781189,
-0.07498254626989365,
-0.11272159218788147,
-0.022098738700151443,
-0.01195061206817627,
-0.10155243426561356,
-0.0616009458899498,
-0.05860090255737305,
-0.07463598251342773,
0.04351276159286499,
0.16065317392349243,
-0.08605192601680756,
0.08818286657333374,
-0.07174582034349442,
0.007435504347085953,
-0.04398307576775551,
0.09856796264648438,
-0.04819079861044884,
-0.01897338032722473,
0.028348369523882866,
0.06653685122728348,
0.032075654715299606,
-0.019885694608092308,
-0.056241780519485474,
-0.021077824756503105,
0.04385097324848175,
0.03895414248108864,
-0.04161744937300682,
0.05190836638212204,
-0.0677805244922638,
-0.006931312847882509,
0.03209618106484413,
-0.12809807062149048,
0.03123645856976509,
-0.00799598265439272,
-0.08918308466672897,
0.015264365822076797,
0.10356375575065613,
0.013434980995953083,
-0.036260541528463364,
0.06005275994539261,
-0.06506752967834473,
-0.020370036363601685,
-0.07714418321847916,
-0.09768674522638321,
0.022115398198366165,
-0.13509975373744965,
-0.020890431478619576,
-0.05453473702073097,
-0.20675207674503326,
-0.05347777530550957,
0.06073885038495064,
-0.0654183030128479,
0.005123994313180447,
-0.0260807853192091,
-0.11613496392965317,
0.04348824918270111,
0.011841874569654465,
0.13927988708019257,
-0.03417664393782616,
0.04843446612358093,
-0.008294645696878433,
0.05823012441396713,
-0.049054019153118134,
0.02690739743411541,
-0.05064908787608147,
0.028135428205132484,
-0.14365443587303162,
0.11140726506710052,
-0.07536885887384415,
0.017077598720788956,
-0.1568327248096466,
-0.04723590239882469,
-0.009118014946579933,
-0.008437525480985641,
0.10230585932731628,
0.10763180255889893,
-0.20349301397800446,
-0.01528154592961073,
0.11722791194915771,
-0.08256035298109055,
-0.06450501084327698,
0.09086371213197708,
-0.06930148601531982,
0.0730714201927185,
0.06456239521503448,
0.10579229146242142,
0.04246365278959274,
-0.1393621861934662,
-0.015574277378618717,
-0.028983306139707565,
-0.004490587394684553,
0.04958156868815422,
0.02165869250893593,
-0.041973602026700974,
-0.0013628923334181309,
-0.0024856911040842533,
-0.010592843405902386,
-0.013468952849507332,
-0.055003926157951355,
-0.06564216315746307,
-0.04989238828420639,
-0.059534721076488495,
0.03195606917142868,
0.006098983809351921,
0.024878544732928276,
-0.08196592330932617,
-0.13722693920135498,
0.051719892770051956,
0.0903497114777565,
-0.01993284933269024,
0.022200381383299828,
-0.09985407441854477,
-0.012194514274597168,
-0.017786653712391853,
-0.013797068037092686,
-0.19969289004802704,
-0.0741557627916336,
0.018507780507206917,
-0.008610660210251808,
0.046501219272613525,
-0.006496588699519634,
0.061321962624788284,
0.013517173007130623,
-0.046849556267261505,
0.016025060787796974,
-0.04449500888586044,
0.008021303452551365,
-0.0816427618265152,
-0.22202511131763458,
-0.005188975017517805,
-0.04110294207930565,
0.14456264674663544,
-0.2490883469581604,
0.006600913126021624,
0.050765492022037506,
0.1578969955444336,
0.05288996919989586,
-0.046268001198768616,
0.004032619297504425,
0.02025047317147255,
-0.00772804394364357,
-0.08833266794681549,
0.008463363163173199,
0.017178529873490334,
-0.11926120519638062,
-0.003861660836264491,
-0.16658219695091248,
-0.007072972599416971,
0.10442046821117401,
0.02979615144431591,
-0.15028679370880127,
0.036947574466466904,
-0.036536600440740585,
-0.031072793528437614,
-0.05932074785232544,
-0.008611265569925308,
0.19325044751167297,
0.023787260055541992,
0.12411151081323624,
-0.026083217933773994,
-0.04329511895775795,
0.02573666162788868,
-0.019084425643086433,
-0.03889816254377365,
0.05902645364403725,
-0.013743394054472446,
-0.1673160344362259,
0.06673362851142883,
0.08139614760875702,
-0.0722285807132721,
0.12918956577777863,
-0.025524470955133438,
-0.07376844435930252,
-0.06062634289264679,
0.02743012085556984,
0.04152557998895645,
0.10175637900829315,
-0.057635348290205,
0.015955328941345215,
0.03840946406126022,
0.0015193563885986805,
0.008233550004661083,
-0.15875093638896942,
0.0177236907184124,
0.009733907878398895,
-0.044884324073791504,
0.008298075757920742,
0.005141651723533869,
0.03057042695581913,
0.10846519470214844,
0.033552516251802444,
0.02420334331691265,
0.0592961311340332,
-0.020465625450015068,
-0.08072385936975479,
0.17712609469890594,
-0.12630830705165863,
-0.11443068832159042,
-0.12242191284894943,
0.024924736469984055,
-0.04210253432393074,
-0.019597036764025688,
-0.011858687736093998,
-0.10093686729669571,
-0.0847921073436737,
-0.08637011051177979,
-0.04034155607223511,
-0.016809480264782906,
0.0006910606753081083,
0.08474712073802948,
0.0006177069735713303,
0.12664096057415009,
-0.11573464423418045,
-0.025432907044887543,
-0.014085567556321621,
-0.0773119106888771,
0.011173289269208908,
0.023023098707199097,
0.04176407679915428,
0.08452517539262772,
-0.033341262489557266,
0.029776163399219513,
-0.03628981485962868,
0.19910138845443726,
-0.06253702193498611,
0.018151335418224335,
0.11817587912082672,
-0.013402989134192467,
0.046681392937898636,
0.12346875667572021,
0.02684587612748146,
-0.10928982496261597,
0.05156964063644409,
0.10541953146457672,
-0.035574764013290405,
-0.27072346210479736,
-0.02713976800441742,
-0.029645999893546104,
-0.12229117751121521,
0.05810081213712692,
0.038451649248600006,
0.1260840743780136,
0.04238506779074669,
-0.007236109115183353,
0.10025504231452942,
0.005148271564394236,
0.07019772380590439,
0.11702940613031387,
0.06595730781555176,
0.09231386333703995,
-0.06659548729658127,
0.023700499907135963,
0.07294363528490067,
-0.009048428386449814,
0.23666875064373016,
-0.02777942083775997,
0.07932648807764053,
0.07488496601581573,
0.06295188516378403,
-0.0357837975025177,
0.007825838401913643,
0.011409370228648186,
-0.0201145987957716,
0.010948412120342255,
-0.08069343864917755,
-0.042665526270866394,
0.04448293522000313,
-0.039039116352796555,
0.05971684679389,
-0.06332316994667053,
0.07306647300720215,
0.03923175856471062,
0.21256162226200104,
0.06262664496898651,
-0.3014989197254181,
-0.09173135459423065,
0.006340043153613806,
0.008498097769916058,
-0.0597909577190876,
-0.046695079654455185,
0.10090033710002899,
-0.10169808566570282,
0.10287311673164368,
-0.05292868614196777,
0.06580699235200882,
-0.02943861298263073,
0.005365354008972645,
0.061310797929763794,
0.13437853753566742,
-0.01247257087379694,
0.020545054227113724,
-0.276956707239151,
0.24320632219314575,
0.05663178488612175,
0.15868884325027466,
-0.08662965893745422,
0.05021741986274719,
0.03307271748781204,
0.0086878826841712,
0.11920535564422607,
-0.010262549854815006,
-0.08850757032632828,
-0.14706052839756012,
-0.061617590487003326,
-0.008320333436131477,
0.12996423244476318,
0.02985387295484543,
0.08964703977108002,
-0.025473574176430702,
0.010575732216238976,
0.05532858148217201,
-0.008379865437746048,
-0.21415330469608307,
-0.10674934089183807,
0.045636530965566635,
0.08001778274774551,
-0.1051088199019432,
-0.06354394555091858,
-0.07501374930143356,
-0.021960049867630005,
0.195095494389534,
-0.028491748496890068,
-0.044988151639699936,
-0.1537930965423584,
0.10543491691350937,
0.12638214230537415,
-0.05058463290333748,
0.010501531884074211,
-0.00739778159186244,
0.08936473727226257,
0.025472931563854218,
-0.10159026831388474,
0.0821010023355484,
-0.04572352021932602,
-0.16742825508117676,
-0.064687080681324,
0.10533548146486282,
0.06119535490870476,
0.0476217046380043,
0.012507360428571701,
0.05938158184289932,
0.061296913772821426,
-0.08294351398944855,
0.07874554395675659,
0.06259669363498688,
0.07568719983100891,
0.0905667245388031,
-0.03234204277396202,
-0.0629892572760582,
-0.032901179045438766,
-0.008296382613480091,
0.09424684196710587,
0.21206489205360413,
-0.07634022831916809,
0.08552148938179016,
0.0072262221947312355,
-0.09671681374311447,
-0.19679665565490723,
0.1299458146095276,
0.1150038093328476,
0.02375861257314682,
-0.0017991120694205165,
-0.16058564186096191,
0.08744503557682037,
0.08906406909227371,
-0.012292595580220222,
0.04447516053915024,
-0.28376641869544983,
-0.14494211971759796,
0.041358582675457,
0.09255098551511765,
0.08740980178117752,
-0.14524956047534943,
-0.041395194828510284,
-0.06826991587877274,
-0.11983226239681244,
0.14183244109153748,
-0.17180141806602478,
0.09263341873884201,
0.023452946916222572,
0.07366855442523956,
0.018366647884249687,
-0.04013421759009361,
0.13525746762752533,
0.018312128260731697,
0.09637323766946793,
-0.06628982722759247,
-0.005533509887754917,
0.14328110218048096,
-0.07121067494153976,
0.07373633235692978,
0.008908742107450962,
0.043869614601135254,
-0.11143050342798233,
0.005368879530578852,
-0.08161312341690063,
0.06745504587888718,
-0.0671255812048912,
-0.04230745509266853,
-0.0258774571120739,
0.06404005736112595,
0.06869599223136902,
-0.04981856420636177,
0.05799213424324989,
-0.011345882900059223,
0.14178557693958282,
0.1521172970533371,
0.10167913883924484,
0.03186037018895149,
-0.02159123308956623,
0.06672586500644684,
-0.036268629133701324,
0.061177708208560944,
-0.1746758222579956,
0.04489004984498024,
0.10041607171297073,
0.013689117506146431,
0.15079903602600098,
0.023537898436188698,
-0.09704773128032684,
0.029258333146572113,
0.05491477996110916,
-0.10381531715393066,
-0.07888472080230713,
0.036803510040044785,
-0.012515070848166943,
-0.08632050454616547,
0.014216424897313118,
0.15995638072490692,
-0.03076448291540146,
0.016417082399129868,
-0.00334015185944736,
0.02042200230062008,
-0.05929417908191681,
0.1569577306509018,
0.014665739610791206,
0.05193609371781349,
-0.07786845415830612,
0.14845535159111023,
0.06467314064502716,
-0.10628460347652435,
0.10960298031568527,
0.029703205451369286,
-0.08475752174854279,
-0.020465567708015442,
0.021134529262781143,
0.14770545065402985,
-0.018143411725759506,
-0.05331287160515785,
-0.08279021829366684,
-0.10664597153663635,
0.04454406350851059,
0.1821705400943756,
0.028147263452410698,
0.030613072216510773,
-0.031448058784008026,
-0.017337406054139137,
-0.13127395510673523,
0.05311065912246704,
0.04969640076160431,
0.04180994629859924,
-0.12244100123643875,
0.1484900563955307,
-0.004628756549209356,
-0.0004181538533885032,
-0.020982321351766586,
0.026206210255622864,
-0.11216419190168381,
-0.018074974417686462,
-0.15140023827552795,
0.0277828611433506,
-0.01474667526781559,
-0.0125668253749609,
0.011814679950475693,
-0.03608512878417969,
-0.05314711108803749,
0.053200121968984604,
-0.07118929922580719,
-0.05927153304219246,
0.03688548505306244,
0.031709905713796616,
-0.1313641220331192,
-0.05392620339989662,
0.00727777648717165,
-0.08292821049690247,
0.05291169509291649,
0.05721927434206009,
0.001020754803903401,
0.0031464972998946905,
-0.126975879073143,
-0.005120450165122747,
0.036681126803159714,
0.0011884394334629178,
0.06766936928033829,
-0.11281315237283707,
0.005650524981319904,
-0.028289789333939552,
0.04594099149107933,
0.033798735588788986,
0.10216882079839706,
-0.08809175342321396,
-0.04123055934906006,
-0.029364190995693207,
-0.013725943863391876,
-0.056029632687568665,
0.08375456184148788,
0.15694180130958557,
0.020511934533715248,
0.15778933465480804,
-0.1214878112077713,
0.01735648512840271,
-0.16320422291755676,
-0.016534097492694855,
-0.012497363612055779,
-0.03134290874004364,
-0.06924691796302795,
-0.0019486576784402132,
0.09616159647703171,
-0.08355401456356049,
0.10796001553535461,
-0.0260920412838459,
0.09065096080303192,
0.057868290692567825,
-0.06745706498622894,
-0.08054647594690323,
0.020609047263860703,
0.14676035940647125,
0.04356842488050461,
-0.02667444758117199,
0.05499449744820595,
-0.02971487119793892,
0.06950321793556213,
0.04124845564365387,
0.19591358304023743,
0.1466406285762787,
0.018735425546765327,
0.08004367351531982,
0.052523836493492126,
-0.09450174123048782,
-0.05109492316842079,
0.13782672584056854,
-0.0861985832452774,
0.11281125247478485,
-0.05945662781596184,
0.030136359855532646,
0.09652142226696014,
-0.1723013073205948,
0.042036186903715134,
-0.065233513712883,
-0.09214576333761215,
-0.12187715619802475,
-0.08224878460168839,
-0.08553224802017212,
-0.04734725132584572,
0.01911546103656292,
-0.11103444546461105,
0.06643449515104294,
0.06221921741962433,
0.024739017710089684,
0.013893237337470055,
0.13439683616161346,
-0.11710317432880402,
0.003980416338890791,
0.09199905395507812,
0.0020130693446844816,
-0.023762039840221405,
-0.03165329247713089,
-0.0361919030547142,
0.07683966308832169,
0.02158314920961857,
0.06584487110376358,
-0.0000179698909050785,
0.04789668321609497,
0.036021679639816284,
-0.027637064456939697,
-0.0868680402636528,
0.038817379623651505,
0.034069642424583435,
0.03261033073067665,
0.04961318150162697,
0.06726957112550735,
0.0009457775158807635,
-0.04915400594472885,
0.19596722722053528,
-0.08742505311965942,
-0.06610789895057678,
-0.18775981664657593,
0.1895584762096405,
0.021518724039196968,
0.010205713100731373,
0.02569611929357052,
-0.10672841221094131,
0.0027020075358450413,
0.13115954399108887,
0.16584515571594238,
-0.050869595259428024,
-0.016095103695988655,
0.01758546754717827,
-0.009770243428647518,
-0.060056887567043304,
0.12264678627252579,
0.06308599561452866,
-0.020358946174383163,
-0.051009297370910645,
-0.02651691623032093,
-0.006801455747336149,
-0.02973088063299656,
-0.05475609377026558,
0.056677769869565964,
0.00016138162754941732,
0.0010363470064476132,
-0.010625703260302544,
0.05940406396985054,
0.009379575960338116,
-0.2066405862569809,
0.10352148115634918,
-0.19674737751483917,
-0.15614217519760132,
-0.0189291350543499,
-0.0028152181766927242,
-0.0009817968821153045,
0.05805868282914162,
-0.006501916330307722,
-0.005401206202805042,
0.14029806852340698,
-0.03579074889421463,
-0.04570696875452995,
-0.08133187890052795,
0.040524642914533615,
-0.0622420608997345,
0.20177170634269714,
-0.003293762682005763,
0.07284093648195267,
0.130091592669487,
0.005317761562764645,
-0.12470291554927826,
0.05677999183535576,
0.04877854511141777,
-0.047217998653650284,
0.006905613001435995,
0.15034419298171997,
-0.04387044161558151,
0.09371781349182129,
0.05547761544585228,
-0.1300392746925354,
0.02120528742671013,
-0.06462690979242325,
-0.05024149641394615,
-0.07462744414806366,
-0.009701796807348728,
-0.09512443840503693,
0.15375658869743347,
0.2268223613500595,
-0.030946968123316765,
0.035057876259088516,
-0.06047108396887779,
0.006403640378266573,
0.06687548011541367,
0.048859644681215286,
-0.05892748385667801,
-0.1775401383638382,
0.0823935717344284,
0.018981853500008583,
0.05120992660522461,
-0.24657763540744781,
-0.08000130951404572,
0.023157918825745583,
-0.03932863473892212,
-0.07066912204027176,
0.09777452796697617,
0.06499677896499634,
0.0677322968840599,
-0.03935371711850166,
-0.16428139805793762,
-0.005880517885088921,
0.14892356097698212,
-0.10411835461854935,
-0.049095846712589264
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Rocketknight1/gbert-base-germaner
This model is a fine-tuned version of [deepset/gbert-base](https://huggingface.co/deepset/gbert-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0340
- Validation Loss: 0.0881
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
More information needed
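
In the absence of recorded usage notes, a minimal token-classification sketch follows. Only the repo id comes from this card; the German example sentence and the expectation of NER-style labels are assumptions (the model name suggests the GermaNER task):

    import tensorflow as tf
    from transformers import AutoTokenizer, TFAutoModelForTokenClassification

    tokenizer = AutoTokenizer.from_pretrained("Rocketknight1/gbert-base-germaner")
    model = TFAutoModelForTokenClassification.from_pretrained("Rocketknight1/gbert-base-germaner")

    # Tag each subword token with the model's predicted label.
    inputs = tokenizer("Angela Merkel besuchte gestern Berlin.", return_tensors="tf")
    logits = model(**inputs).logits
    predictions = tf.argmax(logits, axis=-1)[0]
    tokens = tokenizer.convert_ids_to_tokens([int(i) for i in inputs["input_ids"][0]])
    labels = [model.config.id2label[int(p)] for p in predictions]
    print(list(zip(tokens, labels)))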
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the sketch after this list):
- optimizer: {'inner_optimizer': {'class_name': 'AdamWeightDecay', 'config': {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 4176, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}}, 'dynamic': True, 'initial_scale': 32768.0, 'dynamic_growth_steps': 2000}
- training_precision: mixed_float16
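
The nested config above describes a dynamically loss-scaled wrapper around `AdamWeightDecay` with a linear `PolynomialDecay` schedule. A reconstruction sketch using the Keras mixed-precision API is shown below; the original training script is not part of this card, and all values are copied from the config:

    import tensorflow as tf
    from transformers import AdamWeightDecay

    # mixed_float16 training, matching 'training_precision' above.
    tf.keras.mixed_precision.set_global_policy("mixed_float16")

    # Linear decay from 2e-05 to 0.0 over 4176 steps (power=1.0, cycle=False).
    schedule = tf.keras.optimizers.schedules.PolynomialDecay(
        initial_learning_rate=2e-05, decay_steps=4176, end_learning_rate=0.0
    )
    inner = AdamWeightDecay(
        learning_rate=schedule,
        beta_1=0.9,
        beta_2=0.999,
        epsilon=1e-08,
        weight_decay_rate=0.01,
    )
    # Dynamic loss scaling: 'initial_scale': 32768.0, 'dynamic_growth_steps': 2000.
    optimizer = tf.keras.mixed_precision.LossScaleOptimizer(
        inner, dynamic=True, initial_scale=32768.0, dynamic_growth_steps=2000
    )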
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 0.1345 | 0.0865 | 0 |
| 0.0550 | 0.0878 | 1 |
| 0.0340 | 0.0881 | 2 |
### Framework versions
- Transformers 4.15.0.dev0
- TensorFlow 2.6.0
- Datasets 1.16.2.dev0
- Tokenizers 0.10.3
| {"license": "mit", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "Rocketknight1/gbert-base-germaner", "results": []}]} | token-classification | Rocketknight1/gbert-base-germaner | [
"transformers",
"tf",
"tensorboard",
"bert",
"token-classification",
"generated_from_keras_callback",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #tensorboard #bert #token-classification #generated_from_keras_callback #license-mit #autotrain_compatible #endpoints_compatible #region-us
| Rocketknight1/gbert-base-germaner
=================================
This model is a fine-tuned version of deepset/gbert-base on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 0.0340
* Validation Loss: 0.0881
* Epoch: 2
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'inner\_optimizer': {'class\_name': 'AdamWeightDecay', 'config': {'name': 'AdamWeightDecay', 'learning\_rate': {'class\_name': 'PolynomialDecay', 'config': {'initial\_learning\_rate': 2e-05, 'decay\_steps': 4176, 'end\_learning\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight\_decay\_rate': 0.01}}, 'dynamic': True, 'initial\_scale': 32768.0, 'dynamic\_growth\_steps': 2000}
* training\_precision: mixed\_float16
### Training results
### Framework versions
* Transformers 4.15.0.dev0
* TensorFlow 2.6.0
* Datasets 1.16.2.dev0
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'inner\\_optimizer': {'class\\_name': 'AdamWeightDecay', 'config': {'name': 'AdamWeightDecay', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 4176, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}}, 'dynamic': True, 'initial\\_scale': 32768.0, 'dynamic\\_growth\\_steps': 2000}\n* training\\_precision: mixed\\_float16",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0.dev0\n* TensorFlow 2.6.0\n* Datasets 1.16.2.dev0\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #tf #tensorboard #bert #token-classification #generated_from_keras_callback #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'inner\\_optimizer': {'class\\_name': 'AdamWeightDecay', 'config': {'name': 'AdamWeightDecay', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 4176, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}}, 'dynamic': True, 'initial\\_scale': 32768.0, 'dynamic\\_growth\\_steps': 2000}\n* training\\_precision: mixed\\_float16",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0.dev0\n* TensorFlow 2.6.0\n* Datasets 1.16.2.dev0\n* Tokenizers 0.10.3"
] | [
56,
268,
4,
38
] | [
"passage: TAGS\n#transformers #tf #tensorboard #bert #token-classification #generated_from_keras_callback #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'inner\\_optimizer': {'class\\_name': 'AdamWeightDecay', 'config': {'name': 'AdamWeightDecay', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 4176, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}}, 'dynamic': True, 'initial\\_scale': 32768.0, 'dynamic\\_growth\\_steps': 2000}\n* training\\_precision: mixed\\_float16### Training results### Framework versions\n\n\n* Transformers 4.15.0.dev0\n* TensorFlow 2.6.0\n* Datasets 1.16.2.dev0\n* Tokenizers 0.10.3"
] | [
-0.0826699286699295,
0.10252311825752258,
-0.00771110225468874,
0.06297609210014343,
0.12033891677856445,
0.05016912519931793,
0.11582745611667633,
0.13677659630775452,
-0.08061980456113815,
0.14441336691379547,
0.11764605343341827,
0.12141389399766922,
0.0541728250682354,
0.14204640686511993,
-0.07438589632511139,
-0.1612270027399063,
0.01978972554206848,
-0.03204004839062691,
-0.11685369163751602,
0.07114638388156891,
0.0687742754817009,
-0.07362739741802216,
0.07393155992031097,
-0.012304444797337055,
-0.08989006280899048,
0.0013904516818001866,
0.032758500427007675,
-0.040423210710287094,
0.09716617316007614,
0.09882664680480957,
0.08352918922901154,
0.017589092254638672,
0.014977753162384033,
-0.2168860286474228,
0.004994137678295374,
0.10904823243618011,
0.01517860684543848,
0.07074453681707382,
0.058209378272295,
-0.014029910787940025,
0.12933646142482758,
-0.11169459670782089,
0.04253431409597397,
0.015576554462313652,
-0.14240629971027374,
-0.23740673065185547,
-0.09859808534383774,
0.016510751098394394,
0.08684799075126648,
0.042524922639131546,
-0.0057039461098611355,
0.09571567922830582,
-0.03632959723472595,
0.08911142498254776,
0.10749802738428116,
-0.27646902203559875,
-0.054768916219472885,
0.028305690735578537,
0.004567074589431286,
-0.01577303744852543,
-0.0666632428765297,
-0.0033733698073774576,
0.007660266011953354,
0.020120108500123024,
0.04460465535521507,
-0.016683755442500114,
0.027614625170826912,
-0.04372867941856384,
-0.08400167524814606,
-0.06189117580652237,
0.10880018025636673,
0.05380186811089516,
-0.04451098293066025,
-0.09606825560331345,
-0.04302218556404114,
-0.18053121864795685,
-0.02079387940466404,
-0.053559329360723495,
-0.0015894058160483837,
0.002631808165460825,
-0.02965928241610527,
0.008541968651115894,
-0.0629195049405098,
-0.031782496720552444,
0.018657231703400612,
0.12844467163085938,
0.037398722022771835,
0.01360303070396185,
-0.004163243342190981,
0.0853123813867569,
0.008858767338097095,
-0.1478850543498993,
-0.026755711063742638,
-0.0005126016912981868,
-0.067811518907547,
-0.040681034326553345,
-0.06836555898189545,
0.012743022292852402,
0.1012282595038414,
0.1600087583065033,
-0.06897327303886414,
0.1117808073759079,
0.012794891372323036,
-0.0020814177114516497,
-0.08039391785860062,
0.1250799149274826,
-0.026888396590948105,
-0.03744661062955856,
-0.01659342646598816,
0.06994287669658661,
0.004707152955234051,
-0.03808679059147835,
-0.03928106650710106,
0.030310697853565216,
0.09535575658082962,
0.026447368785738945,
-0.00963235180824995,
0.08724578469991684,
-0.07550734281539917,
-0.014516475610435009,
0.032592907547950745,
-0.1261184960603714,
0.04460727795958519,
0.05162445455789566,
-0.08301105350255966,
0.021481486037373543,
0.053330499678850174,
-0.02624751441180706,
-0.06727781146764755,
0.060343027114868164,
-0.04866330698132515,
-0.051733992993831635,
-0.0718783363699913,
-0.09780089557170868,
0.01832406595349312,
-0.051806606352329254,
-0.0009238907950930297,
-0.06412235647439957,
-0.10802387446165085,
-0.06540770083665848,
0.09400034695863724,
-0.051490090787410736,
-0.021953821182250977,
-0.060563988983631134,
-0.132125124335289,
0.0727437362074852,
-0.002476563211530447,
0.08367784321308136,
-0.05784275382757187,
0.06977810710668564,
0.01387014053761959,
0.03261702135205269,
0.02000151202082634,
0.024349333718419075,
-0.05289638414978981,
0.056322548538446426,
-0.1399533748626709,
0.11502857506275177,
-0.0798349604010582,
0.018104344606399536,
-0.14962667226791382,
-0.05793150141835213,
0.03506406396627426,
0.020969415083527565,
0.12536317110061646,
0.1336723119020462,
-0.15555424988269806,
-0.06262451410293579,
0.15225830674171448,
-0.06140819936990738,
-0.06584010273218155,
0.08021023869514465,
-0.04367730766534805,
-0.013070262037217617,
0.06503330916166306,
0.06891841441392899,
0.1346615105867386,
-0.07926921546459198,
0.011350364424288273,
-0.05677524954080582,
0.04524289071559906,
0.07059279829263687,
0.016057336702942848,
-0.058685656636953354,
-0.012954014353454113,
0.02255634032189846,
-0.017534291371703148,
0.028119947761297226,
-0.07115063816308975,
-0.05295184254646301,
-0.025443222373723984,
-0.05194249376654625,
0.038090016692876816,
0.04285622015595436,
0.012443304061889648,
-0.07161330431699753,
-0.15892384946346283,
0.042782507836818695,
0.060631442815065384,
-0.07471354305744171,
0.0017079859972000122,
-0.07409374415874481,
0.06095677241683006,
0.04558262601494789,
0.025698892772197723,
-0.16357189416885376,
-0.11367039382457733,
0.016651036217808723,
-0.003141237422823906,
0.002040729159489274,
-0.016928182914853096,
0.07585053145885468,
0.03811120241880417,
-0.03545895963907242,
-0.019454970955848694,
-0.0038387600798159838,
0.002285208087414503,
-0.04314863681793213,
-0.22898797690868378,
-0.021761925891041756,
-0.02414671704173088,
0.12051841616630554,
-0.2730630040168762,
0.0013005980290472507,
0.07177812606096268,
0.12887917459011078,
0.04589394852519035,
-0.040639642626047134,
-0.014737398363649845,
0.05780642852187157,
-0.03893161192536354,
-0.07020483165979385,
0.02294670231640339,
0.013492459431290627,
-0.09108491241931915,
-0.033203478902578354,
-0.16020113229751587,
0.09947308152914047,
0.1051626205444336,
-0.0102857556194067,
-0.14770305156707764,
0.0036308965645730495,
-0.02509595826268196,
-0.03251614421606064,
0.022155694663524628,
0.056674495339393616,
0.15229299664497375,
0.04513585940003395,
0.11314217746257782,
-0.01440172828733921,
-0.027617454528808594,
0.030841032043099403,
-0.03180396184325218,
-0.006713313981890678,
0.12019491195678711,
0.00038077132194302976,
-0.10161034762859344,
0.07313262671232224,
0.09157483279705048,
-0.06979966163635254,
0.1223289892077446,
-0.06792829185724258,
-0.07633619755506516,
-0.09044139087200165,
0.07896528393030167,
0.06261848658323288,
0.057417917996644974,
-0.05848538503050804,
-0.0018357113003730774,
0.003410716075450182,
-0.00012911611702293158,
-0.026289429515600204,
-0.14752286672592163,
0.026555465534329414,
0.035417210310697556,
-0.05949072167277336,
0.12364360690116882,
0.005252838600426912,
0.005468745715916157,
0.0890740379691124,
0.035006407648324966,
-0.07316278666257858,
0.01779918372631073,
-0.0231876689940691,
-0.07452674955129623,
0.2280917763710022,
-0.10155566036701202,
-0.12060170620679855,
-0.09170233458280563,
-0.04969031736254692,
-0.04552924260497093,
-0.01229111012071371,
-0.00456860288977623,
-0.0712474137544632,
-0.05488424748182297,
-0.047730714082717896,
-0.02512998692691326,
0.015350408852100372,
0.02815569005906582,
-0.0010764533653855324,
-0.002857646206393838,
0.12248001247644424,
-0.09852448850870132,
-0.028957955539226532,
-0.014727408066391945,
-0.07194900512695312,
0.0019006269285455346,
0.06471064686775208,
0.024853860959410667,
0.13320916891098022,
-0.00011421983072068542,
0.02368060313165188,
-0.027715638279914856,
0.2246103286743164,
-0.06576664000749588,
0.026220373809337616,
0.09926971048116684,
-0.03870268538594246,
0.06309331208467484,
0.15011730790138245,
0.05386815220117569,
-0.10749802738428116,
0.025853512808680534,
0.06740325689315796,
-0.0032955841161310673,
-0.22858403623104095,
-0.030024675652384758,
-0.042003728449344635,
-0.05660218372941017,
0.095843605697155,
0.04649265855550766,
0.11440835148096085,
0.01813969947397709,
0.009412011131644249,
0.06965446472167969,
0.037614356726408005,
0.06268583238124847,
0.16322097182273865,
0.08375368267297745,
0.09680581837892532,
-0.026371121406555176,
-0.012468013912439346,
0.03073379211127758,
-0.006327300798147917,
0.17595593631267548,
0.004384808242321014,
0.09437989443540573,
0.07398758083581924,
0.0946446880698204,
-0.01804845593869686,
-0.004202498123049736,
0.009726480580866337,
0.005756996106356382,
0.0209406279027462,
-0.06690866500139236,
-0.05141222104430199,
0.03049001470208168,
0.0253276564180851,
0.021277468651533127,
-0.07660417258739471,
0.05126592889428139,
0.04756728559732437,
0.23992271721363068,
0.13700361549854279,
-0.30159905552864075,
-0.10919513553380966,
-0.004398368764668703,
-0.0039358497597277164,
-0.05790561065077782,
-0.014785348437726498,
0.06207886338233948,
-0.05501077324151993,
0.06997215002775192,
-0.057970255613327026,
0.04679739102721214,
-0.10813845694065094,
0.037643011659383774,
0.12385819107294083,
0.10081842541694641,
0.019832754507660866,
-0.006859851069748402,
-0.29944825172424316,
0.2568690776824951,
0.02078821323812008,
0.1008426621556282,
-0.04082229733467102,
0.08087158203125,
0.031459350138902664,
-0.042053140699863434,
0.06448588520288467,
-0.021656664088368416,
-0.08695705980062485,
-0.180691197514534,
-0.06354869902133942,
0.001959362765774131,
0.11870121210813522,
-0.03847938030958176,
0.10966825485229492,
-0.03988569229841232,
-0.02139282040297985,
0.05047660693526268,
-0.03621339425444603,
-0.13004039227962494,
-0.0945822149515152,
0.056921470910310745,
-0.018998900428414345,
-0.015798481181263924,
-0.05813980475068092,
-0.06012646481394768,
-0.08096898347139359,
0.24446715414524078,
-0.13180133700370789,
-0.05503063648939133,
-0.12426472455263138,
0.0827646255493164,
0.1011551171541214,
-0.08041948825120926,
0.05282754451036453,
-0.007135522086173296,
0.05781066045165062,
0.05596914887428284,
-0.0925506204366684,
0.13486355543136597,
-0.02157004550099373,
-0.19289404153823853,
-0.07769788056612015,
0.11250138282775879,
0.027115581557154655,
0.013499795459210873,
-0.013420704752206802,
0.0607600063085556,
0.05820837244391441,
-0.09438906610012054,
0.09044960141181946,
0.020732348784804344,
-0.006247640121728182,
0.02133924327790737,
0.007753841113299131,
-0.06351996213197708,
-0.04261062666773796,
0.021085569635033607,
0.0722813829779625,
0.27721285820007324,
-0.09531545639038086,
0.006116361822932959,
0.0362786166369915,
-0.08251084387302399,
-0.17523711919784546,
0.019057022407650948,
0.1126217246055603,
0.0035583325661718845,
-0.04775817319750786,
-0.20399320125579834,
0.07213824987411499,
0.09533417969942093,
-0.007943868637084961,
0.12127003818750381,
-0.286866694688797,
-0.14925138652324677,
0.09402942657470703,
0.09623998403549194,
0.005844397470355034,
-0.1931121051311493,
-0.08219172060489655,
-0.004884649068117142,
-0.06288962811231613,
0.11172308027744293,
-0.01908300630748272,
0.07595182210206985,
0.03021983988583088,
0.00868509616702795,
0.03223813697695732,
-0.027894066646695137,
0.13324813544750214,
-0.04545934125781059,
0.07496589422225952,
-0.06113629788160324,
-0.011083525605499744,
0.04079022631049156,
-0.10172708332538605,
0.012070667929947376,
-0.048741091042757034,
0.03148617595434189,
-0.1207551509141922,
-0.011860324069857597,
-0.06809794902801514,
0.060861922800540924,
-0.06387179344892502,
-0.022869117558002472,
-0.015619360841810703,
0.05316640064120293,
0.07561596482992172,
-0.015111273154616356,
0.082149438560009,
-0.009400779381394386,
0.16199831664562225,
0.15808157622814178,
0.047475773841142654,
0.05614226683974266,
-0.06786008179187775,
0.057850345969200134,
-0.019819006323814392,
0.03989965096116066,
-0.17865502834320068,
0.03573798015713692,
0.1408018171787262,
0.01307743787765503,
0.12827405333518982,
0.04918178915977478,
-0.0560925155878067,
0.015134837478399277,
0.07503487914800644,
-0.1330091655254364,
-0.09452756494283676,
0.0011969780316576362,
-0.04293106868863106,
-0.07468841224908829,
0.02739783562719822,
0.15455026924610138,
-0.02826414443552494,
0.029585156589746475,
0.016442887485027313,
0.04068492352962494,
-0.05363050848245621,
0.1561192274093628,
0.0028386563062667847,
0.0762731283903122,
-0.07759563624858856,
0.10118585079908371,
0.08308839797973633,
-0.11778812110424042,
0.11200766265392303,
0.047746192663908005,
-0.0503724180161953,
-0.017580004408955574,
0.04392092674970627,
0.0912330374121666,
0.0446462519466877,
-0.028310557827353477,
-0.08616691827774048,
-0.13821767270565033,
0.09042000025510788,
0.1611306071281433,
0.026009676977992058,
0.07287516444921494,
-0.03447583317756653,
0.0012790925102308393,
-0.09039179980754852,
0.09474358707666397,
0.0776851698756218,
0.05386979132890701,
-0.12462207674980164,
0.14694461226463318,
-0.002085785148665309,
-0.03342752531170845,
0.004810801707208157,
-0.011762784793972969,
-0.17107143998146057,
-0.012592286802828312,
-0.12204206734895706,
0.030521921813488007,
-0.015634680166840553,
0.0069518680684268475,
0.024677693843841553,
-0.03781278058886528,
-0.04800073429942131,
0.021181359887123108,
-0.08788859099149704,
-0.056303538382053375,
0.027408350259065628,
0.08273237943649292,
-0.1393289715051651,
-0.05669421702623367,
-0.006473703309893608,
-0.12780481576919556,
0.06059234216809273,
0.005603456404060125,
0.018048211932182312,
0.025066355243325233,
-0.08093679696321487,
0.008104288950562477,
0.03678140789270401,
-0.011916137300431728,
0.03587726503610611,
-0.15957431495189667,
0.020675254985690117,
-0.04099346697330475,
0.008371308445930481,
0.0032963149715214968,
0.05531967058777809,
-0.09791889041662216,
-0.02811518870294094,
-0.02392101287841797,
-0.005668282508850098,
-0.05160496383905411,
0.051630035042762756,
0.13827158510684967,
-0.02696952410042286,
0.14821380376815796,
-0.11218195408582687,
0.03445693850517273,
-0.2064313292503357,
-0.02612886019051075,
0.018191292881965637,
-0.06881226599216461,
-0.11725904792547226,
-0.019695760682225227,
0.11313463002443314,
-0.08855769783258438,
0.05344054847955704,
-0.055040858685970306,
0.07381617277860641,
0.02228490635752678,
-0.12819859385490417,
-0.08206067979335785,
0.09654762595891953,
0.1614738553762436,
0.07383500039577484,
-0.02283790335059166,
0.030008437111973763,
-0.010905619710683823,
0.050566527992486954,
0.08541344851255417,
0.16326263546943665,
0.09883517026901245,
0.029546748846769333,
0.088271863758564,
0.05993980914354324,
-0.11677183955907822,
-0.09247642010450363,
0.1391574889421463,
-0.11344742029905319,
0.1769697517156601,
-0.047809749841690063,
0.08261460810899734,
0.028064748272299767,
-0.17074504494667053,
0.03542771562933922,
-0.07286105304956436,
-0.09122000634670258,
-0.10352542251348495,
-0.11300217360258102,
-0.08984502404928207,
-0.09176675230264664,
0.00011402141535654664,
-0.11193102598190308,
0.0451478585600853,
0.04802447929978371,
0.042134709656238556,
0.002819983521476388,
0.07646787911653519,
0.013156476430594921,
0.017158271744847298,
0.1157003864645958,
0.015973323956131935,
-0.015760671347379684,
-0.04855853691697121,
-0.07048186659812927,
-0.0024807185400277376,
0.030600417405366898,
0.03847059607505798,
0.004611152224242687,
-0.012173866853117943,
0.06796340644359589,
-0.0046506584621965885,
-0.08708572387695312,
0.06457506120204926,
0.028427207842469215,
0.009610971435904503,
0.05847972258925438,
0.04149510711431503,
-0.023557240143418312,
-0.013940593227744102,
0.1208244189620018,
-0.08136556297540665,
-0.05680772662162781,
-0.15642674267292023,
0.26767775416374207,
0.00023232011881191283,
0.04831249266862869,
0.014750268310308456,
-0.0672190934419632,
-0.019008345901966095,
0.12857431173324585,
0.140849307179451,
-0.024072052910923958,
-0.008676515892148018,
0.08009255677461624,
-0.008076904341578484,
-0.04365650564432144,
0.13818670809268951,
0.07015026360750198,
0.029190151020884514,
-0.02577974833548069,
-0.05035592243075371,
0.01224630419164896,
-0.039823319762945175,
-0.04562416300177574,
0.05818929523229599,
0.02026711031794548,
0.011668401770293713,
-0.012029093690216541,
0.07366710901260376,
-0.08731227368116379,
-0.1633009910583496,
0.11234238743782043,
-0.2209644317626953,
-0.18099164962768555,
-0.03598581254482269,
0.006363163236528635,
0.017951438203454018,
0.057517122477293015,
0.001539887278340757,
-0.03300981596112251,
0.11739356815814972,
-0.03400660678744316,
-0.01628105156123638,
-0.06812208145856857,
0.023186063393950462,
-0.04106190428137779,
0.17206202447414398,
-0.01022937148809433,
0.007215662393718958,
0.1429908126592636,
0.04679594933986664,
-0.10926354676485062,
0.009549952112138271,
0.09697412699460983,
-0.12472131848335266,
0.02082345075905323,
0.06093655899167061,
-0.023201439529657364,
0.13860969245433807,
0.09252916276454926,
-0.07835797220468521,
0.020350323989987373,
-0.03232093155384064,
-0.040357403457164764,
-0.05311283841729164,
-0.04096202924847603,
-0.05478376895189285,
0.12647728621959686,
0.2512288987636566,
-0.02676970884203911,
0.008140381425619125,
-0.03729012608528137,
0.026648150756955147,
0.03956400975584984,
0.07653607428073883,
-0.08420409262180328,
-0.20378585159778595,
0.09639808535575867,
0.04001644253730774,
0.06767641752958298,
-0.1431373953819275,
-0.07898875325918198,
0.04892880097031593,
-0.0029615380335599184,
-0.08627278357744217,
0.14867576956748962,
0.06121843680739403,
0.053843799978494644,
-0.05263632908463478,
-0.12527962028980255,
-0.05036356300115585,
0.17267374694347382,
-0.12457990646362305,
-0.07610868662595749
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Rocketknight1/gpt2-finetuned-wikitext2
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 7.3062
- Validation Loss: 6.7676
- Epoch: 0
## Model description
More information needed
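Pending a fuller description, a minimal generation sketch (assuming only the repository id and tags above; the prompt is illustrative):

    from transformers import AutoTokenizer, TFAutoModelForCausalLM

    tokenizer = AutoTokenizer.from_pretrained("Rocketknight1/gpt2-finetuned-wikitext2")
    model = TFAutoModelForCausalLM.from_pretrained("Rocketknight1/gpt2-finetuned-wikitext2")

    inputs = tokenizer("The history of natural language processing", return_tensors="tf")
    outputs = model.generate(inputs["input_ids"], max_length=50)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))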
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
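The config above describes a plain AdamWeightDecay optimizer with a constant learning rate (no schedule key is present). A minimal reconstruction sketch, assuming the transformers TF utilities:

    from transformers import AdamWeightDecay

    # Constant learning rate, decoupled weight decay of 0.01.
    optimizer = AdamWeightDecay(
        learning_rate=2e-05,
        weight_decay_rate=0.01,
        beta_1=0.9,
        beta_2=0.999,
        epsilon=1e-07,
    )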
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.3062 | 6.7676 | 0 |
### Framework versions
- Transformers 4.21.0.dev0
- TensorFlow 2.9.1
- Datasets 2.3.3.dev0
- Tokenizers 0.11.0
| {"license": "mit", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "Rocketknight1/gpt2-finetuned-wikitext2", "results": []}]} | text-generation | Rocketknight1/gpt2-finetuned-wikitext2 | [
"transformers",
"tf",
"gpt2",
"text-generation",
"generated_from_keras_callback",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #gpt2 #text-generation #generated_from_keras_callback #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| Rocketknight1/gpt2-finetuned-wikitext2
======================================
This model is a fine-tuned version of gpt2 on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 7.3062
* Validation Loss: 6.7676
* Epoch: 0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'AdamWeightDecay', 'learning\_rate': 2e-05, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\_decay\_rate': 0.01}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.21.0.dev0
* TensorFlow 2.9.1
* Datasets 2.3.3.dev0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': 2e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.3.3.dev0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #tf #gpt2 #text-generation #generated_from_keras_callback #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': 2e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.3.3.dev0\n* Tokenizers 0.11.0"
] | [
62,
118,
4,
37
] | [
"passage: TAGS\n#transformers #tf #gpt2 #text-generation #generated_from_keras_callback #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': 2e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.3.3.dev0\n* Tokenizers 0.11.0"
] | [
-0.04409562423825264,
0.057451892644166946,
-0.004836505278944969,
0.07043064385652542,
0.13536220788955688,
0.020488949492573738,
0.14316433668136597,
0.1535649299621582,
-0.1223229393362999,
0.09346318989992142,
0.14351092278957367,
0.17565859854221344,
0.0319792777299881,
0.16045565903186798,
-0.10776986181735992,
-0.18290191888809204,
0.06721068918704987,
0.014021731913089752,
-0.03839897736907005,
0.09620316326618195,
0.07966947555541992,
-0.09098351746797562,
0.10672830790281296,
0.004273632541298866,
-0.19684897363185883,
0.03048691526055336,
0.09148579835891724,
-0.08026809990406036,
0.10302694141864777,
0.06953097879886627,
0.06690961867570877,
0.026940524578094482,
0.005915133748203516,
-0.16979853808879852,
0.011916290037333965,
0.10601150244474411,
-0.017959317192435265,
0.07846495509147644,
0.013127177953720093,
-0.0291579756885767,
0.1661047339439392,
-0.08872518688440323,
0.03671799227595329,
0.046269554644823074,
-0.1362747699022293,
-0.2326621115207672,
-0.1257558912038803,
0.017972489818930626,
0.04324687644839287,
0.06383036822080612,
-0.0002502993156667799,
0.19710387289524078,
0.004552201367914677,
0.10529125481843948,
0.22507773339748383,
-0.3475132882595062,
-0.042599357664585114,
0.05258065089583397,
0.01664804294705391,
0.04615461081266403,
-0.05230683460831642,
0.05115523934364319,
0.05699845030903816,
0.03969603776931763,
0.09667716175317764,
-0.032901469618082047,
-0.02562890760600567,
-0.024056535214185715,
-0.10681740194559097,
-0.04766090586781502,
0.12437533587217331,
0.012522177770733833,
-0.06830126792192459,
-0.06384972482919693,
-0.08571522682905197,
-0.17975300550460815,
-0.025282585993409157,
-0.05940699204802513,
0.029412295669317245,
0.013063469901680946,
-0.07629318535327911,
-0.055811792612075806,
-0.0555647648870945,
-0.05483910068869591,
-0.08463701605796814,
0.17902150750160217,
0.017135176807641983,
0.04708699509501457,
-0.0348081961274147,
0.06026098504662514,
-0.11250538378953934,
-0.12131531536579132,
-0.03634236380457878,
0.0017078490927815437,
0.0004542440583463758,
-0.05142919719219208,
-0.06873811036348343,
-0.13346882164478302,
0.0739101842045784,
0.156513512134552,
-0.12590311467647552,
0.07869692146778107,
-0.11251316964626312,
0.023630216717720032,
-0.09526848793029785,
0.11962728947401047,
-0.002874024212360382,
-0.012338976375758648,
0.044870685786008835,
0.021823367103934288,
0.07161764800548553,
-0.046436842530965805,
-0.08211124688386917,
0.0041736639104783535,
0.09373382478952408,
0.024953866377472878,
-0.045678719878196716,
0.10054387897253036,
-0.051553063094615936,
-0.00828003603965044,
-0.0036232811398804188,
-0.0952569842338562,
0.03227069228887558,
-0.015903085470199585,
-0.06183137744665146,
0.010428313165903091,
0.044970378279685974,
-0.005845241714268923,
-0.06057605892419815,
0.04744883254170418,
-0.07752405107021332,
-0.011676136404275894,
-0.08261583000421524,
-0.13898296654224396,
0.03245716914534569,
-0.11077013611793518,
-0.033272963017225266,
-0.07228481769561768,
-0.17977012693881989,
-0.005926727782934904,
0.04884278401732445,
-0.04971304535865784,
-0.0023340529296547174,
-0.062120575457811356,
-0.14418438076972961,
0.04683994874358177,
-0.012553858570754528,
0.08249253779649734,
-0.06224299594759941,
0.07161255180835724,
0.027467377483844757,
0.08077537268400192,
-0.10329055786132812,
0.03210701048374176,
-0.06078657507896423,
0.01899847388267517,
-0.23378784954547882,
0.08241751790046692,
-0.059553954750299454,
0.024488354101777077,
-0.13648760318756104,
-0.0661451667547226,
-0.0029416850302368402,
0.0037051262333989143,
0.10780494660139084,
0.11855956166982651,
-0.1559882014989853,
-0.07374272495508194,
0.1699420064687729,
-0.11908134073019028,
-0.10037244111299515,
0.11515267193317413,
-0.04970154911279678,
0.028643963858485222,
0.09315966814756393,
0.14321516454219818,
-0.0255335234105587,
-0.09518618881702423,
-0.016632825136184692,
-0.04718032106757164,
-0.03935028612613678,
0.00408153235912323,
0.010714861564338207,
-0.03317978233098984,
-0.0017089629545807838,
0.007441083434969187,
-0.0011224340414628386,
0.0028606366831809282,
-0.06976313143968582,
-0.05082260072231293,
-0.06701944768428802,
-0.03827860206365585,
0.058504533022642136,
0.002873077057301998,
0.060115545988082886,
-0.11406196653842926,
-0.12725262343883514,
0.0954803004860878,
0.021604221314191818,
-0.0637536272406578,
0.05800521373748779,
-0.10939329862594604,
0.07722947001457214,
-0.0273352712392807,
0.025604266673326492,
-0.1820085346698761,
-0.052220817655324936,
0.028781473636627197,
0.059170957654714584,
0.022086312994360924,
-0.021982746198773384,
0.07709862291812897,
0.019934136420488358,
-0.06455513834953308,
0.031222684308886528,
-0.0012300972593948245,
0.005253383424133062,
-0.083442322909832,
-0.23074479401111603,
0.008537489920854568,
-0.037171315401792526,
0.035739365965127945,
-0.20900721848011017,
0.039392586797475815,
0.12256190180778503,
0.10866239666938782,
0.04708095267415047,
-0.028747139498591423,
-0.03945087268948555,
0.041329123079776764,
-0.04317494109272957,
-0.07089734822511673,
0.03955398127436638,
0.030955789610743523,
-0.12559431791305542,
0.02698660083115101,
-0.19679629802703857,
0.15083900094032288,
0.1632603257894516,
-0.06386695057153702,
-0.09889596700668335,
0.009152431041002274,
-0.02436809241771698,
-0.010288584977388382,
-0.01937255822122097,
0.004265236668288708,
0.14806771278381348,
0.018062809482216835,
0.14501330256462097,
-0.07622817158699036,
-0.04760722070932388,
0.04873626306653023,
-0.013157600536942482,
-0.014588001184165478,
0.04882580414414406,
-0.006093701347708702,
-0.12716427445411682,
0.116550974547863,
0.10704914480447769,
-0.07389660179615021,
0.12592267990112305,
-0.043418679386377335,
-0.05728641897439957,
-0.05414484068751335,
0.03569675609469414,
0.03816857933998108,
0.0664825439453125,
-0.12447003275156021,
-0.0012115046847611666,
0.02087983302772045,
0.03814132884144783,
-0.003914178814738989,
-0.17543525993824005,
0.013475246727466583,
0.0002645170025061816,
-0.08451849222183228,
-0.00305609661154449,
0.02635522373020649,
0.016472401097416878,
0.1227220892906189,
0.037816040217876434,
-0.0024366555735468864,
0.08243821561336517,
0.0038012724835425615,
-0.07718591392040253,
0.1954757422208786,
-0.12150533497333527,
-0.11691399663686752,
-0.11251386255025864,
-0.10255320370197296,
-0.09421209245920181,
0.014820973388850689,
0.04133721441030502,
-0.09690944105386734,
-0.06579690426588058,
-0.08281084150075912,
0.022078176960349083,
-0.010077803395688534,
0.05662316828966141,
0.060702133923769,
-0.028157629072666168,
0.09341957420110703,
-0.11378205567598343,
-0.06049733608961105,
-0.03054848499596119,
-0.055786073207855225,
0.04080302268266678,
0.026275455951690674,
0.03701438009738922,
0.10623578727245331,
-0.05048508942127228,
0.05032365396618843,
-0.06538985669612885,
0.2179708331823349,
-0.06165969371795654,
-0.007063782308250666,
0.14389470219612122,
-0.013877728953957558,
0.07002022117376328,
0.10484538227319717,
0.038208119571208954,
-0.1329488754272461,
0.028703849762678146,
0.07777688652276993,
-0.05178751423954964,
-0.22951923310756683,
-0.017691072076559067,
-0.03669602796435356,
-0.052838847041130066,
0.05110279098153114,
0.05102621763944626,
0.15346820652484894,
0.02505013532936573,
0.008831197395920753,
0.10825695842504501,
0.03268074989318848,
0.10561881214380264,
0.21108417212963104,
0.062422435730695724,
0.12420719861984253,
-0.05026949942111969,
-0.0015916296979412436,
0.07506375759840012,
-0.020505789667367935,
0.21369239687919617,
0.019573688507080078,
0.12318727374076843,
0.08000942319631577,
0.0462910421192646,
0.0040035126730799675,
0.01596740633249283,
-0.007559558842331171,
-0.028464071452617645,
-0.007729329168796539,
-0.06971034407615662,
-0.035840876400470734,
0.03088712878525257,
-0.09954069554805756,
0.04839816316962242,
-0.08829076588153839,
0.06126221641898155,
0.08140489459037781,
0.26187795400619507,
0.04329657927155495,
-0.34675443172454834,
-0.1034761443734169,
0.009983871132135391,
-0.032073941081762314,
-0.038607679307460785,
-0.009394600056111813,
0.0864037498831749,
-0.06106981262564659,
0.12123065441846848,
-0.08768565952777863,
0.07388720661401749,
-0.01900353468954563,
0.0588911771774292,
0.0387459397315979,
0.12014780193567276,
-0.024377644062042236,
-0.004574551247060299,
-0.33335429430007935,
0.2543977200984955,
0.051511138677597046,
0.13148707151412964,
-0.07471945881843567,
0.03530497848987579,
0.042068418115377426,
0.05202865228056908,
0.08853277564048767,
-0.02785567007958889,
-0.1161997839808464,
-0.10261305421590805,
-0.03255661949515343,
0.007501745130866766,
0.12820200622081757,
0.06685643643140793,
0.10270465165376663,
-0.04591677710413933,
0.016210371628403664,
0.07228156924247742,
-0.0362023264169693,
-0.10228614509105682,
-0.05845252424478531,
0.026934003457427025,
0.061670899391174316,
-0.015634281560778618,
-0.05878491327166557,
-0.07400933653116226,
-0.05375877022743225,
0.23533464968204498,
-0.05713816359639168,
-0.05953602120280266,
-0.1365031749010086,
0.10589642077684402,
0.05648798495531082,
-0.050045643001794815,
0.04944193363189697,
-0.007129152305424213,
0.08194731920957565,
0.02724750153720379,
-0.11790147423744202,
0.13403742015361786,
-0.021403364837169647,
-0.18321289122104645,
-0.030316613614559174,
0.0955762192606926,
0.016738109290599823,
0.04207640513777733,
0.004742191638797522,
0.06596685945987701,
0.039697885513305664,
-0.10336899012327194,
0.07092572748661041,
0.017756564542651176,
0.058944232761859894,
-0.00934160128235817,
-0.00566210737451911,
-0.021969163790345192,
-0.03630487993359566,
0.019904574379324913,
0.15393666923046112,
0.2860342860221863,
-0.08115758001804352,
0.06302645057439804,
0.0034812944941222668,
-0.08152695745229721,
-0.20337331295013428,
0.08939944207668304,
0.04299411550164223,
-0.010877304710447788,
-0.04492820054292679,
-0.15945123136043549,
0.05980977416038513,
0.10378540307283401,
-0.0035428358241915703,
0.10703051090240479,
-0.29042306542396545,
-0.15006615221500397,
0.056743014603853226,
0.13171076774597168,
0.1897120624780655,
-0.15819644927978516,
-0.05972541496157646,
-0.05698886513710022,
-0.08903276175260544,
0.1633853316307068,
-0.14629162847995758,
0.10252095013856888,
0.0016430324176326394,
0.0495123527944088,
0.009646757505834103,
-0.03922581672668457,
0.08461794257164001,
-0.019606158137321472,
0.1015833243727684,
-0.06975680589675903,
-0.013442914001643658,
0.1531819850206375,
-0.06720301508903503,
0.046499308198690414,
-0.0730782002210617,
0.01605641283094883,
-0.06938739120960236,
-0.009085373021662235,
-0.06773670017719269,
0.06111285462975502,
-0.030990134924650192,
-0.04179902374744415,
-0.02919960208237171,
0.009744161739945412,
0.06497741490602493,
-0.034829508513212204,
0.17614246904850006,
-0.020633576437830925,
0.17645318806171417,
0.17757576704025269,
0.10726004093885422,
-0.10203628987073898,
0.04803398624062538,
0.08903560787439346,
-0.04316900297999382,
0.06331083923578262,
-0.19099146127700806,
0.054932601749897,
0.10186106711626053,
0.002832553582265973,
0.12469258904457092,
0.07902849465608597,
-0.04707473888993263,
0.047729525715112686,
0.06952007859945297,
-0.16196675598621368,
-0.1269410252571106,
0.02978622354567051,
-0.06773493438959122,
-0.07075970619916916,
0.08900237083435059,
0.15951640903949738,
-0.029934488236904144,
0.023818641901016235,
0.021300198510289192,
0.010161631740629673,
-0.06968211382627487,
0.11793403327465057,
-0.007889743894338608,
0.03539071977138519,
-0.1036980003118515,
0.11598041653633118,
0.01057033333927393,
-0.10053309053182602,
0.10408912599086761,
0.06481112539768219,
-0.09008006751537323,
-0.007301718462258577,
0.02053232118487358,
0.13389666378498077,
-0.033141810446977615,
-0.04006531462073326,
-0.1402663290500641,
-0.13213378190994263,
0.08211394399404526,
0.27790454030036926,
0.05856713280081749,
0.04917733743786812,
-0.04688064381480217,
0.016397884115576744,
-0.11977489292621613,
0.047198306769132614,
0.01844337210059166,
0.06150991469621658,
-0.12455327063798904,
0.16678760945796967,
-0.013050427660346031,
-0.001049538841471076,
-0.041161246597766876,
0.01986309140920639,
-0.15044860541820526,
-0.012593153864145279,
-0.1313590556383133,
-0.007759814150631428,
-0.041803911328315735,
-0.02179069072008133,
0.01053637731820345,
-0.05099853128194809,
-0.09150221198797226,
0.0385565422475338,
-0.09478296339511871,
-0.028519153594970703,
0.0403946116566658,
0.02130734920501709,
-0.14415474236011505,
-0.026254436001181602,
-0.013763981871306896,
-0.0806366503238678,
0.07511617243289948,
0.06002863496541977,
-0.013362376019358635,
0.05032707750797272,
-0.08989549428224564,
0.006556230131536722,
0.09398841857910156,
-0.027419330552220345,
0.06083425506949425,
-0.08423633873462677,
0.009773927740752697,
0.0037292353808879852,
0.052756257355213165,
0.048138219863176346,
0.12455569952726364,
-0.08911111950874329,
-0.008552740328013897,
-0.03723519295454025,
-0.03438759967684746,
-0.04872097820043564,
0.06675495207309723,
0.15694552659988403,
-0.004566898103803396,
0.18033047020435333,
-0.10969454795122147,
-0.021168183535337448,
-0.16989853978157043,
0.014106620103120804,
0.02103140763938427,
-0.12544763088226318,
-0.09577608853578568,
-0.015793614089488983,
0.08162890374660492,
-0.07257869839668274,
0.13399319350719452,
-0.02766125090420246,
0.040916599333286285,
0.07491400092840195,
-0.05149621143937111,
-0.08471327275037766,
0.023947883397340775,
0.20747911930084229,
0.054131947457790375,
-0.04140142351388931,
0.03894711658358574,
0.00991913117468357,
0.09748803824186325,
0.05404491722583771,
0.23842769861221313,
0.11203764379024506,
-0.0021158233284950256,
0.13474103808403015,
0.05016454681754112,
-0.03161678463220596,
-0.15033994615077972,
0.13849225640296936,
-0.08542732149362564,
0.15562452375888824,
-0.0388660803437233,
0.11774236708879471,
0.12273501604795456,
-0.1568327248096466,
0.03282557427883148,
-0.022710014134645462,
-0.07269934564828873,
-0.1489572525024414,
-0.09979464113712311,
-0.10208378732204437,
-0.16383963823318481,
0.019088251516222954,
-0.1274321973323822,
0.0834350660443306,
0.033187136054039,
0.03151731565594673,
-0.010346805676817894,
0.12894801795482635,
-0.04596727341413498,
-0.001965765841305256,
0.08648865669965744,
-0.02504563145339489,
-0.02619563788175583,
-0.04537729173898697,
-0.07820215821266174,
0.039822086691856384,
-0.0030906691681593657,
0.04833627864718437,
0.029450172558426857,
0.025438543409109116,
0.03864898160099983,
-0.06721481680870056,
-0.10025891661643982,
0.046428125351667404,
0.07048224657773972,
0.02575293369591236,
0.05126213654875755,
0.03821129724383354,
-0.02927812933921814,
-0.016837233677506447,
0.15600043535232544,
-0.09131452441215515,
-0.04966557025909424,
-0.15429474413394928,
0.2609582841396332,
0.03948095813393593,
0.031152404844760895,
0.0020381214562803507,
-0.07418246567249298,
-0.038289625197649,
0.16078999638557434,
0.15510419011116028,
-0.020403718575835228,
-0.007288328371942043,
0.02587062306702137,
-0.006069357506930828,
-0.018798308447003365,
0.12625904381275177,
0.06899843364953995,
-0.01639825478196144,
-0.03724491968750954,
-0.057729750871658325,
-0.02673032321035862,
-0.010918183252215385,
-0.044661980122327805,
0.07379215955734253,
0.022449849173426628,
-0.01446212362498045,
-0.007688245736062527,
0.04757477715611458,
-0.057928066700696945,
-0.08086330443620682,
0.04365880414843559,
-0.211650550365448,
-0.12955856323242188,
0.015663577243685722,
-0.022673621773719788,
-0.02517116814851761,
0.05648021027445793,
-0.0019424320198595524,
-0.011225090362131596,
0.09464418143033981,
-0.029466450214385986,
-0.07214967906475067,
-0.08204419165849686,
0.06618870794773102,
-0.09396588057279587,
0.16258956491947174,
-0.021530993282794952,
0.026558730751276016,
0.13902615010738373,
0.024109335616230965,
-0.0932324081659317,
0.059879522770643234,
0.02297767996788025,
-0.09535715728998184,
-0.018777890130877495,
0.09265372157096863,
-0.02599712833762169,
0.11186497658491135,
0.06278979778289795,
-0.09273280203342438,
0.012479132041335106,
-0.09245844185352325,
-0.09098372608423233,
-0.04240822046995163,
-0.06065860390663147,
-0.0828222706913948,
0.11076217889785767,
0.21745912730693817,
-0.014812255278229713,
0.04078543186187744,
-0.060621485114097595,
0.01297032181173563,
0.07697862386703491,
0.011625238694250584,
-0.06822032481431961,
-0.25149697065353394,
0.048593100160360336,
0.1455947756767273,
-0.005175477359443903,
-0.24429264664649963,
-0.07936692237854004,
-0.010400202125310898,
0.0028983554802834988,
-0.12152771651744843,
0.08690175414085388,
0.07694579660892487,
0.046538203954696655,
-0.04898479953408241,
-0.14837445318698883,
-0.019424768164753914,
0.17279770970344543,
-0.10768388956785202,
-0.06281255185604095
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Rocketknight1/marian-finetuned-kde4-en-to-fr
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.6862
- Validation Loss: 0.8050
- Epoch: 2
## Model description
More information needed
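Pending a fuller description, a minimal English-to-French translation sketch (assuming only the repository id and tags above; the input string is illustrative):

    from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained("Rocketknight1/marian-finetuned-kde4-en-to-fr")
    model = TFAutoModelForSeq2SeqLM.from_pretrained("Rocketknight1/marian-finetuned-kde4-en-to-fr")

    inputs = tokenizer("Default to expanded threads", return_tensors="tf")
    outputs = model.generate(inputs["input_ids"], max_length=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))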
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5e-05, 'decay_steps': 17733, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
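The config above is an AdamWeightDecay optimizer on a linear (power-1.0 polynomial) decay from 5e-05 to 0 over 17733 steps. A minimal reconstruction sketch, assuming the transformers TF utilities:

    import tensorflow as tf
    from transformers import AdamWeightDecay

    lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
        initial_learning_rate=5e-05,
        decay_steps=17733,
        end_learning_rate=0.0,
        power=1.0,
        cycle=False,
    )
    optimizer = AdamWeightDecay(
        learning_rate=lr_schedule,
        weight_decay_rate=0.01,
        beta_1=0.9,
        beta_2=0.999,
        epsilon=1e-08,
    )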
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 1.0615 | 0.8832 | 0 |
| 0.7983 | 0.8211 | 1 |
| 0.6862 | 0.8050 | 2 |
### Framework versions
- Transformers 4.16.0.dev0
- TensorFlow 2.7.0
- Datasets 1.17.0
- Tokenizers 0.10.3
| {"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "Rocketknight1/marian-finetuned-kde4-en-to-fr", "results": []}]} | text2text-generation | Rocketknight1/marian-finetuned-kde4-en-to-fr | [
"transformers",
"tf",
"marian",
"text2text-generation",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #marian #text2text-generation #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| Rocketknight1/marian-finetuned-kde4-en-to-fr
============================================
This model is a fine-tuned version of Helsinki-NLP/opus-mt-en-fr on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 0.6862
* Validation Loss: 0.8050
* Epoch: 2
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'AdamWeightDecay', 'learning\_rate': {'class\_name': 'PolynomialDecay', 'config': {'initial\_learning\_rate': 5e-05, 'decay\_steps': 17733, 'end\_learning\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight\_decay\_rate': 0.01}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.16.0.dev0
* TensorFlow 2.7.0
* Datasets 1.17.0
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 5e-05, 'decay\\_steps': 17733, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* TensorFlow 2.7.0\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #tf #marian #text2text-generation #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 5e-05, 'decay\\_steps': 17733, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* TensorFlow 2.7.0\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] | [
57,
197,
4,
36
] | [
"passage: TAGS\n#transformers #tf #marian #text2text-generation #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 5e-05, 'decay\\_steps': 17733, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* TensorFlow 2.7.0\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] | [
-0.09032284468412399,
0.08431869745254517,
-0.007340869400650263,
0.053751781582832336,
0.1049356535077095,
0.04173829406499863,
0.09936468303203583,
0.1390717774629593,
-0.09560412913560867,
0.13664507865905762,
0.141617089509964,
0.126487597823143,
0.05142904445528984,
0.13560979068279266,
-0.08296030014753342,
-0.14286498725414276,
0.05168166011571884,
-0.03115250915288925,
-0.0789380893111229,
0.0690312534570694,
0.08449030667543411,
-0.06661327928304672,
0.08349338173866272,
-0.02350998856127262,
-0.08142505586147308,
0.027540789917111397,
0.06186056137084961,
-0.06802736967802048,
0.09938453882932663,
0.07750965654850006,
0.0686982274055481,
0.011057237163186073,
0.012274322099983692,
-0.18149904906749725,
0.0053756278939545155,
0.1135464459657669,
0.002450535073876381,
0.07723090797662735,
0.047000229358673096,
-0.02368047460913658,
0.12619778513908386,
-0.1046462431550026,
0.03948454558849335,
0.03292948007583618,
-0.13204722106456757,
-0.253723680973053,
-0.11422450840473175,
0.005296633113175631,
0.08783229440450668,
0.08214962482452393,
0.007161834742873907,
0.10765112191438675,
-0.022538049146533012,
0.08940926939249039,
0.16650131344795227,
-0.2756183445453644,
-0.04041379690170288,
0.04366178810596466,
0.021385621279478073,
0.041386619210243225,
-0.04490183666348457,
0.032684627920389175,
0.0385543629527092,
0.029007650911808014,
0.054624784737825394,
-0.015505428425967693,
0.06073722988367081,
-0.037606131285429,
-0.07634386420249939,
-0.06922204047441483,
0.12290751188993454,
0.027470726519823074,
-0.04453185945749283,
-0.07206925004720688,
-0.046993423253297806,
-0.179103821516037,
-0.004048760049045086,
-0.02127816714346409,
0.00924795214086771,
0.012946614064276218,
-0.009802608750760555,
-0.005824577063322067,
-0.052934300154447556,
-0.041749872267246246,
-0.00663513271138072,
0.11117694526910782,
0.03165639936923981,
0.0385063961148262,
-0.008013148792088032,
0.0582851842045784,
-0.030250059440732002,
-0.12893056869506836,
-0.03183877468109131,
0.008016076870262623,
-0.06124651059508324,
-0.015325307846069336,
-0.06226131692528725,
-0.022954940795898438,
0.08202965557575226,
0.16596993803977966,
-0.1211329773068428,
0.1175665333867073,
-0.022053200751543045,
0.018217910081148148,
-0.09771401435136795,
0.09774567931890488,
-0.003563803620636463,
-0.05795075371861458,
0.01411696057766676,
0.059425558894872665,
0.03084220178425312,
-0.03730413690209389,
-0.03246430307626724,
0.02085677906870842,
0.0876498743891716,
0.04278931766748428,
-0.01630074344575405,
0.061427220702171326,
-0.06045322120189667,
-0.0005320938653312624,
0.019621649757027626,
-0.12579168379306793,
0.03351830318570137,
0.028734056279063225,
-0.08543087542057037,
0.038871247321367264,
0.05501203238964081,
-0.016221342608332634,
-0.0793365016579628,
0.03836355730891228,
-0.050260767340660095,
-0.06550419330596924,
-0.08510240167379379,
-0.11227069050073624,
0.03283024951815605,
-0.08474741876125336,
-0.02475929819047451,
-0.07719872146844864,
-0.16955681145191193,
-0.062391459941864014,
0.07755037397146225,
-0.06239897385239601,
-0.0176970474421978,
-0.04932556301355362,
-0.14522284269332886,
0.08314961940050125,
-0.011266658082604408,
0.11435937881469727,
-0.0512990728020668,
0.06027965247631073,
0.007376565132290125,
0.06449469178915024,
0.005422335118055344,
0.03478618338704109,
-0.027286631986498833,
0.04409385100007057,
-0.16782517731189728,
0.0985741913318634,
-0.08334194868803024,
0.024117164313793182,
-0.1632090061903,
-0.0557321272790432,
0.013547120615839958,
0.012881320901215076,
0.11933807283639908,
0.1137661561369896,
-0.1525617390871048,
-0.05925062671303749,
0.11055532842874527,
-0.09170685708522797,
-0.0720822662115097,
0.09037735313177109,
-0.019774997606873512,
-0.029822133481502533,
0.07013768702745438,
0.07463878393173218,
0.04687173292040825,
-0.08205434679985046,
0.0017559133702889085,
-0.05790909007191658,
0.021391507238149643,
0.023286158218979836,
0.04618145525455475,
-0.04497048631310463,
-0.039218783378601074,
0.009275577962398529,
-0.013727664947509766,
0.011054161004722118,
-0.057969603687524796,
-0.04759657010436058,
-0.027639610692858696,
-0.05474472790956497,
0.02372094802558422,
0.020770523697137833,
0.02762565016746521,
-0.0899055153131485,
-0.14943669736385345,
0.00532821100205183,
0.057537663727998734,
-0.06589528918266296,
0.03494838625192642,
-0.08001863956451416,
0.05152487754821777,
0.035860639065504074,
0.02824251353740692,
-0.1774267852306366,
-0.055523112416267395,
0.015392550267279148,
-0.003757430939003825,
-0.012855487875640392,
-0.04081624746322632,
0.06293652206659317,
0.026375895366072655,
-0.04525250568985939,
-0.008373348973691463,
-0.012677923776209354,
-0.0013371218228712678,
-0.06526219099760056,
-0.2302810251712799,
-0.010361176915466785,
-0.027859335765242577,
0.06367097795009613,
-0.2634707987308502,
0.008940319530665874,
0.07727181911468506,
0.12778140604496002,
0.03589088097214699,
-0.032753657549619675,
-0.027448754757642746,
0.05140497162938118,
-0.039114415645599365,
-0.07084537297487259,
0.02771877311170101,
0.011281553655862808,
-0.09660401940345764,
0.02006007544696331,
-0.17221398651599884,
0.0870836079120636,
0.11634822934865952,
-0.03353525325655937,
-0.11491939425468445,
0.039808690547943115,
-0.029215451329946518,
-0.032385073602199554,
-0.003724302863702178,
-0.008122866041958332,
0.15674610435962677,
0.03654753044247627,
0.1386549174785614,
-0.032876934856176376,
-0.04394521191716194,
0.030423123389482498,
-0.007191592827439308,
-0.036155618727207184,
0.11400832235813141,
-0.03031761944293976,
-0.0806318074464798,
0.0708635002374649,
0.10013051331043243,
-0.09687919914722443,
0.09634091705083847,
-0.05123418942093849,
-0.07455775141716003,
-0.06473507732152939,
0.04133538529276848,
0.06954118609428406,
0.06383228302001953,
-0.0745498538017273,
-0.0032715878915041685,
0.015295269899070263,
0.03175707906484604,
-0.005700723733752966,
-0.14342202246189117,
0.025807125493884087,
0.01215304248034954,
-0.06060759350657463,
0.05669320747256279,
0.007138184737414122,
-0.0007837290177121758,
0.11660531908273697,
0.026371117681264877,
-0.03890461474657059,
0.03887835890054703,
-0.02838391251862049,
-0.08700436353683472,
0.2204963117837906,
-0.11880422383546829,
-0.13569802045822144,
-0.11106450110673904,
-0.03071630373597145,
-0.026701653376221657,
-0.01056028064340353,
0.014195047318935394,
-0.09856804460287094,
-0.07733066380023956,
-0.0689312219619751,
-0.0037567438557744026,
-0.02824181504547596,
0.04034077376127243,
0.0642704963684082,
-0.0008284779614768922,
0.11893842369318008,
-0.10466853529214859,
-0.03845112398266792,
-0.0006408707122318447,
-0.04498962312936783,
0.009349389933049679,
0.01697126217186451,
0.011608824133872986,
0.11762626469135284,
-0.006614529062062502,
0.0251119676977396,
-0.040333617478609085,
0.19879956543445587,
-0.05397966504096985,
0.01629786379635334,
0.14464278519153595,
-0.014573374763131142,
0.062376365065574646,
0.09398884326219559,
0.03143047168850899,
-0.10507854074239731,
0.02867821417748928,
0.07740958034992218,
-0.026163604110479355,
-0.26693060994148254,
-0.027240488678216934,
-0.036712922155857086,
-0.045269954949617386,
0.07576056569814682,
0.04576157405972481,
0.13066466152668,
0.0030151763930916786,
0.00017039287195075303,
0.09990301728248596,
0.05266551673412323,
0.07237514108419418,
0.171352818608284,
0.06487994641065598,
0.09921212494373322,
-0.04161186143755913,
0.037505682557821274,
0.06017972156405449,
-0.012535234913229942,
0.200514554977417,
0.011414683423936367,
0.12217716127634048,
0.06676719337701797,
0.050765108317136765,
-0.011426152661442757,
0.010262941010296345,
-0.0028659214731305838,
0.0059800222516059875,
0.015295716933906078,
-0.06423153728246689,
-0.039102911949157715,
0.04334978386759758,
-0.03883591294288635,
0.024442914873361588,
-0.10195723921060562,
0.07908844202756882,
0.039639268070459366,
0.23834796249866486,
0.10031578689813614,
-0.3229031562805176,
-0.10663607716560364,
0.012373184785246849,
-0.016127940267324448,
-0.0499747209250927,
0.004874206148087978,
0.0817238911986351,
-0.04322328418493271,
0.08802884072065353,
-0.05606462433934212,
0.06520801782608032,
-0.07234926521778107,
0.03066609613597393,
0.10154309123754501,
0.10607267171144485,
0.015005056746304035,
-0.0022926130332052708,
-0.3133943974971771,
0.27236396074295044,
0.022582121193408966,
0.11865171045064926,
-0.06047828122973442,
0.06039278581738472,
0.032565612345933914,
-0.01841316744685173,
0.07369236648082733,
-0.010740963742136955,
-0.19598819315433502,
-0.14783428609371185,
-0.07523909956216812,
-0.017954237759113312,
0.11176042258739471,
-0.0031989894341677427,
0.09523355960845947,
-0.04268278181552887,
-0.004017178900539875,
0.054643888026475906,
-0.03397021070122719,
-0.1557694524526596,
-0.059332214295864105,
0.05722108483314514,
0.05183552950620651,
-0.028165286406874657,
-0.06896816194057465,
-0.06727669388055801,
-0.07165558636188507,
0.2086050808429718,
-0.14884838461875916,
-0.05005602538585663,
-0.1418953239917755,
0.09051966667175293,
0.10109097510576248,
-0.07078945636749268,
0.036705534905195236,
0.005371619947254658,
0.0852803960442543,
0.044815123081207275,
-0.09205925464630127,
0.11763729900121689,
-0.043960314244031906,
-0.20099176466464996,
-0.06732937693595886,
0.10628069192171097,
0.019427770748734474,
0.03440723195672035,
-0.005827679764479399,
0.07321355491876602,
0.05096115544438362,
-0.09150854498147964,
0.10519082844257355,
0.04136456176638603,
0.044510193169116974,
0.02246016263961792,
-0.00857224129140377,
-0.04963412135839462,
-0.032075174152851105,
-0.008695038966834545,
0.09183130413293839,
0.27032092213630676,
-0.0846281424164772,
0.03321124240756035,
0.0363270565867424,
-0.09625310450792313,
-0.20445753633975983,
0.04199709743261337,
0.1049354299902916,
0.01027743611484766,
-0.06441128253936768,
-0.19126178324222565,
0.047284889966249466,
0.11044708639383316,
-0.009116897359490395,
0.07208514958620071,
-0.3131898045539856,
-0.1542816460132599,
0.06714170426130295,
0.0942884236574173,
0.10269541293382645,
-0.18767113983631134,
-0.07129326462745667,
-0.05816785246133804,
-0.05638333782553673,
0.09951988607645035,
-0.0766340047121048,
0.07908760011196136,
0.014749427326023579,
0.002301366301253438,
0.008105491288006306,
-0.04502592235803604,
0.13087935745716095,
-0.027526456862688065,
0.06617772579193115,
-0.05535831302404404,
-0.005610432010143995,
0.08855441957712173,
-0.10210846364498138,
0.047024890780448914,
-0.049029555171728134,
0.03898301348090172,
-0.14479123055934906,
0.0026928868610411882,
-0.05978982523083687,
0.045454323291778564,
-0.05645367130637169,
-0.00433943560346961,
-0.008049927651882172,
0.042574331164360046,
0.08489073067903519,
-0.021199768409132957,
0.12876571714878082,
-0.00784276332706213,
0.15438413619995117,
0.1638113409280777,
0.0688716471195221,
0.01660279557108879,
-0.04091461002826691,
0.06843181699514389,
-0.025374634191393852,
0.055872246623039246,
-0.20059028267860413,
0.04242147505283356,
0.13277585804462433,
-0.0052829040214419365,
0.1222841814160347,
0.04983530193567276,
-0.06757767498493195,
0.0313597247004509,
0.07883022725582123,
-0.12080411612987518,
-0.07401654869318008,
0.013330716639757156,
0.003430937882512808,
-0.05554170906543732,
0.029349733144044876,
0.16475586593151093,
-0.03031594678759575,
0.0358281247317791,
0.009460331872105598,
0.0461081899702549,
-0.048794038593769073,
0.16623996198177338,
-0.010548423044383526,
0.07903583347797394,
-0.0756465494632721,
0.1356268674135208,
0.06408799439668655,
-0.1288665235042572,
0.11171717196702957,
0.08014658093452454,
-0.063880555331707,
-0.01919451728463173,
0.021122539415955544,
0.08106326311826706,
0.01694418676197529,
-0.05533324182033539,
-0.11601178348064423,
-0.13877423107624054,
0.10969407111406326,
0.18611611425876617,
0.02337620221078396,
0.0646229162812233,
-0.028315911069512367,
-0.0138569800183177,
-0.08366106450557709,
0.06964036077260971,
0.06061052158474922,
0.04143695533275604,
-0.12490122765302658,
0.1470753401517868,
0.0037367052864283323,
-0.02305418811738491,
-0.008820232935249805,
0.008782091550529003,
-0.16055026650428772,
-0.003204380627721548,
-0.15478642284870148,
0.03136331960558891,
0.0002801635710056871,
-0.020416835322976112,
0.011538702063262463,
-0.043569423258304596,
-0.06672105193138123,
0.050098519772291183,
-0.08012206107378006,
-0.054551757872104645,
0.011157873086631298,
0.05859747529029846,
-0.12334698438644409,
-0.05852259695529938,
-0.002295244485139847,
-0.10631424933671951,
0.05777470022439957,
0.02381936088204384,
0.010777255520224571,
0.01745958998799324,
-0.04740027338266373,
0.027122123166918755,
0.0517423041164875,
-0.0060633765533566475,
0.032106030732393265,
-0.16804854571819305,
0.01493150182068348,
-0.026349611580371857,
0.012464483268558979,
0.025387315079569817,
0.06344606727361679,
-0.085213802754879,
-0.02854638174176216,
-0.02019416354596615,
-0.008307048119604588,
-0.04445721581578255,
0.05251602083444595,
0.152543306350708,
-0.02285839058458805,
0.1527932733297348,
-0.11613915115594864,
0.022771937772631645,
-0.1788070797920227,
0.008292468264698982,
0.015848590061068535,
-0.08007629215717316,
-0.0957476794719696,
0.012766897678375244,
0.11614992469549179,
-0.10052982717752457,
0.07619959861040115,
-0.04306964576244354,
0.07913719862699509,
0.060966480523347855,
-0.09269750118255615,
-0.08459359407424927,
0.07504980266094208,
0.1458493322134018,
0.060046352446079254,
-0.01454833708703518,
0.03900936618447304,
-0.013256208039820194,
0.08254780620336533,
0.07990660518407822,
0.20734170079231262,
0.12283036857843399,
0.07578175514936447,
0.1224779561161995,
0.0488680861890316,
-0.09258145093917847,
-0.12927024066448212,
0.131631538271904,
-0.10947196185588837,
0.16382044553756714,
-0.04055815935134888,
0.0746517926454544,
0.08771146088838577,
-0.1781601905822754,
0.031300682574510574,
-0.0726795569062233,
-0.08191926777362823,
-0.1235179677605629,
-0.12644438445568085,
-0.08622024953365326,
-0.0930001437664032,
-0.006211564876139164,
-0.11891255527734756,
0.05806475877761841,
0.05060340091586113,
0.04375788941979408,
0.014288359321653843,
0.06714270263910294,
-0.03319840878248215,
0.013089356012642384,
0.11219517141580582,
0.005939011462032795,
-0.008614430204033852,
-0.013987543061375618,
-0.06839924305677414,
0.037706293165683746,
0.002680813428014517,
0.042423348873853683,
0.022995544597506523,
-0.003731535980477929,
0.07178939878940582,
-0.002872254466637969,
-0.09737890958786011,
0.04739003628492355,
0.0296992938965559,
0.020257072523236275,
0.08263733983039856,
0.05052100867033005,
-0.020106801763176918,
-0.019815605133771896,
0.13914158940315247,
-0.08338260650634766,
-0.029861759394407272,
-0.15242816507816315,
0.2390826940536499,
0.018263675272464752,
0.03136206045746803,
0.007777099031955004,
-0.06749842315912247,
-0.035416871309280396,
0.16300873458385468,
0.13269004225730896,
-0.011629125103354454,
-0.028979765251278877,
0.06285069137811661,
-0.005674708168953657,
-0.02753320336341858,
0.139500692486763,
0.08526364713907242,
0.024671318009495735,
-0.038536738604307175,
-0.06310725212097168,
-0.012061052955687046,
-0.03021971881389618,
-0.05828149989247322,
0.0753382220864296,
-0.0007086287951096892,
-0.010686998255550861,
0.004947783891111612,
0.06522104889154434,
-0.0500134602189064,
-0.12798503041267395,
0.07260781526565552,
-0.20690764486789703,
-0.16006359457969666,
-0.011439310386776924,
0.01725275069475174,
-0.0022932521533221006,
0.0542977936565876,
0.012498221360147,
-0.017651965841650963,
0.1174001693725586,
-0.044603653252124786,
-0.025513848289847374,
-0.09955199062824249,
0.0265332218259573,
-0.03014340251684189,
0.1812134087085724,
-0.014814630150794983,
0.03642599657177925,
0.14040084183216095,
0.02117430604994297,
-0.11709024012088776,
0.03623891621828079,
0.07261097431182861,
-0.10607552528381348,
0.0178049448877573,
0.10653407871723175,
-0.031689077615737915,
0.14287777245044708,
0.07510445266962051,
-0.06902603060007095,
0.010546749457716942,
-0.0478215366601944,
-0.06699507683515549,
-0.027500474825501442,
-0.038220878690481186,
-0.06614727526903152,
0.12804076075553894,
0.22541889548301697,
-0.034054920077323914,
0.0007134823827072978,
-0.032209739089012146,
0.013503995724022388,
0.027250271290540695,
0.04071163386106491,
-0.05191739648580551,
-0.22113047540187836,
0.07917957007884979,
0.03628057241439819,
0.053326547145843506,
-0.156111478805542,
-0.08329281210899353,
0.018575813621282578,
-0.010050449520349503,
-0.10554134845733643,
0.11484243720769882,
0.055885326117277145,
0.04377276077866554,
-0.051301635801792145,
-0.1470416784286499,
-0.021384360268712044,
0.17115817964076996,
-0.1124689131975174,
-0.08207988739013672
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Rocketknight1/model-card-callback-test-new
This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on an unknown dataset.
It achieves the following results during training and on the evaluation set:
- Train Loss: 0.0031
- Train Accuracy: 1.0
- Validation Loss: 0.0000
- Validation Accuracy: 1.0
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
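
No intended uses are documented yet. As a rough starting point, the snippet below sketches how this checkpoint could be loaded for TensorFlow inference; the checkpoint name is taken from this card's header, and the input sentence and label handling are illustrative assumptions, not part of the original setup.

```python
# Minimal, untested sketch: load the checkpoint and classify one sentence.
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

checkpoint = "Rocketknight1/model-card-callback-test-new"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = TFAutoModelForSequenceClassification.from_pretrained(checkpoint)

inputs = tokenizer("An example sentence to classify.", return_tensors="tf")
logits = model(**inputs).logits
predicted_class = int(tf.math.argmax(logits, axis=-1)[0])
print(predicted_class)  # index into the model's label set
```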
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a Keras reconstruction of the optimizer follows the list):
- optimizer: {'name': 'Adam', 'learning_rate': 0.001, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
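
For reference, the serialized optimizer dictionary above corresponds to a stock Keras Adam instance. A hedged reconstruction, since the original training script is not public:

```python
# Sketch: rebuild the Adam optimizer described by the config dict above.
import tensorflow as tf

optimizer = tf.keras.optimizers.Adam(
    learning_rate=0.001,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
# 'decay': 0.0 in the serialized config means no legacy learning-rate decay was applied.
```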
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.4647 | 0.6406 | 0.0057 | 1.0 | 0 |
| 0.0031 | 1.0 | 0.0000 | 1.0 | 1 |
### Framework versions
- Transformers 4.14.0.dev0
- TensorFlow 2.6.0
- Datasets 1.16.2.dev0
- Tokenizers 0.10.3
| {"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "Rocketknight1/model-card-callback-test-new", "results": []}]} | text-classification | Rocketknight1/model-card-callback-test-new | [
"transformers",
"tf",
"distilbert",
"text-classification",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #distilbert #text-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| Rocketknight1/model-card-callback-test-new
==========================================
This model is a fine-tuned version of distilbert-base-cased on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 0.0031
* Train Accuracy: 1.0
* Validation Loss: 0.0000
* Validation Accuracy: 1.0
* Epoch: 1
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'Adam', 'learning\_rate': 0.001, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.14.0.dev0
* TensorFlow 2.6.0
* Datasets 1.16.2.dev0
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': 0.001, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.14.0.dev0\n* TensorFlow 2.6.0\n* Datasets 1.16.2.dev0\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #tf #distilbert #text-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': 0.001, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.14.0.dev0\n* TensorFlow 2.6.0\n* Datasets 1.16.2.dev0\n* Tokenizers 0.10.3"
] | [
56,
98,
4,
38
] | [
"passage: TAGS\n#transformers #tf #distilbert #text-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': 0.001, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.14.0.dev0\n* TensorFlow 2.6.0\n* Datasets 1.16.2.dev0\n* Tokenizers 0.10.3"
] | [
-0.05712924897670746,
0.04366665706038475,
-0.0029171172063797712,
0.08681166172027588,
0.17227046191692352,
0.02706579677760601,
0.11626747250556946,
0.11798474937677383,
-0.1500394642353058,
0.02997574210166931,
0.14649225771427155,
0.2012167125940323,
0.026691162958741188,
0.11326029151678085,
-0.11794641613960266,
-0.16555781662464142,
0.05551532283425331,
0.0015271671582013369,
-0.05595577135682106,
0.09039784222841263,
0.10665587335824966,
-0.08643574267625809,
0.10915423184633255,
-0.0023350634146481752,
-0.21033547818660736,
0.041972752660512924,
0.10812661051750183,
-0.09859535098075867,
0.12602673470973969,
0.08725077658891678,
0.08779434114694595,
-0.009464295580983162,
0.020940499380230904,
-0.15406574308872223,
0.01537960022687912,
0.10715257376432419,
-0.018155084922909737,
0.060458313673734665,
0.00956987775862217,
-0.015521081164479256,
0.14286041259765625,
-0.0874440148472786,
0.03497286140918732,
0.041556019335985184,
-0.12410850822925568,
-0.24992285668849945,
-0.1275438517332077,
-0.015190394595265388,
0.050461623817682266,
0.09505268186330795,
0.008987396024167538,
0.20905113220214844,
-0.04244212806224823,
0.11254490911960602,
0.14945949614048004,
-0.3285221457481384,
-0.05138081684708595,
0.033744536340236664,
-0.007060984615236521,
0.06334203481674194,
-0.03699769824743271,
0.03937511891126633,
0.06646940112113953,
0.052752114832401276,
0.07307673990726471,
-0.0401824489235878,
-0.15306182205677032,
-0.0052368189208209515,
-0.0968114584684372,
-0.013037048280239105,
0.16426993906497955,
0.013249899260699749,
-0.06478044390678406,
-0.010011466220021248,
-0.04842854663729668,
-0.10974095016717911,
0.0037017660215497017,
-0.05200151354074478,
0.0345037505030632,
0.012508025392889977,
-0.041112277656793594,
-0.04369181767106056,
-0.0775080993771553,
-0.04147547483444214,
-0.1273978054523468,
0.1454460620880127,
0.01204188633710146,
0.0650433823466301,
-0.061828043311834335,
0.04671635851264,
-0.06728459149599075,
-0.10757498443126678,
-0.0154681745916605,
-0.008886158466339111,
-0.010803778655827045,
-0.05257473886013031,
-0.12160087376832962,
-0.13390131294727325,
0.055051617324352264,
0.11406542360782623,
-0.05649648606777191,
0.08120560646057129,
-0.08103013038635254,
0.02500029280781746,
-0.11420322954654694,
0.15713529288768768,
-0.00522273126989603,
0.009704983793199062,
0.05705922842025757,
-0.022360116243362427,
0.058670539408922195,
-0.04058725759387016,
-0.10309331864118576,
-0.0002942944993264973,
0.08296390622854233,
0.013101478107273579,
-0.07143484801054001,
0.11370298266410828,
-0.054384294897317886,
-0.007457106374204159,
-0.03901705518364906,
-0.09414660930633545,
0.02978299930691719,
-0.024320730939507484,
-0.08419075608253479,
0.008320351131260395,
0.0822131335735321,
0.012659228406846523,
-0.051506128162145615,
0.03639071807265282,
-0.05953415483236313,
-0.009903362952172756,
-0.09807205200195312,
-0.13465943932533264,
0.02843901887536049,
-0.09044788777828217,
0.007615169510245323,
-0.11362776905298233,
-0.19967815279960632,
-0.008752609603106976,
0.06761760264635086,
-0.025661034509539604,
-0.006564737763255835,
-0.06144371256232262,
-0.14535681903362274,
0.035373229533433914,
-0.021746734157204628,
0.12961357831954956,
-0.06037279590964317,
0.06300882995128632,
0.00361413205973804,
0.06166383996605873,
-0.16365693509578705,
0.045582711696624756,
-0.051309678703546524,
-0.009559420868754387,
-0.18903441727161407,
0.06523438543081284,
-0.046395838260650635,
0.057020701467990875,
-0.12134255468845367,
-0.07023290544748306,
0.045252785086631775,
0.01881912350654602,
0.10157163441181183,
0.0744721069931984,
-0.19266310334205627,
-0.036021631211042404,
0.10629066079854965,
-0.07708939164876938,
-0.11690632253885269,
0.09522898495197296,
-0.07103953510522842,
0.05528302118182182,
0.10535795241594315,
0.11141009628772736,
-0.00046462754835374653,
-0.10036872327327728,
0.04840302839875221,
-0.025560755282640457,
-0.03789607435464859,
-0.026019230484962463,
0.0014921781839802861,
0.0004433405410964042,
-0.07769840955734253,
0.03014214336872101,
-0.013058144599199295,
0.029391592368483543,
-0.06554548442363739,
-0.06782453507184982,
-0.05498624965548515,
-0.07342808693647385,
0.0368327833712101,
0.02140077017247677,
0.0785767138004303,
-0.11989517509937286,
-0.09811016917228699,
0.07911620289087296,
0.02112201415002346,
-0.028070278465747833,
0.04611147195100784,
-0.0960729643702507,
0.02640148065984249,
0.025457721203565598,
0.018914196640253067,
-0.19289955496788025,
-0.034272804856300354,
0.005291254725307226,
0.0839407667517662,
0.038261860609054565,
-0.0015487938653677702,
0.06499393284320831,
0.002284154761582613,
-0.05486709251999855,
0.04640131816267967,
0.022475367411971092,
0.03064226545393467,
-0.09541501104831696,
-0.21937350928783417,
0.030335519462823868,
-0.023789482191205025,
0.07367914170026779,
-0.23054105043411255,
0.01645900309085846,
0.042556725442409515,
0.09658921509981155,
0.028758566826581955,
0.018987176939845085,
-0.06296383589506149,
0.0648394301533699,
-0.046272922307252884,
-0.057417143136262894,
0.04441866651177406,
0.04156069457530975,
-0.11789202690124512,
0.00955503061413765,
-0.13610483705997467,
0.12714456021785736,
0.16815796494483948,
-0.13855485618114471,
-0.10851059854030609,
0.06500277668237686,
-0.00752095365896821,
-0.009577667340636253,
0.0009809141047298908,
0.022129150107502937,
0.1736897975206375,
-0.01457470003515482,
0.14570245146751404,
-0.06052993983030319,
-0.02752751111984253,
0.03500663861632347,
-0.036660585552453995,
-0.008861520327627659,
0.07622697949409485,
0.002517002634704113,
-0.15466515719890594,
0.1064402237534523,
0.14757299423217773,
-0.11735325306653976,
0.08738981932401657,
-0.04307540878653526,
-0.03334486857056618,
-0.036148685961961746,
0.0011046885047107935,
0.04133570194244385,
0.06592172384262085,
-0.09248503297567368,
0.0035935966297984123,
0.0207360852509737,
0.03552967682480812,
-0.0072801848873496056,
-0.19833119213581085,
-0.00713457353413105,
0.0026866060215979815,
-0.03741134703159332,
-0.01168380118906498,
0.0188528411090374,
0.02147207036614418,
0.1348329335451126,
0.0224078930914402,
-0.04898146539926529,
0.10277261584997177,
-0.002671054797247052,
-0.08826475590467453,
0.20947642624378204,
-0.1681947410106659,
-0.1265135109424591,
-0.11026177555322647,
-0.10382810980081558,
-0.07766234874725342,
0.022583724930882454,
0.03517487645149231,
-0.10259468108415604,
-0.06873046606779099,
-0.06238168478012085,
-0.01522785983979702,
-0.018635256215929985,
0.05446162447333336,
0.04392092302441597,
-0.01121001411229372,
0.10908844321966171,
-0.10462036728858948,
-0.0475817434489727,
-0.03200561925768852,
-0.05278770998120308,
0.0556148998439312,
-0.007362678647041321,
0.045659806579351425,
0.1104247197508812,
-0.04790922999382019,
0.022131651639938354,
-0.054999906569719315,
0.2320437729358673,
-0.042534369975328445,
-0.020758388563990593,
0.15333245694637299,
-0.03403855487704277,
0.03894615173339844,
0.10135575383901596,
0.025498192757368088,
-0.1388639509677887,
0.05682849884033203,
0.04509511590003967,
-0.04623213782906532,
-0.24339213967323303,
-0.032357245683670044,
-0.04526607319712639,
-0.10851564258337021,
0.016286581754684448,
0.03147989884018898,
0.14993104338645935,
0.030117208138108253,
0.05404139682650566,
0.13927192986011505,
0.008634707890450954,
0.062035687267780304,
0.21297599375247955,
0.05540072172880173,
0.10972293466329575,
-0.060217875987291336,
-0.004153605550527573,
0.07211893051862717,
-0.03470243886113167,
0.20321550965309143,
0.027748117223381996,
0.029510950669646263,
0.07129348814487457,
0.08216436952352524,
-0.020331639796495438,
0.019489765167236328,
0.017895309254527092,
-0.030919726938009262,
-0.0202015433460474,
-0.05246745049953461,
-0.05969397351145744,
0.0517733059823513,
-0.11823005229234695,
0.05248649790883064,
-0.08586478233337402,
0.03474608436226845,
0.06920969486236572,
0.2577660083770752,
0.03504679724574089,
-0.3424689471721649,
-0.10669898241758347,
0.008510411716997623,
-0.01644810661673546,
-0.0374557264149189,
-0.00027257204055786133,
0.0715852603316307,
-0.0648169070482254,
0.11671452224254608,
-0.05963306128978729,
0.07107164710760117,
0.018376382067799568,
0.0667826309800148,
0.07488624006509781,
0.10879620909690857,
0.004695057403296232,
0.01981295272707939,
-0.37436404824256897,
0.2527911961078644,
0.03697524219751358,
0.13797195255756378,
-0.09858790040016174,
0.016639292240142822,
0.04223001375794411,
0.06292962282896042,
0.06850265711545944,
-0.01631617732346058,
-0.15394112467765808,
-0.12723815441131592,
0.0004387192311696708,
0.028462747111916542,
0.13016833364963531,
0.09835049510002136,
0.08687729388475418,
-0.04286975413560867,
0.027839776128530502,
0.09432481229305267,
0.010408032685518265,
-0.12861356139183044,
-0.059222396463155746,
0.0035002322401851416,
0.07921497523784637,
-0.05243242532014847,
-0.0518329031765461,
-0.07425545901060104,
-0.09736212342977524,
0.19009153544902802,
-0.06828558444976807,
-0.03374006599187851,
-0.13208834826946259,
0.09264785051345825,
0.039528992027044296,
-0.04893381893634796,
0.047926001250743866,
-0.003871275344863534,
0.04968493804335594,
0.060622818768024445,
-0.1422816514968872,
0.14239159226417542,
-0.03470608592033386,
-0.16016389429569244,
-0.05544430762529373,
0.04096456989645958,
0.028502237051725388,
0.04551165550947189,
0.007608095649629831,
0.05511178821325302,
0.018243372440338135,
-0.09429188072681427,
0.06115633249282837,
0.03499320149421692,
0.0623500719666481,
0.019644448533654213,
-0.035858944058418274,
-0.038462184369564056,
-0.03895668685436249,
-0.009439258836209774,
0.16284501552581787,
0.23756855726242065,
-0.09099563211202621,
0.037050507962703705,
-0.012708720751106739,
-0.09053747355937958,
-0.236450657248497,
0.12098988890647888,
0.05353172495961189,
0.007565109524875879,
-0.02897159941494465,
-0.14570742845535278,
0.10360489785671234,
0.08366593718528748,
-0.005201784428209066,
0.0958467349410057,
-0.2589166462421417,
-0.1520106941461563,
0.09811677038669586,
0.13770096004009247,
0.2143089473247528,
-0.14757326245307922,
-0.025306109338998795,
-0.08741756528615952,
-0.07030881196260452,
0.16835951805114746,
-0.1697363257408142,
0.09370561689138412,
0.018042057752609253,
0.07603776454925537,
-0.005355945322662592,
-0.023428475484251976,
0.09679584950208664,
-0.031587887555360794,
0.12714140117168427,
-0.08756665140390396,
-0.01648486591875553,
0.11694826930761337,
-0.05076230689883232,
0.025576600804924965,
-0.05423448979854584,
0.036373838782310486,
-0.06152353435754776,
0.0051383585669100285,
-0.07501702755689621,
0.04918226972222328,
-0.02830987051129341,
-0.03904932737350464,
-0.03200814127922058,
0.028455141931772232,
0.07755842804908752,
-0.05175383388996124,
0.14754965901374817,
-0.012636956758797169,
0.16455866396427155,
0.15554074943065643,
0.10335472971200943,
-0.06347049027681351,
0.08630069345235825,
0.0746045708656311,
-0.0409797765314579,
0.08112380653619766,
-0.18032968044281006,
0.051554303616285324,
0.1173095628619194,
-0.02042061649262905,
0.12951424717903137,
0.07940782606601715,
-0.03543766960501671,
0.028773710131645203,
0.06552322953939438,
-0.15637154877185822,
-0.09058656543493271,
0.03383687138557434,
-0.003518134355545044,
-0.044581081718206406,
0.07578287273645401,
0.15325897932052612,
-0.042609803378582,
0.024122698232531548,
0.0018147293012589216,
0.006544931326061487,
-0.08764192461967468,
0.1282888799905777,
0.02042574994266033,
0.0057142809964716434,
-0.09734170138835907,
0.1348412185907364,
0.031217137351632118,
-0.07950017601251602,
0.09584414958953857,
0.022875037044286728,
-0.09049757570028305,
-0.02414250187575817,
0.09338458627462387,
0.14226195216178894,
-0.04000947251915932,
-0.07237754017114639,
-0.11464806646108627,
-0.1686306893825531,
0.07440988719463348,
0.26396262645721436,
0.07387934625148773,
0.03960774093866348,
-0.047509659081697464,
-0.014592031016945839,
-0.08719233423471451,
0.034062039107084274,
0.025173401460051537,
0.04856479540467262,
-0.13513816893100739,
0.16796262562274933,
-0.02645363286137581,
0.02295040898025036,
-0.04722427949309349,
0.033302124589681625,
-0.14763455092906952,
0.010401605628430843,
-0.1999538540840149,
0.004888080060482025,
0.009536659345030785,
0.005840391851961613,
0.029896574094891548,
-0.06352832168340683,
-0.09271599352359772,
0.04924451559782028,
-0.10439129918813705,
-0.02074415422976017,
0.05555294081568718,
0.03727024048566818,
-0.11149876564741135,
-0.08230900019407272,
-0.0019213494379073381,
-0.060699462890625,
0.044796332716941833,
0.07949884235858917,
-0.03071117401123047,
0.08267957717180252,
-0.14937898516654968,
-0.014509634114801884,
0.09418601542711258,
0.012162989005446434,
0.08817937970161438,
-0.09429256618022919,
-0.009343093261122704,
0.033604856580495834,
0.06005549803376198,
0.041134659200906754,
0.13030415773391724,
-0.07913687080144882,
-0.04134576395153999,
-0.020707860589027405,
-0.04147500544786453,
-0.05316254124045372,
0.04315228760242462,
0.1544552445411682,
0.009763340465724468,
0.2020968496799469,
-0.11688463389873505,
-0.022852156311273575,
-0.14380128681659698,
0.009090391919016838,
-0.005221435334533453,
-0.13425365090370178,
-0.1250625103712082,
-0.02765246108174324,
0.08779508620500565,
-0.07169690728187561,
0.13121633231639862,
-0.008668278343975544,
0.07906289398670197,
0.05919446796178818,
-0.03519917279481888,
-0.07472728192806244,
0.044511038810014725,
0.20098638534545898,
0.03654927387833595,
-0.025027893483638763,
0.03212912008166313,
0.016046153381466866,
0.10442148149013519,
0.10153073817491531,
0.2480224221944809,
0.13556113839149475,
0.003602967131882906,
0.14636041224002838,
0.03979792445898056,
-0.03126221150159836,
-0.07744431495666504,
0.09468121826648712,
-0.09052885323762894,
0.15111836791038513,
-0.05112358182668686,
0.08875297009944916,
0.05817318335175514,
-0.15866175293922424,
0.017917726188898087,
-0.09911859035491943,
-0.08249031752347946,
-0.1428937464952469,
-0.07178180664777756,
-0.1052015870809555,
-0.12200628966093063,
-0.0022811968810856342,
-0.103099025785923,
0.052678342908620834,
0.04310282692313194,
0.024569405242800713,
-0.03176270052790642,
0.10278324037790298,
-0.07366166263818741,
-0.0012023322051391006,
0.09023439139127731,
-0.027908440679311752,
-0.03373122587800026,
-0.06234224885702133,
-0.07476557791233063,
0.03139175847172737,
-0.0010986927663907409,
0.027144793421030045,
0.002742151962593198,
-0.0008767132530920208,
0.03781844303011894,
-0.06397993117570877,
-0.08328220993280411,
0.049755848944187164,
0.05937417224049568,
0.017871756106615067,
0.040283046662807465,
0.05271482840180397,
-0.00516148004680872,
0.0015014033997431397,
0.1556469053030014,
-0.09647597372531891,
-0.06556647270917892,
-0.14934170246124268,
0.29385095834732056,
0.01995115354657173,
0.03735066577792168,
0.008345618844032288,
-0.06008106842637062,
-0.043911099433898926,
0.21433395147323608,
0.17027583718299866,
-0.101059190928936,
-0.01772395707666874,
0.009979638271033764,
-0.0015786237781867385,
-0.05387342348694801,
0.1583545207977295,
0.10184863209724426,
-0.04715196415781975,
-0.05069530010223389,
-0.050978194922208786,
-0.030215565115213394,
0.009590022265911102,
-0.03457712009549141,
0.06124608591198921,
0.01618793234229088,
-0.026851192116737366,
-0.00204785680398345,
0.052209705114364624,
-0.08089835941791534,
-0.09820621460676193,
0.05199397727847099,
-0.1849180907011032,
-0.14867541193962097,
0.005533110816031694,
0.0029788112733513117,
-0.014866869896650314,
0.06622970104217529,
-0.03150591626763344,
0.007603638805449009,
0.08870550245046616,
-0.04845622554421425,
-0.03520938381552696,
-0.07860670238733292,
0.09097246080636978,
-0.10639496147632599,
0.16498985886573792,
-0.02282550558447838,
0.06294845044612885,
0.13114526867866516,
0.05999849736690521,
-0.07620491087436676,
0.06806886196136475,
0.027520671486854553,
-0.0771968737244606,
0.008467786945402622,
0.04489666223526001,
-0.04249498248100281,
0.09905703365802765,
0.06496763974428177,
-0.07366351783275604,
0.04882051423192024,
-0.09316085278987885,
-0.1042848452925682,
-0.03330478444695473,
-0.05141807720065117,
-0.10114922374486923,
0.11075421422719955,
0.2235797494649887,
-0.026381706818938255,
0.04874757304787636,
-0.06185594201087952,
-0.005701693240553141,
0.05831624194979668,
-0.021343130618333817,
-0.08256354182958603,
-0.20779716968536377,
0.04861106351017952,
0.12252165377140045,
0.007945844903588295,
-0.18581224977970123,
-0.0717785581946373,
-0.019217325374484062,
-0.03098764643073082,
-0.09496982395648956,
0.09156087785959244,
0.07984750717878342,
0.0370275042951107,
-0.06442878395318985,
-0.13369886577129364,
-0.040713462978601456,
0.15681777894496918,
-0.07799328118562698,
-0.09029624611139297
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# model_card_test2
This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on an unknown dataset.
It achieves the following results during training and on the evaluation set:
- Train Loss: 0.0031
- Train Accuracy: 1.0
- Validation Loss: 0.0000
- Validation Accuracy: 1.0
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the full Keras setup follows the list):
- optimizer: {'name': 'Adam', 'learning_rate': 0.001, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
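
This card was written by a Keras callback, per the `generated_from_keras_callback` tag. The sketch below shows one plausible way such a run is wired up with `transformers.keras_callbacks.PushToHubCallback`; the toy dataset, label values, and output directory are assumptions, since the original script is not available.

```python
# Sketch: fine-tune with Keras and let PushToHubCallback emit a card like this one.
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification
from transformers.keras_callbacks import PushToHubCallback

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-cased")
model = TFAutoModelForSequenceClassification.from_pretrained("distilbert-base-cased")

# Toy two-example dataset purely for illustration.
encoded = tokenizer(["a positive example", "a negative example"],
                    padding=True, return_tensors="np")
train_dataset = tf.data.Dataset.from_tensor_slices((dict(encoded), [1, 0])).batch(2)

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
callback = PushToHubCallback(output_dir="model_card_test2", tokenizer=tokenizer)
model.fit(train_dataset, validation_data=train_dataset, epochs=2, callbacks=[callback])
```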
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.4647 | 0.6406 | 0.0057 | 1.0 | 0 |
| 0.0031 | 1.0 | 0.0000 | 1.0 | 1 |
### Framework versions
- Transformers 4.14.0.dev0
- TensorFlow 2.6.0
- Datasets 1.16.2.dev0
- Tokenizers 0.10.3
| {"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "model_card_test2", "results": []}]} | text-classification | Rocketknight1/model_card_test2 | [
"transformers",
"tf",
"distilbert",
"text-classification",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #distilbert #text-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| model\_card\_test2
==================
This model is a fine-tuned version of distilbert-base-cased on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 0.0031
* Train Accuracy: 1.0
* Validation Loss: 0.0000
* Validation Accuracy: 1.0
* Epoch: 1
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'Adam', 'learning\_rate': 0.001, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.14.0.dev0
* TensorFlow 2.6.0
* Datasets 1.16.2.dev0
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': 0.001, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.14.0.dev0\n* TensorFlow 2.6.0\n* Datasets 1.16.2.dev0\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #tf #distilbert #text-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': 0.001, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.14.0.dev0\n* TensorFlow 2.6.0\n* Datasets 1.16.2.dev0\n* Tokenizers 0.10.3"
] | [
56,
98,
4,
38
] | [
"passage: TAGS\n#transformers #tf #distilbert #text-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': 0.001, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.14.0.dev0\n* TensorFlow 2.6.0\n* Datasets 1.16.2.dev0\n* Tokenizers 0.10.3"
] | [
-0.05712924897670746,
0.04366665706038475,
-0.0029171172063797712,
0.08681166172027588,
0.17227046191692352,
0.02706579677760601,
0.11626747250556946,
0.11798474937677383,
-0.1500394642353058,
0.02997574210166931,
0.14649225771427155,
0.2012167125940323,
0.026691162958741188,
0.11326029151678085,
-0.11794641613960266,
-0.16555781662464142,
0.05551532283425331,
0.0015271671582013369,
-0.05595577135682106,
0.09039784222841263,
0.10665587335824966,
-0.08643574267625809,
0.10915423184633255,
-0.0023350634146481752,
-0.21033547818660736,
0.041972752660512924,
0.10812661051750183,
-0.09859535098075867,
0.12602673470973969,
0.08725077658891678,
0.08779434114694595,
-0.009464295580983162,
0.020940499380230904,
-0.15406574308872223,
0.01537960022687912,
0.10715257376432419,
-0.018155084922909737,
0.060458313673734665,
0.00956987775862217,
-0.015521081164479256,
0.14286041259765625,
-0.0874440148472786,
0.03497286140918732,
0.041556019335985184,
-0.12410850822925568,
-0.24992285668849945,
-0.1275438517332077,
-0.015190394595265388,
0.050461623817682266,
0.09505268186330795,
0.008987396024167538,
0.20905113220214844,
-0.04244212806224823,
0.11254490911960602,
0.14945949614048004,
-0.3285221457481384,
-0.05138081684708595,
0.033744536340236664,
-0.007060984615236521,
0.06334203481674194,
-0.03699769824743271,
0.03937511891126633,
0.06646940112113953,
0.052752114832401276,
0.07307673990726471,
-0.0401824489235878,
-0.15306182205677032,
-0.0052368189208209515,
-0.0968114584684372,
-0.013037048280239105,
0.16426993906497955,
0.013249899260699749,
-0.06478044390678406,
-0.010011466220021248,
-0.04842854663729668,
-0.10974095016717911,
0.0037017660215497017,
-0.05200151354074478,
0.0345037505030632,
0.012508025392889977,
-0.041112277656793594,
-0.04369181767106056,
-0.0775080993771553,
-0.04147547483444214,
-0.1273978054523468,
0.1454460620880127,
0.01204188633710146,
0.0650433823466301,
-0.061828043311834335,
0.04671635851264,
-0.06728459149599075,
-0.10757498443126678,
-0.0154681745916605,
-0.008886158466339111,
-0.010803778655827045,
-0.05257473886013031,
-0.12160087376832962,
-0.13390131294727325,
0.055051617324352264,
0.11406542360782623,
-0.05649648606777191,
0.08120560646057129,
-0.08103013038635254,
0.02500029280781746,
-0.11420322954654694,
0.15713529288768768,
-0.00522273126989603,
0.009704983793199062,
0.05705922842025757,
-0.022360116243362427,
0.058670539408922195,
-0.04058725759387016,
-0.10309331864118576,
-0.0002942944993264973,
0.08296390622854233,
0.013101478107273579,
-0.07143484801054001,
0.11370298266410828,
-0.054384294897317886,
-0.007457106374204159,
-0.03901705518364906,
-0.09414660930633545,
0.02978299930691719,
-0.024320730939507484,
-0.08419075608253479,
0.008320351131260395,
0.0822131335735321,
0.012659228406846523,
-0.051506128162145615,
0.03639071807265282,
-0.05953415483236313,
-0.009903362952172756,
-0.09807205200195312,
-0.13465943932533264,
0.02843901887536049,
-0.09044788777828217,
0.007615169510245323,
-0.11362776905298233,
-0.19967815279960632,
-0.008752609603106976,
0.06761760264635086,
-0.025661034509539604,
-0.006564737763255835,
-0.06144371256232262,
-0.14535681903362274,
0.035373229533433914,
-0.021746734157204628,
0.12961357831954956,
-0.06037279590964317,
0.06300882995128632,
0.00361413205973804,
0.06166383996605873,
-0.16365693509578705,
0.045582711696624756,
-0.051309678703546524,
-0.009559420868754387,
-0.18903441727161407,
0.06523438543081284,
-0.046395838260650635,
0.057020701467990875,
-0.12134255468845367,
-0.07023290544748306,
0.045252785086631775,
0.01881912350654602,
0.10157163441181183,
0.0744721069931984,
-0.19266310334205627,
-0.036021631211042404,
0.10629066079854965,
-0.07708939164876938,
-0.11690632253885269,
0.09522898495197296,
-0.07103953510522842,
0.05528302118182182,
0.10535795241594315,
0.11141009628772736,
-0.00046462754835374653,
-0.10036872327327728,
0.04840302839875221,
-0.025560755282640457,
-0.03789607435464859,
-0.026019230484962463,
0.0014921781839802861,
0.0004433405410964042,
-0.07769840955734253,
0.03014214336872101,
-0.013058144599199295,
0.029391592368483543,
-0.06554548442363739,
-0.06782453507184982,
-0.05498624965548515,
-0.07342808693647385,
0.0368327833712101,
0.02140077017247677,
0.0785767138004303,
-0.11989517509937286,
-0.09811016917228699,
0.07911620289087296,
0.02112201415002346,
-0.028070278465747833,
0.04611147195100784,
-0.0960729643702507,
0.02640148065984249,
0.025457721203565598,
0.018914196640253067,
-0.19289955496788025,
-0.034272804856300354,
0.005291254725307226,
0.0839407667517662,
0.038261860609054565,
-0.0015487938653677702,
0.06499393284320831,
0.002284154761582613,
-0.05486709251999855,
0.04640131816267967,
0.022475367411971092,
0.03064226545393467,
-0.09541501104831696,
-0.21937350928783417,
0.030335519462823868,
-0.023789482191205025,
0.07367914170026779,
-0.23054105043411255,
0.01645900309085846,
0.042556725442409515,
0.09658921509981155,
0.028758566826581955,
0.018987176939845085,
-0.06296383589506149,
0.0648394301533699,
-0.046272922307252884,
-0.057417143136262894,
0.04441866651177406,
0.04156069457530975,
-0.11789202690124512,
0.00955503061413765,
-0.13610483705997467,
0.12714456021785736,
0.16815796494483948,
-0.13855485618114471,
-0.10851059854030609,
0.06500277668237686,
-0.00752095365896821,
-0.009577667340636253,
0.0009809141047298908,
0.022129150107502937,
0.1736897975206375,
-0.01457470003515482,
0.14570245146751404,
-0.06052993983030319,
-0.02752751111984253,
0.03500663861632347,
-0.036660585552453995,
-0.008861520327627659,
0.07622697949409485,
0.002517002634704113,
-0.15466515719890594,
0.1064402237534523,
0.14757299423217773,
-0.11735325306653976,
0.08738981932401657,
-0.04307540878653526,
-0.03334486857056618,
-0.036148685961961746,
0.0011046885047107935,
0.04133570194244385,
0.06592172384262085,
-0.09248503297567368,
0.0035935966297984123,
0.0207360852509737,
0.03552967682480812,
-0.0072801848873496056,
-0.19833119213581085,
-0.00713457353413105,
0.0026866060215979815,
-0.03741134703159332,
-0.01168380118906498,
0.0188528411090374,
0.02147207036614418,
0.1348329335451126,
0.0224078930914402,
-0.04898146539926529,
0.10277261584997177,
-0.002671054797247052,
-0.08826475590467453,
0.20947642624378204,
-0.1681947410106659,
-0.1265135109424591,
-0.11026177555322647,
-0.10382810980081558,
-0.07766234874725342,
0.022583724930882454,
0.03517487645149231,
-0.10259468108415604,
-0.06873046606779099,
-0.06238168478012085,
-0.01522785983979702,
-0.018635256215929985,
0.05446162447333336,
0.04392092302441597,
-0.01121001411229372,
0.10908844321966171,
-0.10462036728858948,
-0.0475817434489727,
-0.03200561925768852,
-0.05278770998120308,
0.0556148998439312,
-0.007362678647041321,
0.045659806579351425,
0.1104247197508812,
-0.04790922999382019,
0.022131651639938354,
-0.054999906569719315,
0.2320437729358673,
-0.042534369975328445,
-0.020758388563990593,
0.15333245694637299,
-0.03403855487704277,
0.03894615173339844,
0.10135575383901596,
0.025498192757368088,
-0.1388639509677887,
0.05682849884033203,
0.04509511590003967,
-0.04623213782906532,
-0.24339213967323303,
-0.032357245683670044,
-0.04526607319712639,
-0.10851564258337021,
0.016286581754684448,
0.03147989884018898,
0.14993104338645935,
0.030117208138108253,
0.05404139682650566,
0.13927192986011505,
0.008634707890450954,
0.062035687267780304,
0.21297599375247955,
0.05540072172880173,
0.10972293466329575,
-0.060217875987291336,
-0.004153605550527573,
0.07211893051862717,
-0.03470243886113167,
0.20321550965309143,
0.027748117223381996,
0.029510950669646263,
0.07129348814487457,
0.08216436952352524,
-0.020331639796495438,
0.019489765167236328,
0.017895309254527092,
-0.030919726938009262,
-0.0202015433460474,
-0.05246745049953461,
-0.05969397351145744,
0.0517733059823513,
-0.11823005229234695,
0.05248649790883064,
-0.08586478233337402,
0.03474608436226845,
0.06920969486236572,
0.2577660083770752,
0.03504679724574089,
-0.3424689471721649,
-0.10669898241758347,
0.008510411716997623,
-0.01644810661673546,
-0.0374557264149189,
-0.00027257204055786133,
0.0715852603316307,
-0.0648169070482254,
0.11671452224254608,
-0.05963306128978729,
0.07107164710760117,
0.018376382067799568,
0.0667826309800148,
0.07488624006509781,
0.10879620909690857,
0.004695057403296232,
0.01981295272707939,
-0.37436404824256897,
0.2527911961078644,
0.03697524219751358,
0.13797195255756378,
-0.09858790040016174,
0.016639292240142822,
0.04223001375794411,
0.06292962282896042,
0.06850265711545944,
-0.01631617732346058,
-0.15394112467765808,
-0.12723815441131592,
0.0004387192311696708,
0.028462747111916542,
0.13016833364963531,
0.09835049510002136,
0.08687729388475418,
-0.04286975413560867,
0.027839776128530502,
0.09432481229305267,
0.010408032685518265,
-0.12861356139183044,
-0.059222396463155746,
0.0035002322401851416,
0.07921497523784637,
-0.05243242532014847,
-0.0518329031765461,
-0.07425545901060104,
-0.09736212342977524,
0.19009153544902802,
-0.06828558444976807,
-0.03374006599187851,
-0.13208834826946259,
0.09264785051345825,
0.039528992027044296,
-0.04893381893634796,
0.047926001250743866,
-0.003871275344863534,
0.04968493804335594,
0.060622818768024445,
-0.1422816514968872,
0.14239159226417542,
-0.03470608592033386,
-0.16016389429569244,
-0.05544430762529373,
0.04096456989645958,
0.028502237051725388,
0.04551165550947189,
0.007608095649629831,
0.05511178821325302,
0.018243372440338135,
-0.09429188072681427,
0.06115633249282837,
0.03499320149421692,
0.0623500719666481,
0.019644448533654213,
-0.035858944058418274,
-0.038462184369564056,
-0.03895668685436249,
-0.009439258836209774,
0.16284501552581787,
0.23756855726242065,
-0.09099563211202621,
0.037050507962703705,
-0.012708720751106739,
-0.09053747355937958,
-0.236450657248497,
0.12098988890647888,
0.05353172495961189,
0.007565109524875879,
-0.02897159941494465,
-0.14570742845535278,
0.10360489785671234,
0.08366593718528748,
-0.005201784428209066,
0.0958467349410057,
-0.2589166462421417,
-0.1520106941461563,
0.09811677038669586,
0.13770096004009247,
0.2143089473247528,
-0.14757326245307922,
-0.025306109338998795,
-0.08741756528615952,
-0.07030881196260452,
0.16835951805114746,
-0.1697363257408142,
0.09370561689138412,
0.018042057752609253,
0.07603776454925537,
-0.005355945322662592,
-0.023428475484251976,
0.09679584950208664,
-0.031587887555360794,
0.12714140117168427,
-0.08756665140390396,
-0.01648486591875553,
0.11694826930761337,
-0.05076230689883232,
0.025576600804924965,
-0.05423448979854584,
0.036373838782310486,
-0.06152353435754776,
0.0051383585669100285,
-0.07501702755689621,
0.04918226972222328,
-0.02830987051129341,
-0.03904932737350464,
-0.03200814127922058,
0.028455141931772232,
0.07755842804908752,
-0.05175383388996124,
0.14754965901374817,
-0.012636956758797169,
0.16455866396427155,
0.15554074943065643,
0.10335472971200943,
-0.06347049027681351,
0.08630069345235825,
0.0746045708656311,
-0.0409797765314579,
0.08112380653619766,
-0.18032968044281006,
0.051554303616285324,
0.1173095628619194,
-0.02042061649262905,
0.12951424717903137,
0.07940782606601715,
-0.03543766960501671,
0.028773710131645203,
0.06552322953939438,
-0.15637154877185822,
-0.09058656543493271,
0.03383687138557434,
-0.003518134355545044,
-0.044581081718206406,
0.07578287273645401,
0.15325897932052612,
-0.042609803378582,
0.024122698232531548,
0.0018147293012589216,
0.006544931326061487,
-0.08764192461967468,
0.1282888799905777,
0.02042574994266033,
0.0057142809964716434,
-0.09734170138835907,
0.1348412185907364,
0.031217137351632118,
-0.07950017601251602,
0.09584414958953857,
0.022875037044286728,
-0.09049757570028305,
-0.02414250187575817,
0.09338458627462387,
0.14226195216178894,
-0.04000947251915932,
-0.07237754017114639,
-0.11464806646108627,
-0.1686306893825531,
0.07440988719463348,
0.26396262645721436,
0.07387934625148773,
0.03960774093866348,
-0.047509659081697464,
-0.014592031016945839,
-0.08719233423471451,
0.034062039107084274,
0.025173401460051537,
0.04856479540467262,
-0.13513816893100739,
0.16796262562274933,
-0.02645363286137581,
0.02295040898025036,
-0.04722427949309349,
0.033302124589681625,
-0.14763455092906952,
0.010401605628430843,
-0.1999538540840149,
0.004888080060482025,
0.009536659345030785,
0.005840391851961613,
0.029896574094891548,
-0.06352832168340683,
-0.09271599352359772,
0.04924451559782028,
-0.10439129918813705,
-0.02074415422976017,
0.05555294081568718,
0.03727024048566818,
-0.11149876564741135,
-0.08230900019407272,
-0.0019213494379073381,
-0.060699462890625,
0.044796332716941833,
0.07949884235858917,
-0.03071117401123047,
0.08267957717180252,
-0.14937898516654968,
-0.014509634114801884,
0.09418601542711258,
0.012162989005446434,
0.08817937970161438,
-0.09429256618022919,
-0.009343093261122704,
0.033604856580495834,
0.06005549803376198,
0.041134659200906754,
0.13030415773391724,
-0.07913687080144882,
-0.04134576395153999,
-0.020707860589027405,
-0.04147500544786453,
-0.05316254124045372,
0.04315228760242462,
0.1544552445411682,
0.009763340465724468,
0.2020968496799469,
-0.11688463389873505,
-0.022852156311273575,
-0.14380128681659698,
0.009090391919016838,
-0.005221435334533453,
-0.13425365090370178,
-0.1250625103712082,
-0.02765246108174324,
0.08779508620500565,
-0.07169690728187561,
0.13121633231639862,
-0.008668278343975544,
0.07906289398670197,
0.05919446796178818,
-0.03519917279481888,
-0.07472728192806244,
0.044511038810014725,
0.20098638534545898,
0.03654927387833595,
-0.025027893483638763,
0.03212912008166313,
0.016046153381466866,
0.10442148149013519,
0.10153073817491531,
0.2480224221944809,
0.13556113839149475,
0.003602967131882906,
0.14636041224002838,
0.03979792445898056,
-0.03126221150159836,
-0.07744431495666504,
0.09468121826648712,
-0.09052885323762894,
0.15111836791038513,
-0.05112358182668686,
0.08875297009944916,
0.05817318335175514,
-0.15866175293922424,
0.017917726188898087,
-0.09911859035491943,
-0.08249031752347946,
-0.1428937464952469,
-0.07178180664777756,
-0.1052015870809555,
-0.12200628966093063,
-0.0022811968810856342,
-0.103099025785923,
0.052678342908620834,
0.04310282692313194,
0.024569405242800713,
-0.03176270052790642,
0.10278324037790298,
-0.07366166263818741,
-0.0012023322051391006,
0.09023439139127731,
-0.027908440679311752,
-0.03373122587800026,
-0.06234224885702133,
-0.07476557791233063,
0.03139175847172737,
-0.0010986927663907409,
0.027144793421030045,
0.002742151962593198,
-0.0008767132530920208,
0.03781844303011894,
-0.06397993117570877,
-0.08328220993280411,
0.049755848944187164,
0.05937417224049568,
0.017871756106615067,
0.040283046662807465,
0.05271482840180397,
-0.00516148004680872,
0.0015014033997431397,
0.1556469053030014,
-0.09647597372531891,
-0.06556647270917892,
-0.14934170246124268,
0.29385095834732056,
0.01995115354657173,
0.03735066577792168,
0.008345618844032288,
-0.06008106842637062,
-0.043911099433898926,
0.21433395147323608,
0.17027583718299866,
-0.101059190928936,
-0.01772395707666874,
0.009979638271033764,
-0.0015786237781867385,
-0.05387342348694801,
0.1583545207977295,
0.10184863209724426,
-0.04715196415781975,
-0.05069530010223389,
-0.050978194922208786,
-0.030215565115213394,
0.009590022265911102,
-0.03457712009549141,
0.06124608591198921,
0.01618793234229088,
-0.026851192116737366,
-0.00204785680398345,
0.052209705114364624,
-0.08089835941791534,
-0.09820621460676193,
0.05199397727847099,
-0.1849180907011032,
-0.14867541193962097,
0.005533110816031694,
0.0029788112733513117,
-0.014866869896650314,
0.06622970104217529,
-0.03150591626763344,
0.007603638805449009,
0.08870550245046616,
-0.04845622554421425,
-0.03520938381552696,
-0.07860670238733292,
0.09097246080636978,
-0.10639496147632599,
0.16498985886573792,
-0.02282550558447838,
0.06294845044612885,
0.13114526867866516,
0.05999849736690521,
-0.07620491087436676,
0.06806886196136475,
0.027520671486854553,
-0.0771968737244606,
0.008467786945402622,
0.04489666223526001,
-0.04249498248100281,
0.09905703365802765,
0.06496763974428177,
-0.07366351783275604,
0.04882051423192024,
-0.09316085278987885,
-0.1042848452925682,
-0.03330478444695473,
-0.05141807720065117,
-0.10114922374486923,
0.11075421422719955,
0.2235797494649887,
-0.026381706818938255,
0.04874757304787636,
-0.06185594201087952,
-0.005701693240553141,
0.05831624194979668,
-0.021343130618333817,
-0.08256354182958603,
-0.20779716968536377,
0.04861106351017952,
0.12252165377140045,
0.007945844903588295,
-0.18581224977970123,
-0.0717785581946373,
-0.019217325374484062,
-0.03098764643073082,
-0.09496982395648956,
0.09156087785959244,
0.07984750717878342,
0.0370275042951107,
-0.06442878395318985,
-0.13369886577129364,
-0.040713462978601456,
0.15681777894496918,
-0.07799328118562698,
-0.09029624611139297
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Rocketknight1/opus-mt-en-ROMANCE-finetuned-en-to-ro
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-ROMANCE](https://huggingface.co/Helsinki-NLP/opus-mt-en-ROMANCE) on an unknown dataset.
It achieves the following results during training and on the evaluation set:
- Train Loss: 0.7140
- Validation Loss: 1.2757
- Train Bleu: 26.7914
- Train Gen Len: 41.4932
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
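
No intended uses are documented, but the name suggests English→Romanian translation. A hedged inference sketch follows; note that the upstream opus-mt-en-ROMANCE family expects a target-language token such as `>>ro<<` prefixed to the source text, and whether this fine-tuned checkpoint still needs it is an assumption.

```python
# Sketch: translate English to Romanian with the fine-tuned Marian checkpoint.
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

checkpoint = "Rocketknight1/opus-mt-en-ROMANCE-finetuned-en-to-ro"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = TFAutoModelForSeq2SeqLM.from_pretrained(checkpoint)

text = ">>ro<< The weather is lovely today."  # language token is an assumption
batch = tokenizer([text], return_tensors="tf")
generated = model.generate(**batch, max_length=64)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```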
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a reconstruction of the optimizer follows the list):
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
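
`AdamWeightDecay` is the decoupled weight-decay Adam variant that ships with the TensorFlow side of `transformers`. A hedged reconstruction of the configuration above:

```python
# Sketch: rebuild the AdamWeightDecay optimizer from the serialized config.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,
)
```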
### Training results
| Train Loss | Validation Loss | Train Bleu | Train Gen Len | Epoch |
|:----------:|:---------------:|:----------:|:-------------:|:-----:|
| 0.7140 | 1.2757 | 26.7914 | 41.4932 | 0 |
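
The Bleu and Gen Len columns are presumably computed from generated validation translations, most likely with sacreBLEU. A hedged sketch of reproducing such a score with the `evaluate` library (the sentence pairs are placeholders; the original run may have used `datasets.load_metric` instead):

```python
# Sketch: score generated translations with sacreBLEU.
import evaluate

bleu = evaluate.load("sacrebleu")
predictions = ["Vremea este minunată astăzi."]   # model outputs (placeholder)
references = [["Vremea este superbă astăzi."]]   # gold translations (placeholder)
print(round(bleu.compute(predictions=predictions, references=references)["score"], 4))
```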
### Framework versions
- Transformers 4.21.0.dev0
- TensorFlow 2.9.1
- Datasets 2.4.0
- Tokenizers 0.11.0
| {"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "Rocketknight1/opus-mt-en-ROMANCE-finetuned-en-to-ro", "results": []}]} | text2text-generation | Rocketknight1/opus-mt-en-ROMANCE-finetuned-en-to-ro | [
"transformers",
"tf",
"tensorboard",
"marian",
"text2text-generation",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #tensorboard #marian #text2text-generation #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| Rocketknight1/opus-mt-en-ROMANCE-finetuned-en-to-ro
===================================================
This model is a fine-tuned version of Helsinki-NLP/opus-mt-en-ROMANCE on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 0.7140
* Validation Loss: 1.2757
* Train Bleu: 26.7914
* Train Gen Len: 41.4932
* Epoch: 0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'AdamWeightDecay', 'learning\_rate': 2e-05, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\_decay\_rate': 0.01}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.21.0.dev0
* TensorFlow 2.9.1
* Datasets 2.4.0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': 2e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.4.0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #tf #tensorboard #marian #text2text-generation #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': 2e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.4.0\n* Tokenizers 0.11.0"
] | [
61,
118,
4,
35
] | [
"passage: TAGS\n#transformers #tf #tensorboard #marian #text2text-generation #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': 2e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.21.0.dev0\n* TensorFlow 2.9.1\n* Datasets 2.4.0\n* Tokenizers 0.11.0"
] | [
-0.05787425488233566,
0.06413505226373672,
-0.004343563225120306,
0.08275371044874191,
0.1253664642572403,
0.019409634172916412,
0.15194250643253326,
0.1486431062221527,
-0.1226145550608635,
0.07476960867643356,
0.13669146597385406,
0.16476862132549286,
0.03483167290687561,
0.14259187877178192,
-0.11218810826539993,
-0.16815248131752014,
0.03623465076088905,
0.016561059281229973,
-0.08141206949949265,
0.08589383959770203,
0.08019382506608963,
-0.08562884479761124,
0.10401179641485214,
0.0032478116918355227,
-0.17904840409755707,
0.05374143272638321,
0.10135866701602936,
-0.07899874448776245,
0.11266031116247177,
0.08823790401220322,
0.08563152700662613,
0.028010498732328415,
0.024443279951810837,
-0.16381895542144775,
0.013244898989796638,
0.11149808764457703,
-0.016771819442510605,
0.08585964888334274,
0.031871430575847626,
-0.006412323098629713,
0.1391569972038269,
-0.1061406061053276,
0.02838793396949768,
0.03426077961921692,
-0.11706262826919556,
-0.21573564410209656,
-0.12173070013523102,
0.013277186080813408,
0.040986787527799606,
0.07999521493911743,
-0.0012880094582214952,
0.20420914888381958,
-0.005125831812620163,
0.10321156680583954,
0.20138157904148102,
-0.34155091643333435,
-0.05418514460325241,
0.0400935523211956,
0.007046337705105543,
0.05194012448191643,
-0.04584849253296852,
0.06429146230220795,
0.0720466822385788,
0.033209316432476044,
0.08500749617815018,
-0.03692208230495453,
-0.07003044337034225,
-0.02137545868754387,
-0.1019473522901535,
-0.020338919013738632,
0.14009085297584534,
0.029147248715162277,
-0.05933196470141411,
-0.03956161439418793,
-0.08171050995588303,
-0.1363421380519867,
-0.008851063437759876,
-0.07328886538743973,
0.031652286648750305,
0.012776844203472137,
-0.07982004433870316,
-0.07213929295539856,
-0.08868954330682755,
-0.0401465967297554,
-0.09936841577291489,
0.1325289011001587,
0.01539604365825653,
0.047812361270189285,
-0.04337313398718834,
0.051676858216524124,
-0.07738044857978821,
-0.13097955286502838,
0.004893383476883173,
0.011132563464343548,
-0.016916653141379356,
-0.044496987015008926,
-0.08346372842788696,
-0.1598319262266159,
0.07252712547779083,
0.11293890327215195,
-0.08431561291217804,
0.07918497174978256,
-0.11017940193414688,
0.031545866280794144,
-0.11116762459278107,
0.12959614396095276,
-0.023407209664583206,
-0.030180491507053375,
0.05550394952297211,
0.029299935325980186,
0.07795783132314682,
-0.04532945528626442,
-0.09262891113758087,
0.0019382934551686049,
0.08841762691736221,
0.011849584989249706,
-0.034055911004543304,
0.0948486477136612,
-0.046920400112867355,
-0.010622221976518631,
-0.029271777719259262,
-0.0959111750125885,
0.031952984631061554,
-0.011214377358555794,
-0.07372688502073288,
0.023723984137177467,
0.07375204563140869,
0.0075635965913534164,
-0.05820417031645775,
0.034992046654224396,
-0.06466945260763168,
-0.019489923492074013,
-0.0884285718202591,
-0.13249817490577698,
0.045950260013341904,
-0.0963473990559578,
-0.023145467042922974,
-0.08681465685367584,
-0.1687442660331726,
-0.01726844348013401,
0.06116429343819618,
-0.0513695664703846,
-0.017267821356654167,
-0.04791191965341568,
-0.14163543283939362,
0.05067061632871628,
-0.015089191496372223,
0.13223764300346375,
-0.04546583816409111,
0.059882596135139465,
0.013665834441781044,
0.07294541597366333,
-0.10699909180402756,
0.035358935594558716,
-0.04837583750486374,
0.004481227602809668,
-0.20046719908714294,
0.06935198605060577,
-0.054486967623233795,
0.024842211976647377,
-0.13800707459449768,
-0.04329100251197815,
-0.011280701495707035,
0.014553260058164597,
0.11516077816486359,
0.10938984155654907,
-0.19097496569156647,
-0.06242222338914871,
0.13070359826087952,
-0.09860159456729889,
-0.09921764582395554,
0.12127291411161423,
-0.0499132014811039,
0.038033317774534225,
0.08567534387111664,
0.12824133038520813,
-0.012286829762160778,
-0.09873627871274948,
0.01861790381371975,
-0.021958699449896812,
-0.01088071521371603,
-0.007166372612118721,
0.008875018917024136,
-0.03331451490521431,
-0.00872112438082695,
0.009012031368911266,
-0.00480298325419426,
0.015901656821370125,
-0.06202750280499458,
-0.0647900179028511,
-0.06323227286338806,
-0.06039132922887802,
0.04412468522787094,
0.02556736394762993,
0.07325288653373718,
-0.09978288412094116,
-0.12611819803714752,
0.07303997129201889,
0.02689526602625847,
-0.05821172520518303,
0.055039118975400925,
-0.08606141060590744,
0.04897249490022659,
-0.02664473094046116,
0.01847813092172146,
-0.18340304493904114,
-0.0474577397108078,
0.018532857298851013,
0.03432570397853851,
0.04128037393093109,
0.025492964312434196,
0.08050002157688141,
0.006473647430539131,
-0.06984030455350876,
0.03457733616232872,
0.0040346006862819195,
0.010838698595762253,
-0.09974491596221924,
-0.25622376799583435,
0.020866671577095985,
-0.0351533479988575,
0.06743956357240677,
-0.22977609932422638,
0.022395337000489235,
0.08015625178813934,
0.1209280714392662,
0.049045976251363754,
-0.0055529410019516945,
-0.045532457530498505,
0.04459889605641365,
-0.04745962470769882,
-0.06203128769993782,
0.03936411067843437,
0.031655676662921906,
-0.13796095550060272,
0.040132373571395874,
-0.1747598648071289,
0.1326150894165039,
0.1624644547700882,
-0.09858545660972595,
-0.09633328020572662,
0.043218374252319336,
-0.018292570486664772,
-0.02763773687183857,
-0.022046411409974098,
0.005853090435266495,
0.14220339059829712,
0.010175970382988453,
0.14636453986167908,
-0.06643354892730713,
-0.0428275465965271,
0.03663134202361107,
-0.023109663277864456,
-0.02642649970948696,
0.06748407334089279,
0.01553789246827364,
-0.1416965126991272,
0.11136701703071594,
0.13368414342403412,
-0.11682598292827606,
0.13198061287403107,
-0.04731889069080353,
-0.06870691478252411,
-0.048366766422986984,
0.023551613092422485,
0.04400787502527237,
0.07688248157501221,
-0.10654719918966293,
0.0016972374869510531,
0.020228276029229164,
0.020300934091210365,
0.003289601532742381,
-0.18780827522277832,
0.011139113456010818,
-0.0016239183023571968,
-0.06917349994182587,
0.03290792927145958,
0.01676812954246998,
0.013892704620957375,
0.1326625943183899,
0.020887039601802826,
-0.02570873685181141,
0.07541842013597488,
-0.008486929349601269,
-0.08497873693704605,
0.19433405995368958,
-0.11399570852518082,
-0.12921543419361115,
-0.12123993784189224,
-0.062463585287332535,
-0.08408397436141968,
0.009948807768523693,
0.02572784386575222,
-0.09317652136087418,
-0.06192229315638542,
-0.0803772434592247,
0.011251836083829403,
-0.0023813731968402863,
0.05796214938163757,
0.05540793016552925,
-0.009789421223104,
0.12702812254428864,
-0.10553064197301865,
-0.03909219056367874,
-0.043125636875629425,
-0.05050182342529297,
0.023784374818205833,
0.02591705322265625,
0.05033237859606743,
0.10907687246799469,
-0.04170146957039833,
0.03093220479786396,
-0.053747788071632385,
0.1844024360179901,
-0.05761447548866272,
0.0119128143414855,
0.15070205926895142,
-0.03603136166930199,
0.05321592465043068,
0.11081388592720032,
0.040149420499801636,
-0.10877366364002228,
0.03094174899160862,
0.08804025501012802,
-0.040132712572813034,
-0.2573256492614746,
-0.014903414063155651,
-0.03826099634170532,
-0.06221358850598335,
0.04268353804945946,
0.04249850660562515,
0.1460987776517868,
0.030879713594913483,
0.02190481312572956,
0.1172797828912735,
0.01544020976871252,
0.06553737819194794,
0.20339058339595795,
0.05588022992014885,
0.12941433489322662,
-0.060278914868831635,
-0.014389628544449806,
0.0758032575249672,
-0.009012917056679726,
0.20234565436840057,
0.013573548756539822,
0.10258085280656815,
0.07700689136981964,
0.06794532388448715,
-0.012482398189604282,
0.009755016304552555,
0.014888238161802292,
-0.03289158642292023,
-0.010343390516936779,
-0.07030103355646133,
-0.020993290469050407,
0.04294859245419502,
-0.08734357357025146,
0.0508609265089035,
-0.07838071137666702,
0.07676398754119873,
0.06971871107816696,
0.2557142376899719,
0.06244712322950363,
-0.3146151900291443,
-0.08570961654186249,
0.014238723553717136,
-0.02410759963095188,
-0.017701808363199234,
-0.007902590557932854,
0.10255876928567886,
-0.04941922798752785,
0.11469699442386627,
-0.08447404205799103,
0.06987737119197845,
-0.024618690833449364,
0.05421121418476105,
0.054428890347480774,
0.12364296615123749,
-0.0011689267121255398,
0.009526641108095646,
-0.35824817419052124,
0.2766174376010895,
0.047198791056871414,
0.12741440534591675,
-0.08006765693426132,
0.03724713623523712,
0.05034670978784561,
0.03395748883485794,
0.07930576801300049,
-0.03372998535633087,
-0.1258198469877243,
-0.09064316004514694,
-0.04717421531677246,
0.012887301854789257,
0.11659931391477585,
0.07257123291492462,
0.09372017532587051,
-0.05016016215085983,
0.013158462010324001,
0.09153373539447784,
0.007672651205211878,
-0.13995452225208282,
-0.07132004201412201,
0.023662880063056946,
0.07004977762699127,
-0.09657826274633408,
-0.060557253658771515,
-0.09342758357524872,
-0.08373789489269257,
0.22524549067020416,
-0.05254649743437767,
-0.03185363858938217,
-0.1347149908542633,
0.13357698917388916,
0.06303844600915909,
-0.06329367309808731,
0.03575849160552025,
-0.0013119765790179372,
0.07331709563732147,
0.039567235857248306,
-0.13271234929561615,
0.13404931128025055,
-0.03479502350091934,
-0.17824946343898773,
-0.04953215271234512,
0.08280957490205765,
0.017206856980919838,
0.047586407512426376,
0.0056177000515162945,
0.05104367434978485,
0.03937011957168579,
-0.08603426069021225,
0.07773231714963913,
0.010772189125418663,
0.04155238717794418,
-0.013819476589560509,
-0.029056593775749207,
-0.05910787731409073,
-0.03722688555717468,
-0.006438964046537876,
0.14969070255756378,
0.255734920501709,
-0.08661163598299026,
0.03436668589711189,
0.015437373891472816,
-0.09116829186677933,
-0.224229097366333,
0.09780161827802658,
0.06381669640541077,
-0.0005584982573054731,
-0.018541419878602028,
-0.14804387092590332,
0.08248292654752731,
0.08866947889328003,
-0.012754647061228752,
0.08110330253839493,
-0.282135546207428,
-0.16230349242687225,
0.08739497512578964,
0.12773995101451874,
0.17043210566043854,
-0.16693434119224548,
-0.05512908101081848,
-0.0629359632730484,
-0.051020823419094086,
0.15040810406208038,
-0.18601448833942413,
0.09061215817928314,
0.023290837183594704,
0.045651618391275406,
0.007668077014386654,
-0.037770599126815796,
0.08745263516902924,
-0.03746403381228447,
0.0972176343202591,
-0.06666893512010574,
-0.010315417312085629,
0.13500013947486877,
-0.06143919378519058,
0.03561193868517876,
-0.07892316579818726,
0.016725096851587296,
-0.05881696939468384,
0.0006427807966247201,
-0.06803567707538605,
0.06380272656679153,
-0.039145760238170624,
-0.037891678512096405,
-0.02021319419145584,
0.016813015565276146,
0.06025170907378197,
-0.038170117884874344,
0.1633916050195694,
-0.01690770871937275,
0.16528809070587158,
0.20416967570781708,
0.10926852375268936,
-0.05109382048249245,
0.06295531988143921,
0.07283436506986618,
-0.052246082574129105,
0.06656293570995331,
-0.1877763867378235,
0.05348491668701172,
0.10121241211891174,
-0.009754536673426628,
0.12420284003019333,
0.06119527295231819,
-0.05964459478855133,
0.033778946846723557,
0.07562948018312454,
-0.15476827323436737,
-0.11467298120260239,
0.020000964403152466,
-0.06732535362243652,
-0.0698152706027031,
0.07431727647781372,
0.17299491167068481,
-0.03513583168387413,
0.03041900508105755,
0.02228035405278206,
0.009370313957333565,
-0.07185729593038559,
0.1458025872707367,
0.019081447273492813,
0.01900417171418667,
-0.08750445395708084,
0.13340991735458374,
0.026325644925236702,
-0.10569705814123154,
0.10471069067716599,
0.05852532759308815,
-0.07015334069728851,
-0.022319620475172997,
0.05784168839454651,
0.15282772481441498,
-0.03268076851963997,
-0.05931645259261131,
-0.1463727205991745,
-0.13402757048606873,
0.07057525217533112,
0.2627796530723572,
0.0590381845831871,
0.037331949919462204,
-0.051266591995954514,
0.0009706538985483348,
-0.10249907523393631,
0.07214529812335968,
0.04717741161584854,
0.05763789266347885,
-0.13849355280399323,
0.17321276664733887,
-0.011781236156821251,
0.00009976408910006285,
-0.04220337048172951,
0.04431774467229843,
-0.14197883009910583,
-0.0148772606626153,
-0.16350862383842468,
0.000011866472050314769,
-0.0037129195407032967,
-0.02381686493754387,
0.016654837876558304,
-0.0652654841542244,
-0.09709729254245758,
0.044744085520505905,
-0.0918554738163948,
-0.029270373284816742,
0.04465241730213165,
0.026440897956490517,
-0.13539980351924896,
-0.048542000353336334,
-0.009336886927485466,
-0.07300117611885071,
0.058560561388731,
0.045944955199956894,
-0.004230767488479614,
0.05570409074425697,
-0.12069565802812576,
0.012425579130649567,
0.08746044337749481,
-0.0023808805271983147,
0.06514208018779755,
-0.08469937741756439,
0.0030529818031936884,
0.00945638120174408,
0.06913088262081146,
0.028122805058956146,
0.1336212456226349,
-0.08042850345373154,
-0.02283427305519581,
-0.0376366451382637,
-0.04474089294672012,
-0.048806577920913696,
0.06638137996196747,
0.1440328061580658,
0.004545373842120171,
0.1819189488887787,
-0.11668979376554489,
-0.01722954586148262,
-0.1817457675933838,
0.02885144203901291,
0.01124162320047617,
-0.11718859523534775,
-0.10496032238006592,
-0.02002088725566864,
0.0914284884929657,
-0.08457793295383453,
0.11388926953077316,
-0.010314623825252056,
0.060135312378406525,
0.07002156227827072,
-0.06479927897453308,
-0.06525411456823349,
0.042562782764434814,
0.19015806913375854,
0.03462647646665573,
-0.022705528885126114,
0.010771810077130795,
0.005138228181749582,
0.10185319185256958,
0.08448032289743423,
0.24185416102409363,
0.12105413526296616,
-0.013395128771662712,
0.14864908158779144,
0.06996330618858337,
-0.035937219858169556,
-0.12599831819534302,
0.13643784821033478,
-0.10355027765035629,
0.15876725316047668,
-0.04495314136147499,
0.09937001764774323,
0.1074318140745163,
-0.15825825929641724,
0.019993247464299202,
-0.05726655200123787,
-0.07390102744102478,
-0.15785692632198334,
-0.10349749773740768,
-0.11177440732717514,
-0.1301199048757553,
0.0034329527989029884,
-0.11373104900121689,
0.06331594288349152,
0.03156789764761925,
0.03116297535598278,
-0.027385909110307693,
0.12172550708055496,
-0.041543710976839066,
-0.0059959860518574715,
0.09529483318328857,
-0.0036428591702133417,
-0.022627124562859535,
-0.05564992502331734,
-0.07019107788801193,
0.03044457919895649,
0.0198330320417881,
0.020878994837403297,
0.00437017809599638,
0.013065647333860397,
0.05110260471701622,
-0.05202003940939903,
-0.09946156293153763,
0.04388958215713501,
0.04523672163486481,
0.04578882083296776,
0.05775245279073715,
0.040550991892814636,
-0.014992828480899334,
-0.01636095717549324,
0.16138048470020294,
-0.10204081237316132,
-0.05345583334565163,
-0.1624833047389984,
0.241654172539711,
0.007688076701015234,
0.029792973771691322,
-0.00312568130902946,
-0.07797657698392868,
-0.03618228808045387,
0.2022893726825714,
0.17742149531841278,
-0.05980610474944115,
-0.010001162067055702,
0.03969976678490639,
-0.0045398203656077385,
-0.04741910845041275,
0.13232631981372833,
0.07873483002185822,
-0.010412407107651234,
-0.049542736262083054,
-0.06789642572402954,
-0.01838880218565464,
-0.009550139307975769,
-0.04880188778042793,
0.10022614151239395,
0.02125285565853119,
-0.013401470147073269,
-0.0027518607676029205,
0.04624537378549576,
-0.030476782470941544,
-0.10997200012207031,
0.03472709283232689,
-0.2205335944890976,
-0.1498047262430191,
0.0009653969900682569,
0.018354367464780807,
-0.02186950296163559,
0.049503326416015625,
-0.019443269819021225,
0.008093546144664288,
0.08629286289215088,
-0.037530336529016495,
-0.05753551796078682,
-0.0618351474404335,
0.06164269149303436,
-0.11356789618730545,
0.16229362785816193,
-0.023186376318335533,
0.030566120520234108,
0.13547340035438538,
0.03350727632641792,
-0.08683179318904877,
0.05278804153203964,
0.03367207944393158,
-0.09622994065284729,
0.004714654292911291,
0.08517687022686005,
-0.03349452465772629,
0.11144909262657166,
0.07815958559513092,
-0.07837889343500137,
0.027221962809562683,
-0.10114215314388275,
-0.0976075604557991,
-0.038230206817388535,
-0.04226323589682579,
-0.10277360677719116,
0.11679445952177048,
0.21877647936344147,
-0.014769887551665306,
0.04024018719792366,
-0.059495288878679276,
-0.0022728615440428257,
0.06980375200510025,
0.01746494509279728,
-0.05828557163476944,
-0.2266518622636795,
0.05589479207992554,
0.09357346594333649,
0.015024333260953426,
-0.23497648537158966,
-0.0763532817363739,
-0.013035277836024761,
-0.02317933738231659,
-0.12158843129873276,
0.08335558325052261,
0.09863826632499695,
0.0434926301240921,
-0.04862212389707565,
-0.13978512585163116,
-0.02289033867418766,
0.14990882575511932,
-0.11362278461456299,
-0.06951866298913956
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Rocketknight1/t5-small-finetuned-xsum
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 2.7172
- Validation Loss: 2.3977
- Train Rouge1: 28.7469
- Train Rouge2: 7.9005
- Train Rougel: 22.5917
- Train Rougelsum: 22.6162
- Train Gen Len: 18.875
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
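A minimal summarization sketch, assuming the `Rocketknight1/t5-small-finetuned-xsum` checkpoint name from the title and the standard T5 `summarize:` task prefix (both are assumptions, not documented behaviour of this checkpoint):

    # Hedged usage sketch; the repo id and task prefix are assumed, not confirmed by this card.
    from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained("Rocketknight1/t5-small-finetuned-xsum")
    model = TFAutoModelForSeq2SeqLM.from_pretrained("Rocketknight1/t5-small-finetuned-xsum")

    inputs = tokenizer("summarize: " + "Your article text here.", return_tensors="tf", truncation=True)
    summary_ids = model.generate(inputs["input_ids"], max_length=64)
    print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))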
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
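The optimizer dictionary above matches the `AdamWeightDecay` class that ships with Transformers for TensorFlow training; a hedged reconstruction of that configuration follows (float32 is the default Keras policy, so no precision change is needed):

    # Reconstruction of the serialized optimizer config listed above.
    from transformers import AdamWeightDecay

    optimizer = AdamWeightDecay(
        learning_rate=2e-05,
        beta_1=0.9,
        beta_2=0.999,
        epsilon=1e-07,
        amsgrad=False,
        weight_decay_rate=0.01,
    )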
### Training results
| Train Loss | Validation Loss | Train Rouge1 | Train Rouge2 | Train Rougel | Train Rougelsum | Train Gen Len | Epoch |
|:----------:|:---------------:|:------------:|:------------:|:------------:|:---------------:|:-------------:|:-----:|
| 2.7172 | 2.3977 | 28.7469 | 7.9005 | 22.5917 | 22.6162 | 18.875 | 0 |
### Framework versions
- Transformers 4.16.0.dev0
- TensorFlow 2.8.0-rc0
- Datasets 1.17.0
- Tokenizers 0.11.0
| {"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "Rocketknight1/t5-small-finetuned-xsum", "results": []}]} | text2text-generation | Rocketknight1/t5-small-finetuned-xsum | [
"transformers",
"tf",
"tensorboard",
"t5",
"text2text-generation",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #tensorboard #t5 #text2text-generation #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| Rocketknight1/t5-small-finetuned-xsum
=====================================
This model is a fine-tuned version of t5-small on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 2.7172
* Validation Loss: 2.3977
* Train Rouge1: 28.7469
* Train Rouge2: 7.9005
* Train Rougel: 22.5917
* Train Rougelsum: 22.6162
* Train Gen Len: 18.875
* Epoch: 0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'AdamWeightDecay', 'learning\_rate': 2e-05, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\_decay\_rate': 0.01}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.16.0.dev0
* TensorFlow 2.8.0-rc0
* Datasets 1.17.0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': 2e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* TensorFlow 2.8.0-rc0\n* Datasets 1.17.0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #tf #tensorboard #t5 #text2text-generation #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': 2e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* TensorFlow 2.8.0-rc0\n* Datasets 1.17.0\n* Tokenizers 0.11.0"
] | [
70,
118,
4,
40
] | [
"passage: TAGS\n#transformers #tf #tensorboard #t5 #text2text-generation #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'AdamWeightDecay', 'learning\\_rate': 2e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight\\_decay\\_rate': 0.01}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* TensorFlow 2.8.0-rc0\n* Datasets 1.17.0\n* Tokenizers 0.11.0"
] | [
-0.0688353106379509,
0.1119588240981102,
-0.003656041342765093,
0.06594984233379364,
0.10057453066110611,
0.013652457855641842,
0.14350338280200958,
0.16346041858196259,
-0.11449269950389862,
0.10041054338216782,
0.1512230634689331,
0.16467174887657166,
0.059491969645023346,
0.14541497826576233,
-0.12799812853336334,
-0.15808497369289398,
0.06192586198449135,
0.013492530211806297,
-0.021465176716446877,
0.09597477316856384,
0.07813200354576111,
-0.09258787333965302,
0.10909410566091537,
-0.003466669237241149,
-0.1855696588754654,
0.012335756793618202,
0.07140978425741196,
-0.0791933611035347,
0.0843430906534195,
0.06055273115634918,
0.08077111840248108,
0.038734808564186096,
0.019176550209522247,
-0.16649159789085388,
0.01157398521900177,
0.10958126932382584,
-0.01437088381499052,
0.11514796316623688,
0.018378671258687973,
-0.049743928015232086,
0.12799520790576935,
-0.1082378700375557,
0.03038901835680008,
0.04277290403842926,
-0.11549682170152664,
-0.23155426979064941,
-0.1416327804327011,
0.047627657651901245,
0.04627598077058792,
0.06516743451356888,
0.007196440361440182,
0.2139727771282196,
0.01971743069589138,
0.10901478677988052,
0.21570169925689697,
-0.33604496717453003,
-0.03552648052573204,
0.05080963298678398,
0.041507426649332047,
0.057118579745292664,
-0.054157618433237076,
0.04284703731536865,
0.05620903894305229,
0.03947308287024498,
0.07341557741165161,
-0.033225543797016144,
-0.08694250136613846,
-0.034709811210632324,
-0.09618331491947174,
-0.04691619053483009,
0.1573549509048462,
0.028501510620117188,
-0.06625521183013916,
-0.03646860271692276,
-0.08092828840017319,
-0.17248499393463135,
-0.020917030051350594,
-0.024501189589500427,
0.025681953877210617,
-0.001954212784767151,
-0.08091596513986588,
-0.06607130914926529,
-0.06389341503381729,
-0.04255101457238197,
-0.09333828836679459,
0.11542242020368576,
0.017088264226913452,
0.07252264767885208,
-0.03738005459308624,
0.05435839667916298,
-0.08381479978561401,
-0.1328495293855667,
-0.041530806571245193,
-0.0008822246454656124,
-0.02366374433040619,
-0.04211827740073204,
-0.05877621844410896,
-0.12363243103027344,
0.07017055153846741,
0.14313746988773346,
-0.12425673007965088,
0.09211858361959457,
-0.1232132613658905,
0.03435458242893219,
-0.09665530174970627,
0.12020476907491684,
-0.03801877051591873,
0.02279268205165863,
0.049816641956567764,
0.01604781486093998,
0.08996447175741196,
-0.04088818281888962,
-0.09518077969551086,
0.007463822141289711,
0.0783609077334404,
0.029108474031090736,
-0.02083458937704563,
0.07550898939371109,
-0.04112396389245987,
-0.010731138288974762,
0.03383021056652069,
-0.09316728264093399,
0.03609832376241684,
0.004419480450451374,
-0.07793650031089783,
0.026268813759088516,
0.07255809009075165,
-0.0017123253783211112,
-0.0670863538980484,
0.015562635846436024,
-0.06399600207805634,
-0.007387044373899698,
-0.0828118696808815,
-0.13913819193840027,
0.05489904060959816,
-0.1382797211408615,
-0.03261233866214752,
-0.0616285540163517,
-0.1806042343378067,
-0.008740268647670746,
0.05709434300661087,
-0.06594635546207428,
0.011461207643151283,
-0.053520090878009796,
-0.155287504196167,
0.060721173882484436,
-0.011293530464172363,
0.1011456623673439,
-0.05709674581885338,
0.05191435292363167,
0.005111787933856249,
0.08419153094291687,
-0.11099924147129059,
0.03112972527742386,
-0.05285772681236267,
0.030238758772611618,
-0.19885855913162231,
0.10336771607398987,
-0.06516321748495102,
0.018495166674256325,
-0.15356996655464172,
-0.05105580762028694,
0.011014845222234726,
-0.021054400131106377,
0.11338883638381958,
0.11169479042291641,
-0.1975010335445404,
-0.042309317737817764,
0.1761709749698639,
-0.08476956188678741,
-0.1147519126534462,
0.11016036570072174,
-0.05312653258442879,
-0.0063416436314582825,
0.07739832997322083,
0.14479370415210724,
-0.014895223081111908,
-0.09964989870786667,
-0.04694583639502525,
-0.014430351555347443,
-0.005692839622497559,
-0.04128742590546608,
0.01979966275393963,
-0.023308249190449715,
0.029340578243136406,
0.0005428201402537525,
0.004090922884643078,
-0.002469181315973401,
-0.07220681011676788,
-0.05130971968173981,
-0.08022405207157135,
-0.04918140918016434,
0.02284236066043377,
0.01648818887770176,
0.05905884504318237,
-0.1148955449461937,
-0.1274741142988205,
0.05056101828813553,
0.043171972036361694,
-0.07014283537864685,
0.057374123483896255,
-0.10514749586582184,
0.07154510170221329,
-0.06286877393722534,
0.01955908164381981,
-0.18979932367801666,
-0.05280472710728645,
0.022983426228165627,
0.03995796665549278,
0.023890797048807144,
-0.05168980732560158,
0.07480882853269577,
0.014224033802747726,
-0.047371868044137955,
0.021780801936984062,
0.022881794720888138,
0.006018099840730429,
-0.09765990823507309,
-0.2515612244606018,
-0.005319993942975998,
-0.050792910158634186,
0.05963638052344322,
-0.20954087376594543,
0.027963312342762947,
0.09238063544034958,
0.16082894802093506,
0.05197907239198685,
-0.021762963384389877,
-0.010106178000569344,
0.03053469955921173,
-0.05582943558692932,
-0.07676711678504944,
0.026192273944616318,
0.02914448082447052,
-0.12949566543102264,
0.0529836006462574,
-0.17388410866260529,
0.09911247342824936,
0.16021402180194855,
-0.032600514590740204,
-0.0871715396642685,
0.04403079301118851,
-0.02118111215531826,
-0.012185520492494106,
-0.021576153114438057,
0.01934850960969925,
0.11972713470458984,
0.033702682703733444,
0.14727041125297546,
-0.07740312814712524,
-0.045533694326877594,
0.055039264261722565,
-0.03979368135333061,
-0.02610653080046177,
0.08524111658334732,
-0.027716193348169327,
-0.15211617946624756,
0.11789163947105408,
0.10944906622171402,
-0.08240893483161926,
0.11538214236497879,
-0.05454617366194725,
-0.07218189537525177,
-0.05319257825613022,
0.03286523371934891,
0.05215068161487579,
0.08165522664785385,
-0.08434353023767471,
-0.000013931694411439821,
0.03607165813446045,
0.018837928771972656,
-0.003340748604387045,
-0.17041389644145966,
0.03294853866100311,
-0.01315448246896267,
-0.08181837201118469,
-0.0000011322334785290877,
0.013848984614014626,
0.022284233942627907,
0.12873974442481995,
0.019092299044132233,
-0.02655801735818386,
0.07474222034215927,
0.0014425921253859997,
-0.08306285738945007,
0.20555241405963898,
-0.12371256947517395,
-0.11084313690662384,
-0.11583074927330017,
-0.09322337806224823,
-0.08240460604429245,
-0.0033515149261802435,
0.021720565855503082,
-0.0929916724562645,
-0.06959552317857742,
-0.10140582174062729,
-0.03019697032868862,
-0.0075335134752094746,
0.04208846017718315,
0.05520036816596985,
-0.013721521012485027,
0.09895340353250504,
-0.10234908014535904,
-0.03719634190201759,
-0.03622034937143326,
-0.021251266822218895,
0.02278577908873558,
0.01014276035130024,
0.03465433791279793,
0.11962447315454483,
-0.05455892160534859,
0.051878102123737335,
-0.040221892297267914,
0.17558474838733673,
-0.04587525501847267,
0.004545179661363363,
0.11898419260978699,
-0.030494539067149162,
0.06863980740308762,
0.11460236459970474,
0.041492585092782974,
-0.10254961997270584,
0.01049880776554346,
0.058381088078022,
-0.055213022977113724,
-0.2834039628505707,
-0.0191339161247015,
-0.04340359941124916,
-0.044046223163604736,
0.04280507564544678,
0.04266701266169548,
0.12955524027347565,
0.03823639079928398,
0.0024429005570709705,
0.09456334263086319,
-0.001311973319388926,
0.0914027988910675,
0.1705070286989212,
0.0641602873802185,
0.12209106981754303,
-0.08599786460399628,
0.022568965330719948,
0.06028670445084572,
0.0159174595028162,
0.20743796229362488,
0.004651555325835943,
0.15923857688903809,
0.06838776171207428,
0.08495298773050308,
0.003313267370685935,
0.02261318266391754,
0.0030600212048739195,
-0.015337247401475906,
0.006609456613659859,
-0.07013396918773651,
-0.014554671943187714,
0.03552445396780968,
-0.08175739645957947,
0.050031065940856934,
-0.06917475908994675,
0.06869825720787048,
0.06831011921167374,
0.27634647488594055,
0.0761011466383934,
-0.3816700577735901,
-0.10776615142822266,
0.015619203448295593,
0.0014811219880357385,
-0.04667681083083153,
-0.02616877853870392,
0.09195072948932648,
-0.04431165009737015,
0.1217879056930542,
-0.08175058662891388,
0.08043946325778961,
-0.04255494847893715,
0.021853307262063026,
0.03248322010040283,
0.1398225873708725,
-0.012918503023684025,
0.009406122379004955,
-0.30531007051467896,
0.27524909377098083,
0.06272329390048981,
0.12014468759298325,
-0.08352450281381607,
0.02788446471095085,
0.028566189110279083,
0.029711291193962097,
0.10664186626672745,
-0.02198198437690735,
-0.1651630699634552,
-0.08188073337078094,
-0.08253446966409683,
0.002735872520133853,
0.11962436884641647,
0.0804857388138771,
0.10242150723934174,
-0.01944408379495144,
-0.006312080658972263,
0.06870193779468536,
-0.05490226671099663,
-0.1075727641582489,
-0.07492144405841827,
0.021168716251850128,
0.09532562643289566,
-0.08649297058582306,
-0.044866155833005905,
-0.08408410102128983,
-0.05312665179371834,
0.22165027260780334,
-0.06911318749189377,
-0.056298453360795975,
-0.12982778251171112,
0.10753342509269714,
0.08957704901695251,
-0.06325843930244446,
0.02924734726548195,
-0.03125746175646782,
0.0992434024810791,
0.030465897172689438,
-0.13026343286037445,
0.12981492280960083,
-0.036854878067970276,
-0.17946968972682953,
-0.04275227338075638,
0.10409823060035706,
0.0284030269831419,
0.04103456437587738,
0.006412446033209562,
0.06952313333749771,
0.025467002764344215,
-0.08555383235216141,
0.08550344407558441,
0.02901769056916237,
0.06998030096292496,
0.01625262014567852,
0.005201523192226887,
-0.05491847172379494,
-0.030020296573638916,
0.01479526050388813,
0.15634411573410034,
0.23504377901554108,
-0.09781637042760849,
0.07037403434515,
0.015835842117667198,
-0.09000662714242935,
-0.20374810695648193,
0.09603128582239151,
0.05162503197789192,
0.002610436175018549,
-0.06081430986523628,
-0.14465755224227905,
0.06559630483388901,
0.07883624732494354,
-0.016406264156103134,
0.09694501012563705,
-0.27546802163124084,
-0.1320839375257492,
0.08453883230686188,
0.10280381888151169,
0.1376023292541504,
-0.1525716930627823,
-0.07148325443267822,
-0.06300666928291321,
-0.08201780915260315,
0.17339816689491272,
-0.1980937123298645,
0.10257396847009659,
-0.005094533320516348,
0.058598846197128296,
0.003693384351208806,
-0.05419515073299408,
0.09719884395599365,
-0.008025022223591805,
0.0841585323214531,
-0.05950893461704254,
0.017112471163272858,
0.16448664665222168,
-0.08692378550767899,
0.059101417660713196,
-0.08049140870571136,
0.04171063005924225,
-0.062073640525341034,
0.006941325031220913,
-0.06309523433446884,
0.05731111392378807,
-0.03447745367884636,
-0.012872862629592419,
-0.039489828050136566,
0.001620573690161109,
0.06851828843355179,
-0.03430023044347763,
0.1953319013118744,
0.0005993922241032124,
0.16313348710536957,
0.23554499447345734,
0.10852569341659546,
-0.06294439733028412,
0.05622820183634758,
0.07633943110704422,
-0.05753650516271591,
0.05145011842250824,
-0.22097483277320862,
0.04769100248813629,
0.11519987136125565,
-0.00045313884038478136,
0.11754389107227325,
0.05980820208787918,
-0.08187336474657059,
0.04656087979674339,
0.05542369931936264,
-0.14764612913131714,
-0.11755716055631638,
0.047201670706272125,
-0.0288500115275383,
-0.09104718267917633,
0.09884458780288696,
0.16904930770397186,
-0.04186514392495155,
0.01698325388133526,
0.013843024149537086,
0.024904677644371986,
-0.06303858011960983,
0.14026746153831482,
0.018713222816586494,
0.042376644909381866,
-0.10921730101108551,
0.13482677936553955,
0.026556367054581642,
-0.11010797321796417,
0.11389215290546417,
0.07745464146137238,
-0.09772404283285141,
-0.01808866113424301,
0.027275122702121735,
0.14360198378562927,
-0.05559968203306198,
-0.061465296894311905,
-0.13508257269859314,
-0.12755616009235382,
0.08857735246419907,
0.28600504994392395,
0.04125405475497246,
0.0423772968351841,
-0.03797995299100876,
-0.028126589953899384,
-0.12403285503387451,
0.06860493868589401,
0.036982182413339615,
0.06184329092502594,
-0.12718433141708374,
0.11815353482961655,
-0.0134956706315279,
0.018588032573461533,
-0.029815977439284325,
0.030128592625260353,
-0.13370655477046967,
-0.003594766603782773,
-0.1763862669467926,
0.027121642604470253,
-0.037119653075933456,
-0.029461678117513657,
-0.0026344815269112587,
-0.05540420487523079,
-0.08808233588933945,
0.04768258333206177,
-0.0966639593243599,
-0.043055541813373566,
0.028341088443994522,
0.01592434011399746,
-0.14863263070583344,
-0.03834330290555954,
-0.021114859730005264,
-0.07595393061637878,
0.06805819272994995,
0.07286014407873154,
-0.02079634927213192,
0.029213067144155502,
-0.11020299047231674,
-0.004433959256857634,
0.07697176933288574,
-0.007191751152276993,
0.08073021471500397,
-0.09302815049886703,
0.002258328255265951,
0.01998448744416237,
0.03502088412642479,
0.03340193256735802,
0.14117510616779327,
-0.07215279340744019,
-0.023221485316753387,
-0.05511235073208809,
-0.0008231943938881159,
-0.06023937091231346,
0.09771303832530975,
0.15137822926044464,
0.02342280186712742,
0.15393735468387604,
-0.10813510417938232,
-0.008833006024360657,
-0.16988229751586914,
0.01159214973449707,
0.005499791353940964,
-0.12925918400287628,
-0.08957815915346146,
0.004873217083513737,
0.09404891729354858,
-0.09361310303211212,
0.1307990998029709,
-0.030756451189517975,
0.04236351698637009,
0.06383053213357925,
-0.04915216565132141,
-0.09985582530498505,
0.03910932317376137,
0.19066782295703888,
0.030120106413960457,
-0.040795356035232544,
0.04975590482354164,
0.0017449341248720884,
0.09433149546384811,
0.0727107971906662,
0.2143869251012802,
0.12975174188613892,
0.03669769689440727,
0.14962032437324524,
0.07748083025217056,
-0.04430759698152542,
-0.1081315279006958,
0.14244164526462555,
-0.08529907464981079,
0.1644531488418579,
-0.02289803884923458,
0.08715657889842987,
0.12362153828144073,
-0.14900033175945282,
0.01171777956187725,
-0.04190858080983162,
-0.08671146631240845,
-0.14785179495811462,
-0.10247006267309189,
-0.11306094378232956,
-0.13821445405483246,
0.0030826320871710777,
-0.12273160368204117,
0.07638228684663773,
0.052267689257860184,
0.04084872454404831,
0.003667142242193222,
0.10354840010404587,
-0.014019396156072617,
0.007032349705696106,
0.08618602901697159,
-0.005693039391189814,
-0.031128402799367905,
-0.02398994378745556,
-0.07001768052577972,
0.0668816864490509,
0.0011822143569588661,
0.053841643035411835,
0.030742760747671127,
0.05891520902514458,
0.07043915241956711,
-0.05624905973672867,
-0.11439869552850723,
0.039375897496938705,
0.06761667132377625,
0.04394738748669624,
0.054220180958509445,
0.05094660073518753,
-0.024809569120407104,
-0.021192364394664764,
0.15452995896339417,
-0.10465078055858612,
-0.026861771941184998,
-0.1619737148284912,
0.2346874177455902,
0.010546195320785046,
-0.006826700177043676,
0.0162076223641634,
-0.07691540569067001,
-0.02885483019053936,
0.15370409190654755,
0.1766050159931183,
-0.019310040399432182,
-0.0245039165019989,
0.010205130092799664,
-0.012630493380129337,
-0.04291456937789917,
0.10281311720609665,
0.09467937052249908,
-0.01848035864531994,
-0.05430348962545395,
-0.0716351643204689,
-0.019907677546143532,
-0.016143517568707466,
-0.038129739463329315,
0.07027313113212585,
-0.0008173149544745684,
-0.022201206535100937,
-0.0015152801061049104,
0.0540720634162426,
-0.0748525932431221,
-0.11652880162000656,
0.059562552720308304,
-0.21376283466815948,
-0.13960228860378265,
0.00950699020177126,
-0.005987070966511965,
-0.010906794108450413,
0.03935425728559494,
-0.01486659049987793,
-0.006757813505828381,
0.11258870363235474,
-0.03960583731532097,
-0.07875408977270126,
-0.0705837607383728,
0.04373863339424133,
-0.14791186153888702,
0.17085324227809906,
-0.021171528846025467,
0.04309533163905144,
0.15367697179317474,
0.02738417498767376,
-0.11826495826244354,
0.039351556450128555,
0.03177191689610481,
-0.06335709989070892,
0.0019093494629487395,
0.11462392657995224,
-0.034247562289237976,
0.11232101172208786,
0.063266821205616,
-0.09998369961977005,
0.0006425846950151026,
-0.0995454266667366,
-0.07198316603899002,
-0.04915241524577141,
-0.06540942192077637,
-0.07577230036258698,
0.1185574010014534,
0.19559769332408905,
-0.030780469998717308,
0.03170696645975113,
-0.049930695444345474,
-0.004479198716580868,
0.06966698169708252,
-0.01735588163137436,
-0.04442664980888367,
-0.23447869718074799,
0.06049944460391998,
0.12570345401763916,
0.02616993710398674,
-0.24857255816459656,
-0.06696747243404388,
-0.004695525858551264,
-0.02402767911553383,
-0.10391045361757278,
0.07269442081451416,
0.09984621405601501,
0.06154308095574379,
-0.06240377202630043,
-0.07783862948417664,
-0.018056262284517288,
0.15452706813812256,
-0.10994325578212738,
-0.06906913965940475
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# test-model-tf
This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
## Model description
More information needed
## Intended uses & limitations
More information needed
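A minimal feature-extraction sketch, assuming the `Rocketknight1/test-model-tf` repo id recorded in this row's metadata:

    # Hedged usage sketch for a TF BERT feature-extraction checkpoint.
    from transformers import AutoTokenizer, TFAutoModel

    tokenizer = AutoTokenizer.from_pretrained("Rocketknight1/test-model-tf")
    model = TFAutoModel.from_pretrained("Rocketknight1/test-model-tf")

    outputs = model(tokenizer("Example sentence.", return_tensors="tf"))
    features = outputs.last_hidden_state  # shape: (batch, sequence_length, hidden_size)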
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: None
- training_precision: float32
### Training results
### Framework versions
- Transformers 4.14.0.dev0
- TensorFlow 2.6.0
- Datasets 1.16.2.dev0
- Tokenizers 0.10.3
| {"tags": ["generated_from_keras_callback"], "model-index": [{"name": "test-model-tf", "results": []}]} | feature-extraction | Rocketknight1/test-model-tf | [
"transformers",
"tf",
"bert",
"feature-extraction",
"generated_from_keras_callback",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #bert #feature-extraction #generated_from_keras_callback #endpoints_compatible #region-us
|
# test-model-tf
This model is a fine-tuned version of [](URL on an unknown dataset.
It achieves the following results on the evaluation set:
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: None
- training_precision: float32
### Training results
### Framework versions
- Transformers 4.14.0.dev0
- TensorFlow 2.6.0
- Datasets 1.16.2.dev0
- Tokenizers 0.10.3
| [
"# test-model-tf\n\nThis model is a fine-tuned version of [](URL on an unknown dataset.\nIt achieves the following results on the evaluation set:",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- optimizer: None\n- training_precision: float32",
"### Training results",
"### Framework versions\n\n- Transformers 4.14.0.dev0\n- TensorFlow 2.6.0\n- Datasets 1.16.2.dev0\n- Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #tf #bert #feature-extraction #generated_from_keras_callback #endpoints_compatible #region-us \n",
"# test-model-tf\n\nThis model is a fine-tuned version of [](URL on an unknown dataset.\nIt achieves the following results on the evaluation set:",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- optimizer: None\n- training_precision: float32",
"### Training results",
"### Framework versions\n\n- Transformers 4.14.0.dev0\n- TensorFlow 2.6.0\n- Datasets 1.16.2.dev0\n- Tokenizers 0.10.3"
] | [
39,
40,
6,
12,
8,
3,
33,
4,
38
] | [
"passage: TAGS\n#transformers #tf #bert #feature-extraction #generated_from_keras_callback #endpoints_compatible #region-us \n# test-model-tf\n\nThis model is a fine-tuned version of [](URL on an unknown dataset.\nIt achieves the following results on the evaluation set:## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- optimizer: None\n- training_precision: float32### Training results### Framework versions\n\n- Transformers 4.14.0.dev0\n- TensorFlow 2.6.0\n- Datasets 1.16.2.dev0\n- Tokenizers 0.10.3"
] | [
-0.07775215059518814,
0.024776943027973175,
-0.0010557319037616253,
0.07441964745521545,
0.15813614428043365,
0.02308465540409088,
0.07044433802366257,
0.12178193032741547,
-0.14878806471824646,
0.01306493952870369,
0.08078105747699738,
0.08351130038499832,
-0.0016278421971946955,
0.08702005445957184,
-0.04684620350599289,
-0.23305635154247284,
0.013330583460628986,
0.02660485729575157,
-0.08053750544786453,
0.09831557422876358,
0.08937880396842957,
-0.0989062562584877,
0.09639879316091537,
0.007841978222131729,
-0.21479034423828125,
0.045991815626621246,
0.046593569219112396,
-0.0838160291314125,
0.09407307207584381,
0.02928839810192585,
0.10544011741876602,
0.020460546016693115,
0.07207822054624557,
-0.09578946977853775,
0.021218005567789078,
0.09659209102392197,
0.005709352437406778,
0.07146120816469193,
0.041910309344530106,
-0.0012677764752879739,
0.143655925989151,
-0.04289039224386215,
0.07000226527452469,
0.044148385524749756,
-0.1012839525938034,
-0.16377536952495575,
-0.08722885698080063,
-0.00249045854434371,
-0.010975171811878681,
0.10095145553350449,
0.012984175235033035,
0.2989319860935211,
-0.0795588418841362,
0.10025616735219955,
0.16024324297904968,
-0.2929011285305023,
-0.08044842630624771,
0.042917199432849884,
0.08356505632400513,
0.05088769644498825,
-0.05827854201197624,
0.0444197840988636,
0.06740979850292206,
0.06883550435304642,
0.050337597727775574,
-0.04827781766653061,
-0.19431744515895844,
0.02643829770386219,
-0.1119220182299614,
0.021887827664613724,
0.21353855729103088,
0.011585995554924011,
-0.08160517364740372,
0.026892606168985367,
-0.07134546339511871,
-0.017494292929768562,
-0.008312133140861988,
-0.08316844701766968,
0.05855138599872589,
-0.013558630831539631,
-0.1115201786160469,
-0.08362062275409698,
-0.1024927869439125,
-0.08213354647159576,
-0.1157456487417221,
0.16521504521369934,
0.011562049388885498,
0.06819711625576019,
-0.14023257791996002,
0.0785641148686409,
-0.08560365438461304,
-0.11049750447273254,
-0.017602017149329185,
-0.047636546194553375,
-0.0442400686442852,
-0.097274549305439,
-0.09403213858604431,
-0.23494713008403778,
0.029712095856666565,
0.07535463571548462,
-0.09025438874959946,
0.0629735216498375,
-0.10698255896568298,
0.01725275255739689,
-0.01524859294295311,
0.168844074010849,
-0.04993641749024391,
0.029492560774087906,
0.024060817435383797,
-0.013048269785940647,
-0.026255255565047264,
-0.01555927749723196,
-0.08811502903699875,
0.007288627326488495,
0.036618445068597794,
0.0369587242603302,
-0.0719490796327591,
0.0858120545744896,
-0.0163805540651083,
-0.017863335087895393,
-0.008059049025177956,
-0.091084785759449,
0.02663758024573326,
-0.02456972748041153,
-0.07969000190496445,
0.05873808637261391,
0.10242439806461334,
-0.055047858506441116,
-0.04926483705639839,
-0.0019030234543606639,
-0.10558086633682251,
0.008847543969750404,
-0.08641916513442993,
-0.12261962890625,
0.011655338108539581,
-0.06862574815750122,
0.020414792001247406,
-0.11086839437484741,
-0.16465266048908234,
0.008006814867258072,
0.08073071390390396,
-0.06319216638803482,
0.07680273056030273,
-0.016740188002586365,
-0.09030788391828537,
-0.005610320251435041,
0.010524743236601353,
0.09348174184560776,
-0.04898734390735626,
0.05968019366264343,
0.040467165410518646,
0.04393625259399414,
-0.07170167565345764,
0.041770972311496735,
-0.09463579207658768,
0.01089976541697979,
-0.183927521109581,
0.07177725434303284,
-0.07661924511194229,
0.07469765096902847,
-0.09223934262990952,
-0.09892146289348602,
-0.02847209945321083,
0.01071834098547697,
0.10785666853189468,
0.1280503273010254,
-0.23051071166992188,
-0.011135795153677464,
0.1479705274105072,
-0.10112681239843369,
-0.1293664127588272,
0.07029373943805695,
-0.062235791236162186,
0.14064142107963562,
0.0751647800207138,
0.161423921585083,
0.07658232003450394,
-0.15630170702934265,
0.06611037999391556,
0.055115364491939545,
-0.053323518484830856,
-0.01842331327497959,
-0.019254345446825027,
-0.005035247653722763,
-0.06488802284002304,
0.033134035766124725,
-0.00259675201959908,
0.010479111224412918,
-0.11240153759717941,
-0.04621361196041107,
-0.08011413365602493,
-0.08668933063745499,
0.05119006335735321,
0.004757305141538382,
0.0868706926703453,
-0.09296764433383942,
-0.10029935091733932,
0.16170834004878998,
0.056240178644657135,
-0.030406080186367035,
0.03919888660311699,
-0.12789781391620636,
0.05775567889213562,
-0.08367536962032318,
-0.004892615135759115,
-0.23008094727993011,
-0.07713418453931808,
0.02577807381749153,
0.09821038693189621,
0.06503962725400925,
0.03857786953449249,
0.09661226719617844,
0.018732529133558273,
-0.034036148339509964,
0.033393148332834244,
-0.03536134213209152,
0.03310404717922211,
-0.10610250383615494,
-0.16549786925315857,
0.0019973251037299633,
-0.0549360029399395,
-0.044137392193078995,
-0.1950693279504776,
-0.019462531432509422,
0.06244523450732231,
0.13339118659496307,
0.05742226913571358,
-0.0010591814061626792,
-0.01682649552822113,
0.03081430494785309,
-0.023097584024071693,
-0.09181291610002518,
0.02468249201774597,
0.03647327050566673,
-0.08873793482780457,
-0.011666614562273026,
-0.07585269212722778,
0.10581877082586288,
0.12520909309387207,
-0.06843335926532745,
-0.12142641842365265,
0.07460199296474457,
-0.05253007635474205,
-0.00631675124168396,
0.002349586458876729,
0.035930734127759933,
0.14932781457901,
-0.004346433561295271,
0.13757529854774475,
-0.07032624632120132,
-0.03511160612106323,
0.05047822371125221,
-0.015521523542702198,
-0.01979568973183632,
0.035913240164518356,
0.0027324524708092213,
-0.1833677440881729,
0.04199371114373207,
0.08920541405677795,
-0.049839332699775696,
0.13914866745471954,
-0.06322294473648071,
-0.10526957362890244,
-0.040435731410980225,
-0.003971537109464407,
0.01712200790643692,
0.15198226273059845,
-0.13516239821910858,
-0.002448807004839182,
0.020001763477921486,
0.04861287400126457,
0.052144020795822144,
-0.15733100473880768,
0.000903053383808583,
0.02657264657318592,
-0.014197259210050106,
-0.06632860749959946,
0.02329784445464611,
-0.0004769695224240422,
0.10020043700933456,
0.02550916001200676,
-0.0008951162453740835,
0.06926321983337402,
0.011320149526000023,
-0.07638631016016006,
0.19901533424854279,
-0.1467314213514328,
-0.09584100544452667,
-0.09384303539991379,
0.024454353377223015,
-0.03870992735028267,
-0.010036543942987919,
0.04641959071159363,
-0.09593150019645691,
-0.06895992904901505,
-0.08273844420909882,
-0.01947229541838169,
-0.06625710427761078,
0.02866481803357601,
-0.0005173892132006586,
0.006620068568736315,
0.1057065799832344,
-0.15259654819965363,
-0.006194368936121464,
-0.04382551088929176,
-0.07393098622560501,
0.02348046377301216,
-0.021492229774594307,
0.10807638615369797,
0.09026828408241272,
-0.08147580921649933,
0.04721316322684288,
-0.035744599997997284,
0.22878959774971008,
-0.06857644766569138,
-0.006727917585521936,
0.10023995488882065,
-0.018879257142543793,
0.01901995576918125,
0.015658337622880936,
0.024189380928874016,
-0.12256442755460739,
0.056726276874542236,
0.03691736236214638,
-0.04585956037044525,
-0.2560133635997772,
-0.021639179438352585,
-0.038065727800130844,
-0.07889392226934433,
-0.003212108975276351,
0.055965621024370193,
0.04631471261382103,
0.045942142605781555,
0.0933070108294487,
0.08513253927230835,
-0.0868380144238472,
0.051529768854379654,
0.11127965152263641,
0.02980491891503334,
0.05622577294707298,
-0.08263476938009262,
-0.017251063138246536,
0.10282932221889496,
-0.027246374636888504,
0.26580366492271423,
-0.008202507160604,
0.04112936183810234,
0.07672426104545593,
0.10182306915521622,
-0.010517934337258339,
0.11098065227270126,
0.01117370743304491,
-0.04248864948749542,
-0.00007965903205331415,
-0.07191745191812515,
-0.014041700400412083,
0.0032354709692299366,
-0.06953729689121246,
0.026155410334467888,
-0.11348725110292435,
0.018064258620142937,
0.06250427663326263,
0.20843186974525452,
0.00013525942631531507,
-0.3195682466030121,
-0.11751808226108551,
-0.04053555801510811,
-0.014350635930895805,
-0.07346941530704498,
0.016745204105973244,
0.12632963061332703,
-0.06363386660814285,
0.06687761098146439,
-0.05630786716938019,
0.09950637817382812,
0.075123630464077,
0.029512224718928337,
0.05543069541454315,
0.11672279983758926,
-0.028845010325312614,
0.06078215688467026,
-0.2493213713169098,
0.2459743171930313,
0.042057104408741,
0.1364016830921173,
-0.05382177606225014,
-0.005596395581960678,
0.029780015349388123,
0.15307411551475525,
0.12096168845891953,
-0.021183477714657784,
-0.06674792617559433,
-0.10370643436908722,
0.009334265254437923,
0.04369397461414337,
0.13215473294258118,
0.09827324748039246,
0.10166765004396439,
-0.033119428902864456,
0.014264172874391079,
0.0793255865573883,
0.001302505494095385,
-0.1967744678258896,
-0.040792565792798996,
-0.011772015132009983,
-0.030552055686712265,
-0.09040907770395279,
-0.05292781442403793,
-0.07847576588392258,
0.017149221152067184,
0.16364125907421112,
0.07137798517942429,
-0.025404205545783043,
-0.16379833221435547,
0.045807112008333206,
0.12561650574207306,
-0.03199036046862602,
0.007594037335366011,
0.0024651112034916878,
0.06538457423448563,
0.07537294924259186,
-0.16226032376289368,
0.10826295614242554,
-0.0600111223757267,
-0.09968552738428116,
-0.04682717099785805,
0.05288593843579292,
0.048887208104133606,
0.026437044143676758,
0.005724859423935413,
-0.00018844951409846544,
0.018401023000478745,
-0.08952724188566208,
0.010290104895830154,
-0.015928830951452255,
0.035967957228422165,
0.00894547626376152,
-0.07129189372062683,
0.051667001098394394,
0.003796019358560443,
0.03641987219452858,
0.09743807464838028,
0.12266531586647034,
-0.07143652439117432,
0.07250591367483139,
0.0634269192814827,
-0.09397147595882416,
-0.21393603086471558,
0.16932274401187897,
0.026667684316635132,
0.010632794350385666,
0.052016038447618484,
-0.13986805081367493,
0.12862245738506317,
0.02559938281774521,
0.003892185166478157,
0.10301770269870758,
-0.22444748878479004,
-0.11972517520189285,
0.08526059240102768,
0.12386719882488251,
0.1329934298992157,
-0.14234790205955505,
-0.03336917236447334,
-0.03761562332510948,
-0.11214713752269745,
0.15308982133865356,
-0.2477876991033554,
0.09153519570827484,
0.0033412796910852194,
0.09622306376695633,
0.03127804026007652,
-0.023238511756062508,
0.07865002006292343,
-0.018106447532773018,
0.1268618106842041,
-0.06738132238388062,
0.013311697170138359,
0.2068599909543991,
-0.03875884786248207,
0.09318935126066208,
0.0465073436498642,
0.09353198111057281,
-0.04805964231491089,
-0.0061927917413413525,
-0.07684280723333359,
0.06581559777259827,
-0.02719797007739544,
-0.0634843111038208,
-0.044955939054489136,
0.03177792206406593,
0.04567335918545723,
-0.06347323954105377,
0.07298283278942108,
0.014940774999558926,
0.121042899787426,
0.12089387327432632,
0.15989018976688385,
-0.07809792459011078,
-0.028047408908605576,
0.08213891834020615,
-0.04023442044854164,
0.08974238485097885,
-0.17724989354610443,
0.026655973866581917,
0.10567466914653778,
0.016231901943683624,
0.0793977901339531,
0.08142605423927307,
-0.08798188716173172,
0.008103800937533379,
0.05095404386520386,
-0.13253116607666016,
-0.11880045384168625,
-0.025539269670844078,
-0.10343077033758163,
-0.05912601202726364,
0.08565554022789001,
0.15448050200939178,
-0.07783283293247223,
0.032197706401348114,
-0.018044210970401764,
-0.04837100952863693,
-0.10012682527303696,
0.15903432667255402,
0.06190086901187897,
0.015905382111668587,
-0.07573377341032028,
0.13234686851501465,
0.0014006446581333876,
-0.095322385430336,
0.07165666669607162,
0.049196258187294006,
-0.10371402651071548,
-0.047382015734910965,
0.09744113683700562,
0.23853513598442078,
-0.07960844784975052,
-0.05171080678701401,
-0.12550222873687744,
-0.06560046970844269,
0.036747414618730545,
0.25622373819351196,
0.05814148485660553,
0.038842927664518356,
-0.0730467364192009,
0.06037076562643051,
-0.1344287097454071,
0.047449663281440735,
0.02380458638072014,
0.04876532033085823,
-0.14195853471755981,
0.14936664700508118,
-0.024950027465820312,
0.08141843229532242,
-0.08642028272151947,
-0.000026968802558258176,
-0.12110412865877151,
0.013432078994810581,
-0.23245476186275482,
-0.0019067794783040881,
-0.00010405576176708564,
-0.008868852630257607,
0.024723641574382782,
-0.05005665123462677,
-0.07770496606826782,
0.062035996466875076,
-0.09100112318992615,
-0.0012115085264667869,
0.06227222830057144,
0.009007973596453667,
-0.11385530233383179,
-0.007141543086618185,
-0.014925687573850155,
-0.05697162449359894,
0.04791596159338951,
0.07905519008636475,
-0.052133362740278244,
0.08697809278964996,
-0.16012950241565704,
-0.0495542511343956,
0.030765146017074585,
0.009232858195900917,
0.11873548477888107,
0.014852982014417648,
0.0004343315085861832,
0.01679658703505993,
0.08480148762464523,
0.015068808570504189,
0.07301311194896698,
-0.0880047008395195,
-0.07168222218751907,
-0.034194234758615494,
-0.029563933610916138,
-0.05378780514001846,
0.032282620668411255,
0.11681121587753296,
0.061346836388111115,
0.13183876872062683,
-0.0906350240111351,
-0.0015228706179186702,
-0.1510031819343567,
-0.04521048814058304,
0.016071248799562454,
-0.06177860125899315,
-0.05475685000419617,
-0.021728571504354477,
0.08624814450740814,
-0.09349093586206436,
0.15884771943092346,
0.041461601853370667,
0.14131136238574982,
0.03981735557317734,
-0.04563425853848457,
-0.05062068626284599,
0.01389279030263424,
0.20510761439800262,
0.038920119404792786,
-0.013324340805411339,
-0.026896554976701736,
0.027844484895467758,
0.04623466730117798,
-0.023268265649676323,
0.20883868634700775,
0.08550484478473663,
-0.05048339441418648,
0.10927357524633408,
0.06326626986265182,
-0.019413907080888748,
-0.12683793902397156,
-0.01819358393549919,
-0.0295631755143404,
0.1231108233332634,
-0.049747321754693985,
0.012826338410377502,
0.09065619856119156,
-0.09239435195922852,
0.06752060353755951,
-0.08482906967401505,
-0.08112703263759613,
-0.11701322346925735,
-0.04354199022054672,
-0.08752430975437164,
-0.16030535101890564,
0.015457799658179283,
-0.11223757266998291,
0.03245595842599869,
0.022374944761395454,
0.01699022762477398,
-0.025875309482216835,
0.20471477508544922,
-0.05985157936811447,
-0.00330662471242249,
0.10179409384727478,
-0.016131427139043808,
-0.03318452462553978,
-0.00969407893717289,
-0.009427365846931934,
0.033018067479133606,
-0.004277227446436882,
0.006576508283615112,
-0.0061967214569449425,
-0.0143381142988801,
0.027685366570949554,
-0.03446083515882492,
-0.0717952623963356,
0.022047093138098717,
0.05287734419107437,
-0.026782697066664696,
-0.031412478536367416,
0.05223459005355835,
-0.029221413657069206,
-0.026273930445313454,
0.22719502449035645,
-0.09513000398874283,
-0.06384878605604172,
-0.14851312339305878,
0.30280622839927673,
0.04129720851778984,
0.0209877360612154,
0.04056752100586891,
-0.10502194613218307,
-0.02937638945877552,
0.2401113361120224,
0.17917627096176147,
-0.07916205376386642,
-0.01803823933005333,
0.01685812883079052,
-0.00685715489089489,
-0.07082871347665787,
0.17770381271839142,
0.05924738571047783,
0.05653231590986252,
-0.03604907542467117,
-0.055104367434978485,
-0.02859291061758995,
-0.02827109955251217,
-0.028066005557775497,
0.04055353254079819,
0.06730540841817856,
0.004906122572720051,
-0.04418191686272621,
0.07832448929548264,
-0.079932302236557,
-0.20155055820941925,
0.07155507802963257,
-0.12604333460330963,
-0.11611264944076538,
-0.05650368705391884,
0.021103765815496445,
-0.02264290675520897,
0.0833858773112297,
-0.04980507120490074,
0.00422576954588294,
0.12298764288425446,
-0.018283354118466377,
-0.0781981572508812,
-0.06314407289028168,
0.10275523364543915,
-0.12192071974277496,
0.18640513718128204,
0.003135308623313904,
0.05088664963841438,
0.09995151311159134,
0.04302959144115448,
-0.09058142453432083,
0.08247707039117813,
0.030202260240912437,
-0.06723043322563171,
0.00969358254224062,
0.10742644965648651,
-0.0380169041454792,
0.029642710462212563,
0.007947739213705063,
-0.13625313341617584,
0.004396167118102312,
-0.030201364308595657,
-0.03588106855750084,
-0.09245898574590683,
-0.052820149809122086,
-0.10280054062604904,
0.12360966205596924,
0.20718570053577423,
-0.020632591098546982,
0.05698539689183235,
-0.09856053441762924,
0.06344807893037796,
0.0716898962855339,
0.007729341275990009,
-0.05309249088168144,
-0.15844103693962097,
0.007482409477233887,
0.10021879523992538,
-0.033031318336725235,
-0.2637544870376587,
-0.05120411887764931,
0.04252644628286362,
-0.029841262847185135,
-0.028907235711812973,
0.08706974238157272,
0.11277445405721664,
0.06438367068767548,
-0.050378601998090744,
-0.05564263090491295,
-0.032744958996772766,
0.11746001988649368,
-0.13627836108207703,
-0.06063050776720047
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# transformers-qa
This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.9300
- Validation Loss: 1.1437
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
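A minimal extractive question-answering sketch, assuming the `Rocketknight1/transformers-qa` repo id recorded in this row's metadata:

    # Hedged usage sketch via the question-answering pipeline.
    from transformers import pipeline

    qa = pipeline("question-answering", model="Rocketknight1/transformers-qa")
    result = qa(
        question="What base model was fine-tuned?",
        context="transformers-qa is a fine-tuned version of distilbert-base-cased.",
    )
    print(result["answer"], result["score"])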
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 5e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: mixed_float16
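These settings correspond to stock Keras components; a hedged reconstruction of the optimizer and the `mixed_float16` policy follows (the global policy must be set before the model is built):

    # Reconstruction of the serialized training setup listed above.
    import tensorflow as tf

    tf.keras.mixed_precision.set_global_policy("mixed_float16")  # training_precision
    optimizer = tf.keras.optimizers.Adam(
        learning_rate=5e-05, beta_1=0.9, beta_2=0.999, epsilon=1e-07, amsgrad=False
    )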
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 1.5145 | 1.1500 | 0 |
| 0.9300 | 1.1437 | 1 |
### Framework versions
- Transformers 4.16.0.dev0
- TensorFlow 2.6.0
- Datasets 1.16.2.dev0
- Tokenizers 0.10.3
| {"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "transformers-qa", "results": []}]} | question-answering | Rocketknight1/transformers-qa | [
"transformers",
"tf",
"distilbert",
"question-answering",
"generated_from_keras_callback",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #tf #distilbert #question-answering #generated_from_keras_callback #license-apache-2.0 #endpoints_compatible #region-us
| transformers-qa
===============
This model is a fine-tuned version of distilbert-base-cased on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 0.9300
* Validation Loss: 1.1437
* Epoch: 1
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'Adam', 'learning\_rate': 5e-05, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
* training\_precision: mixed\_float16
### Training results
### Framework versions
* Transformers 4.16.0.dev0
* TensorFlow 2.6.0
* Datasets 1.16.2.dev0
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': 5e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}\n* training\\_precision: mixed\\_float16",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* TensorFlow 2.6.0\n* Datasets 1.16.2.dev0\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #tf #distilbert #question-answering #generated_from_keras_callback #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': 5e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}\n* training\\_precision: mixed\\_float16",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* TensorFlow 2.6.0\n* Datasets 1.16.2.dev0\n* Tokenizers 0.10.3"
] | [
49,
103,
4,
40
] | [
"passage: TAGS\n#transformers #tf #distilbert #question-answering #generated_from_keras_callback #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': 5e-05, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}\n* training\\_precision: mixed\\_float16### Training results### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* TensorFlow 2.6.0\n* Datasets 1.16.2.dev0\n* Tokenizers 0.10.3"
] | [
-0.0560205839574337,
-0.002418432617560029,
-0.0026303990744054317,
0.08025097101926804,
0.15722501277923584,
0.030229398980736732,
0.09985028952360153,
0.10531293600797653,
-0.13017107546329498,
0.042915619909763336,
0.13717935979366302,
0.17906680703163147,
0.014009634964168072,
0.06554858386516571,
-0.11579187214374542,
-0.14215101301670074,
0.04824719950556755,
0.009832045994699001,
-0.09302599728107452,
0.10004839301109314,
0.08794359117746353,
-0.10352345556020737,
0.09275210648775101,
-0.0006828158511780202,
-0.207704558968544,
0.04072770103812218,
0.09550140053033829,
-0.05273040756583214,
0.12886390089988708,
0.08130159974098206,
0.09680001437664032,
-0.013614941388368607,
0.010244440287351608,
-0.17950117588043213,
0.024157950654625893,
0.10666236281394958,
-0.019139444455504417,
0.05415240302681923,
0.005411123391240835,
-0.004084476735442877,
0.1242498978972435,
-0.088468037545681,
0.022896461188793182,
0.03637628257274628,
-0.13426730036735535,
-0.26945358514785767,
-0.14704415202140808,
-0.023744424805045128,
0.05629858002066612,
0.10273698717355728,
0.010641612112522125,
0.2347331941127777,
-0.07468851655721664,
0.10358668118715286,
0.19149576127529144,
-0.34202906489372253,
-0.05654970929026604,
0.028056424111127853,
0.021532874554395676,
0.056108277291059494,
-0.0397518016397953,
0.029864342883229256,
0.07120911777019501,
0.0509246326982975,
0.042325276881456375,
-0.05260032042860985,
-0.15278886258602142,
0.018765702843666077,
-0.11026234924793243,
-0.01807563193142414,
0.1346423625946045,
0.05552765354514122,
-0.06417787075042725,
0.0032143532298505306,
-0.06596345454454422,
-0.0738607794046402,
0.005186489783227444,
-0.07339062541723251,
0.048225030303001404,
0.00010412118717795238,
-0.05625290796160698,
-0.057901035994291306,
-0.08288222551345825,
-0.07105965912342072,
-0.15043562650680542,
0.18204565346240997,
0.01665675826370716,
0.07542099803686142,
-0.06691920757293701,
0.044555798172950745,
-0.09147974103689194,
-0.11503010243177414,
-0.025004280731081963,
0.008431501686573029,
-0.04233505576848984,
-0.05124465376138687,
-0.11655070632696152,
-0.13417403399944305,
0.07381439954042435,
0.12736792862415314,
-0.07246466726064682,
0.06783448904752731,
-0.07055968791246414,
0.017157210037112236,
-0.10624846071004868,
0.15285050868988037,
-0.043563101440668106,
0.027693048119544983,
0.035650163888931274,
-0.019001049920916557,
0.05162462219595909,
-0.02659088745713234,
-0.07960272580385208,
-0.011790702119469643,
0.054378245025873184,
0.01626015082001686,
-0.04286421835422516,
0.08616092801094055,
-0.05606262758374214,
-0.006152672227472067,
-0.029334215447306633,
-0.08310806751251221,
0.029783230274915695,
-0.020815202966332436,
-0.08917870372533798,
0.00476298900321126,
0.05830110237002373,
0.025871653109788895,
-0.026843614876270294,
0.03834218531847,
-0.0645967423915863,
0.0004604335408657789,
-0.09376750141382217,
-0.1303868591785431,
0.03306284174323082,
-0.06255125999450684,
0.012203597463667393,
-0.09328725934028625,
-0.18354365229606628,
-0.007020158693194389,
0.06491771340370178,
-0.0305482130497694,
0.022186679765582085,
-0.03575894609093666,
-0.13872937858104706,
0.009764713235199451,
-0.027675891295075417,
0.16343951225280762,
-0.06848205626010895,
0.06491563469171524,
0.0213212501257658,
0.05676294490695,
-0.14638209342956543,
0.03440787270665169,
-0.05795656517148018,
0.01476342137902975,
-0.19467906653881073,
0.05926452949643135,
-0.056711018085479736,
0.051990807056427,
-0.14505741000175476,
-0.07771581411361694,
0.036586660891771317,
0.0009998331079259515,
0.11722288280725479,
0.08008615672588348,
-0.18466340005397797,
-0.028711706399917603,
0.11367949098348618,
-0.07042644917964935,
-0.15195722877979279,
0.10645798593759537,
-0.07271182537078857,
0.036571454256772995,
0.09556283801794052,
0.12695284187793732,
-0.016983214765787125,
-0.1353643536567688,
0.04608805105090141,
-0.023968173190951347,
-0.04568379744887352,
-0.0034623462706804276,
0.01625724323093891,
-0.014489000663161278,
-0.061375074088573456,
0.009230077266693115,
-0.04527248442173004,
0.04140494018793106,
-0.08960798382759094,
-0.06622669100761414,
-0.06991686671972275,
-0.07898475974798203,
0.020731240510940552,
0.030311862006783485,
0.061801496893167496,
-0.12336945533752441,
-0.08818698674440384,
0.07346168160438538,
0.020408805459737778,
-0.030608192086219788,
0.04398737847805023,
-0.10066814720630646,
0.025507397949695587,
-0.01704735867679119,
0.005670130252838135,
-0.19405825436115265,
-0.05059003829956055,
0.0044008903205394745,
0.04803657531738281,
0.04244350641965866,
0.008852365426719189,
0.07959823310375214,
-0.01157171931117773,
-0.06843322515487671,
0.04412665590643883,
-0.0071128057315945625,
0.026362139731645584,
-0.08963628113269806,
-0.22129957377910614,
0.022143110632896423,
-0.02460654079914093,
0.03653353825211525,
-0.1951088160276413,
-0.0057565076276659966,
0.035395652055740356,
0.10974722355604172,
0.02398575283586979,
0.02131757326424122,
-0.05719045177102089,
0.05472053214907646,
-0.012958219274878502,
-0.05189700424671173,
0.04107491672039032,
0.025872113183140755,
-0.1407678723335266,
-0.009984780102968216,
-0.0995648130774498,
0.125592902302742,
0.14908529818058014,
-0.1301705688238144,
-0.09362197667360306,
0.059510331600904465,
-0.022875282913446426,
-0.021505309268832207,
0.0032549973111599684,
0.0375385619699955,
0.17206014692783356,
-0.0009343381971120834,
0.10778315365314484,
-0.06961531937122345,
-0.014621414244174957,
0.019046267494559288,
-0.031412720680236816,
0.018072351813316345,
0.08811971545219421,
0.011928294785320759,
-0.16614387929439545,
0.08559371531009674,
0.16500802338123322,
-0.10355325788259506,
0.0629429742693901,
-0.071048304438591,
-0.062133487313985825,
-0.06883980333805084,
0.030608631670475006,
0.03105250559747219,
0.08345462381839752,
-0.09106992185115814,
0.03112761490046978,
0.026752768084406853,
0.04913220927119255,
-0.0028438663575798273,
-0.21667683124542236,
-0.025226419791579247,
0.0014444205444306135,
-0.05136395990848541,
-0.029102768748998642,
0.03010440804064274,
0.02388932928442955,
0.11797575652599335,
0.02615983970463276,
-0.02982739545404911,
0.08719627559185028,
-0.02212563157081604,
-0.08119475096464157,
0.21650514006614685,
-0.14578668773174286,
-0.07074130326509476,
-0.07053297013044357,
-0.07490777224302292,
-0.08147558569908142,
0.0018249062122777104,
0.04203668609261513,
-0.10097763687372208,
-0.05846589058637619,
-0.0530540831387043,
0.002052081748843193,
-0.0039020210970193148,
0.03650268167257309,
0.028743045404553413,
-0.022301873192191124,
0.1052442342042923,
-0.12737300992012024,
-0.025095254182815552,
-0.042411018162965775,
-0.07317005842924118,
0.04616513475775719,
0.01581631414592266,
0.0715690478682518,
0.10442835092544556,
-0.0312674380838871,
0.01585358940064907,
-0.03762851655483246,
0.2696749269962311,
-0.059039268642663956,
-0.04164409637451172,
0.15314224362373352,
-0.014709390699863434,
0.042831290513277054,
0.11881425976753235,
0.039541129022836685,
-0.14290577173233032,
0.05279567837715149,
0.05596780404448509,
-0.033864084631204605,
-0.26525503396987915,
-0.0165663193911314,
-0.05159664526581764,
-0.12609095871448517,
-0.00744816567748785,
0.027017788961529732,
0.12212789058685303,
0.02781151607632637,
0.05267371982336044,
0.08275361359119415,
-0.02215113863348961,
0.04261283576488495,
0.21106062829494476,
0.06191035360097885,
0.09974709153175354,
-0.06768078356981277,
-0.011060577817261219,
0.059656400233507156,
-0.02380860224366188,
0.23835431039333344,
0.034322626888751984,
0.07715272158384323,
0.09715382754802704,
0.11434613168239594,
-0.03649337217211723,
0.02003570646047592,
0.014728082343935966,
-0.05446600168943405,
-0.01559468638151884,
-0.0528271310031414,
-0.028317004442214966,
0.03405347093939781,
-0.07517407834529877,
0.06232092157006264,
-0.0607229545712471,
0.005515232216566801,
0.07664861530065536,
0.25558385252952576,
0.03343106433749199,
-0.29370391368865967,
-0.09632039070129395,
-0.00046050685341469944,
-0.023717790842056274,
-0.007978295907378197,
-0.005102307070046663,
0.08920378983020782,
-0.06417986005544662,
0.09684471040964127,
-0.054909974336624146,
0.09224148094654083,
0.05015968903899193,
0.053238362073898315,
0.05639377608895302,
0.10555032640695572,
0.0031932303681969643,
0.025525042787194252,
-0.3718765676021576,
0.29405978322029114,
0.040795326232910156,
0.13602100312709808,
-0.07739893347024918,
-0.0009130114340223372,
0.031486786901950836,
0.041863813996315,
0.07602281123399734,
-0.019332796335220337,
-0.11702369898557663,
-0.09787978231906891,
0.009675762616097927,
0.050247687846422195,
0.13340000808238983,
0.09337174147367477,
0.09124667942523956,
-0.027635876089334488,
0.031089723110198975,
0.10350780189037323,
0.04772308096289635,
-0.14772464334964752,
-0.03631049394607544,
-0.0038181862328201532,
0.05879906937479973,
-0.06328358501195908,
-0.0599665530025959,
-0.0712435394525528,
-0.08710522949695587,
0.1444803774356842,
-0.0113504184409976,
-0.017544610425829887,
-0.12067680060863495,
0.09318768978118896,
0.07900911569595337,
-0.05778263509273529,
0.0358094647526741,
0.0033542669843882322,
0.0038031935691833496,
0.07640405744314194,
-0.12723082304000854,
0.14763766527175903,
-0.028005210682749748,
-0.1480962038040161,
-0.042087145149707794,
0.060514625161886215,
0.0440530963242054,
0.05267896130681038,
-0.002035805257037282,
0.057307176291942596,
0.002366678323596716,
-0.10920052230358124,
0.06652285158634186,
-0.015145629644393921,
0.059369005262851715,
0.05820886418223381,
0.009085483849048615,
0.0060948519967496395,
-0.036320604383945465,
-0.007277782540768385,
0.15405422449111938,
0.2518981397151947,
-0.08482173830270767,
0.005804851185530424,
-0.0010985229164361954,
-0.0561298206448555,
-0.21080198884010315,
0.13316090404987335,
0.0704360231757164,
-0.008989009074866772,
0.002673908369615674,
-0.11617371439933777,
0.11464446783065796,
0.09597979485988617,
-0.010063733905553818,
0.1058446541428566,
-0.2839157283306122,
-0.1454809606075287,
0.08081866800785065,
0.15456274151802063,
0.19795596599578857,
-0.16811908781528473,
-0.0286630280315876,
-0.06692911684513092,
-0.156484916806221,
0.16271230578422546,
-0.18253347277641296,
0.09238219261169434,
0.019603991881012917,
0.08060051500797272,
-0.013776533305644989,
-0.03169943392276764,
0.12233664095401764,
-0.01076753344386816,
0.14524082839488983,
-0.0656486451625824,
-0.007573840674012899,
0.12243729084730148,
-0.029270699247717857,
0.012182499282062054,
-0.028029222041368484,
0.046213626861572266,
-0.039432380348443985,
0.00886428914964199,
-0.09777025878429413,
0.041057221591472626,
-0.022506970912218094,
-0.03688355162739754,
-0.04618161544203758,
0.02369719371199608,
0.07643724977970123,
-0.045349400490522385,
0.14935153722763062,
-0.009030936285853386,
0.16589008271694183,
0.13341864943504333,
0.0876699686050415,
-0.08063892275094986,
0.019785339012742043,
0.06593264639377594,
-0.03552763909101486,
0.08063407987356186,
-0.19060418009757996,
0.05160127580165863,
0.12565501034259796,
0.013335070572793484,
0.13009440898895264,
0.07251360267400742,
-0.03856585547327995,
0.034945759922266006,
0.040512729436159134,
-0.14471301436424255,
-0.1859252005815506,
0.05350371450185776,
-0.06388237327337265,
-0.05052581802010536,
0.08622036874294281,
0.12672491371631622,
-0.02507946267724037,
0.027070432901382446,
0.006286080460995436,
-0.007125355303287506,
-0.10614338517189026,
0.15085673332214355,
0.040887460112571716,
0.011470618657767773,
-0.10642078518867493,
0.12953691184520721,
0.024085011333227158,
-0.11654229462146759,
0.06638681143522263,
0.0006776695372536778,
-0.07037591934204102,
-0.018408311530947685,
0.08381194621324539,
0.14758248627185822,
-0.035785336047410965,
-0.06521692126989365,
-0.1125718504190445,
-0.14966264367103577,
0.06831643730401993,
0.2488352656364441,
0.08133062720298767,
0.027447961270809174,
-0.02802058681845665,
0.004038089420646429,
-0.07274803519248962,
0.036970388144254684,
0.030177365988492966,
0.046024542301893234,
-0.12557797133922577,
0.13737091422080994,
-0.01831512711942196,
0.04089993238449097,
-0.04298781976103783,
0.03682680428028107,
-0.14005005359649658,
0.027570674195885658,
-0.23432514071464539,
-0.0045218635350465775,
0.00020643752941396087,
0.0008803969249129295,
0.031195838004350662,
-0.1004539281129837,
-0.10949092358350754,
0.05418764054775238,
-0.11510977894067764,
-0.0231185220181942,
0.07809601724147797,
0.025762198492884636,
-0.1368139088153839,
-0.08442704379558563,
0.027767730876803398,
-0.048361413180828094,
0.040526337921619415,
0.09838085621595383,
-0.022388627752661705,
0.08459567278623581,
-0.17824342846870422,
-0.04325751215219498,
0.07347475737333298,
0.011151736602187157,
0.09736519306898117,
-0.10900003463029861,
-0.017523659393191338,
0.033048246055841446,
0.0791419968008995,
0.03166784346103668,
0.09359540045261383,
-0.08226548880338669,
-0.062482789158821106,
-0.03780898079276085,
-0.04858308285474777,
-0.05171424150466919,
0.011959613300859928,
0.14262481033802032,
0.03945186734199524,
0.19054293632507324,
-0.09607832133769989,
-0.003079791786149144,
-0.16354815661907196,
-0.004070286173373461,
-0.009839221835136414,
-0.11889459937810898,
-0.11960122734308243,
-0.02058303728699684,
0.09137782454490662,
-0.08378015458583832,
0.13281744718551636,
-0.05370113253593445,
0.11144232749938965,
0.05478651821613312,
-0.025554273277521133,
-0.03826647251844406,
0.0351594015955925,
0.23071280121803284,
0.034017015248537064,
-0.017376577481627464,
0.046236827969551086,
0.025216737762093544,
0.07460420578718185,
0.09721656143665314,
0.25825628638267517,
0.16581907868385315,
0.008947636000812054,
0.12744539976119995,
0.05422808602452278,
-0.039740342646837234,
-0.055301912128925323,
0.07989360392093658,
-0.05011181905865669,
0.12843532860279083,
-0.0376928374171257,
0.10338332504034042,
0.0642874538898468,
-0.16131065785884857,
0.03852668032050133,
-0.10313770174980164,
-0.07934681326150894,
-0.12532232701778412,
-0.037925757467746735,
-0.10269522666931152,
-0.16438299417495728,
0.009655917063355446,
-0.10762536525726318,
0.06008405238389969,
0.07815069705247879,
0.032409146428108215,
-0.02624526619911194,
0.14844250679016113,
-0.04954216256737709,
0.009234938770532608,
0.06678292900323868,
-0.03381458297371864,
-0.026596615090966225,
-0.0355730839073658,
-0.05672857537865639,
0.05465627461671829,
-0.02779877372086048,
0.031756218522787094,
0.004592373035848141,
-0.03659701719880104,
0.040107082575559616,
-0.07247565686702728,
-0.0910191759467125,
0.03657262399792671,
0.08032097667455673,
0.0211739931255579,
0.04944978654384613,
0.05565307289361954,
0.018206151202321053,
0.00018443762382958084,
0.18153157830238342,
-0.10057422518730164,
-0.0755670964717865,
-0.14727526903152466,
0.2534328103065491,
0.006234509404748678,
0.033064208924770355,
0.01976417936384678,
-0.06945721805095673,
-0.04020716995000839,
0.19868789613246918,
0.14638596773147583,
-0.12314761430025101,
-0.013912736438214779,
0.0264398455619812,
-0.0015984942438080907,
-0.06445702165365219,
0.12392017990350723,
0.10560427606105804,
-0.03468141332268715,
-0.07087985426187515,
-0.07133042812347412,
-0.03748933970928192,
-0.0004816229920834303,
-0.026780184358358383,
0.032487399876117706,
0.03644264489412308,
-0.027630049735307693,
-0.007502132095396519,
0.07312962412834167,
-0.05903155729174614,
-0.13659678399562836,
0.06570878624916077,
-0.18381623923778534,
-0.1486893594264984,
-0.012934330850839615,
0.036340612918138504,
-0.0018911767983809114,
0.0677814707159996,
-0.044166285544633865,
0.007278576027601957,
0.08672644197940826,
-0.03799257054924965,
-0.047023892402648926,
-0.10488898307085037,
0.11668149381875992,
-0.09045206010341644,
0.14757657051086426,
-0.009246126748621464,
0.0946802943944931,
0.13482438027858734,
0.06086001172661781,
-0.078014075756073,
0.07762212306261063,
0.05656643584370613,
-0.12365353107452393,
-0.007189169526100159,
0.04020306468009949,
-0.029600463807582855,
0.06439273804426193,
0.06679892539978027,
-0.10937763750553131,
0.04434238001704216,
-0.055381692945957184,
-0.08081886917352676,
-0.06932993978261948,
-0.0558096244931221,
-0.10713037103414536,
0.11259463429450989,
0.2233796864748001,
-0.02710011787712574,
0.06105910986661911,
-0.062270477414131165,
0.0003842839796561748,
0.07427120953798294,
0.03304879739880562,
-0.0819774866104126,
-0.19776351749897003,
0.07393528521060944,
0.13217361271381378,
-0.005238606594502926,
-0.15532884001731873,
-0.08634553849697113,
0.017362967133522034,
-0.045081041753292084,
-0.051315225660800934,
0.07827610522508621,
0.09169608354568481,
0.04854103550314903,
-0.054422661662101746,
-0.20633670687675476,
-0.04633927717804909,
0.15963754057884216,
-0.09062530845403671,
-0.08681656420230865
] |
null | null | null | # Configuration
`title`: _string_
Display title for the Space
`emoji`: _string_
Space emoji (emoji-only character allowed)
`colorFrom`: _string_
Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)
`colorTo`: _string_
Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)
`sdk`: _string_
Can be either `gradio` or `streamlit`
`app_file`: _string_
Path to your main application file (which contains either `gradio` or `streamlit` Python code).
Path is relative to the root of the repository.
`pinned`: _boolean_
Whether the Space stays on top of your list. | {"title": "CLIP-Guided-Diffusion", "emoji": "\ud83d\udca9", "colorFrom": "purple", "colorTo": "red", "sdk": "gradio", "app_file": "app.py", "pinned": false} | null | Rodrigo/teste5 | [
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#region-us
| # Configuration
'title': _string_
Display title for the Space
'emoji': _string_
Space emoji (emoji-only character allowed)
'colorFrom': _string_
Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)
'colorTo': _string_
Color for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)
'sdk': _string_
Can be either 'gradio' or 'streamlit'
'app_file': _string_
Path to your main application file (which contains either 'gradio' or 'streamlit' Python code).
Path is relative to the root of the repository.
'pinned': _boolean_
Whether the Space stays on top of your list. | [
"# Configuration\r\n'title': _string_ \r\nDisplay title for the Space\r\n'emoji': _string_ \r\nSpace emoji (emoji-only character allowed)\r\n'colorFrom': _string_ \r\nColor for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)\r\n'colorTo': _string_ \r\nColor for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)\r\n'sdk': _string_ \r\nCan be either 'gradio' or 'streamlit'\r\n'app_file': _string_ \r\nPath to your main application file (which contains either 'gradio' or 'streamlit' Python code). \r\nPath is relative to the root of the repository.\r\n\r\n'pinned': _boolean_ \r\nWhether the Space stays on top of your list."
] | [
"TAGS\n#region-us \n",
"# Configuration\r\n'title': _string_ \r\nDisplay title for the Space\r\n'emoji': _string_ \r\nSpace emoji (emoji-only character allowed)\r\n'colorFrom': _string_ \r\nColor for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)\r\n'colorTo': _string_ \r\nColor for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)\r\n'sdk': _string_ \r\nCan be either 'gradio' or 'streamlit'\r\n'app_file': _string_ \r\nPath to your main application file (which contains either 'gradio' or 'streamlit' Python code). \r\nPath is relative to the root of the repository.\r\n\r\n'pinned': _boolean_ \r\nWhether the Space stays on top of your list."
] | [
6,
192
] | [
"passage: TAGS\n#region-us \n# Configuration\r\n'title': _string_ \r\nDisplay title for the Space\r\n'emoji': _string_ \r\nSpace emoji (emoji-only character allowed)\r\n'colorFrom': _string_ \r\nColor for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)\r\n'colorTo': _string_ \r\nColor for Thumbnail gradient (red, yellow, green, blue, indigo, purple, pink, gray)\r\n'sdk': _string_ \r\nCan be either 'gradio' or 'streamlit'\r\n'app_file': _string_ \r\nPath to your main application file (which contains either 'gradio' or 'streamlit' Python code). \r\nPath is relative to the root of the repository.\r\n\r\n'pinned': _boolean_ \r\nWhether the Space stays on top of your list."
] | [
0.012548225931823254,
0.054766036570072174,
-0.005494763143360615,
-0.007291278801858425,
0.11097221821546555,
0.005342544987797737,
-0.009601701982319355,
0.08150327205657959,
0.10279306024312973,
0.108857162296772,
0.08700243383646011,
0.17721374332904816,
-0.004380662459880114,
0.059579528868198395,
0.0025956854224205017,
-0.24149034917354584,
0.06376194953918457,
-0.07534189522266388,
0.048147860914468765,
0.07315045595169067,
0.05222964659333229,
0.0017718472518026829,
0.0978437215089798,
0.04616836830973625,
-0.19165746867656708,
0.002507532015442848,
-0.01238244865089655,
-0.029938803985714912,
-0.006547658238559961,
-0.02601481042802334,
0.08664311468601227,
-0.060289740562438965,
-0.062265004962682724,
-0.13372035324573517,
0.018562516197562218,
0.09759262949228287,
-0.013191205449402332,
-0.02587929554283619,
0.17736153304576874,
-0.15595154464244843,
0.23208560049533844,
-0.07294197380542755,
0.08909338712692261,
0.006474909372627735,
-0.014912852086126804,
-0.13798052072525024,
0.05810157209634781,
-0.10320951789617538,
0.15774914622306824,
0.04108361154794693,
0.01260764803737402,
-0.0013002724153921008,
-0.15180975198745728,
0.10551553964614868,
0.08879764378070831,
0.030859213322401047,
-0.006310253404080868,
0.13923661410808563,
0.08914373070001602,
0.05497303977608681,
-0.08396018296480179,
-0.00926304142922163,
-0.015054180286824703,
0.006376792211085558,
-0.04494267329573631,
-0.059503354132175446,
-0.06850329786539078,
0.03937774524092674,
-0.08996736258268356,
-0.011980795301496983,
0.20609253644943237,
-0.0025977655313909054,
-0.059395112097263336,
-0.03980208933353424,
-0.05355312302708626,
-0.08221033215522766,
0.004038924816995859,
0.08800165355205536,
0.03512781858444214,
0.09463484585285187,
0.09114958345890045,
0.026578038930892944,
-0.10862688720226288,
-0.0338234007358551,
-0.03029523231089115,
0.15631739795207977,
-0.01995762810111046,
0.03678907826542854,
-0.12062539160251617,
0.043854836374521255,
-0.06946684420108795,
-0.07501384615898132,
0.05113285407423973,
-0.09483730047941208,
0.00989114586263895,
0.10602143406867981,
-0.09671590477228165,
-0.2013750672340393,
0.11144860088825226,
0.13035961985588074,
0.16560186445713043,
0.13418817520141602,
-0.036099422723054886,
0.07848802208900452,
0.07400429993867874,
0.16311466693878174,
-0.0593266598880291,
-0.034447140991687775,
-0.016311582177877426,
-0.1892862170934677,
0.061867184937000275,
-0.11341200768947601,
-0.13508251309394836,
0.024402080103754997,
-0.032652489840984344,
0.017639650031924248,
0.051559869199991226,
0.03118312917649746,
-0.11291158199310303,
-0.060339897871017456,
0.08965741842985153,
-0.02822756953537464,
0.07584074884653091,
0.016999324783682823,
-0.020283903926610947,
0.04872927442193031,
-0.044725313782691956,
0.015907974913716316,
0.05946936085820198,
0.18656909465789795,
-0.06873693317174911,
0.005064425989985466,
-0.19148176908493042,
-0.08754384517669678,
0.022430801764130592,
-0.05035243555903435,
0.070250503718853,
-0.08056901395320892,
-0.03265110403299332,
-0.06609731912612915,
0.06727933883666992,
0.036936402320861816,
0.09224827587604523,
0.012200449593365192,
-0.10960797220468521,
0.11638354510068893,
0.06192856654524803,
-0.04701443761587143,
-0.04949820414185524,
0.04986336827278137,
-0.05926838517189026,
0.09111497551202774,
-0.12658943235874176,
0.017447002232074738,
-0.06419092416763306,
0.03846186026930809,
-0.31984981894493103,
-0.021217940375208855,
-0.05778733640909195,
0.16408580541610718,
0.025673948228359222,
-0.032281093299388885,
0.0686369240283966,
-0.018420042470097542,
-0.06795413792133331,
0.02193503826856613,
-0.23543408513069153,
-0.017026180401444435,
0.1549483686685562,
-0.022251084446907043,
0.01633167266845703,
0.07991275936365128,
0.009538074024021626,
-0.17232534289360046,
-0.011852901428937912,
0.4425787329673767,
0.14412623643875122,
-0.11055953800678253,
0.044809237122535706,
-0.0044596330262720585,
-0.17890992760658264,
-0.01597258634865284,
0.10547800362110138,
-0.03777178376913071,
-0.1122899129986763,
0.07081657648086548,
-0.10039163380861282,
0.10317476093769073,
0.07347060739994049,
0.09628340601921082,
-0.0238035935908556,
0.022192230448126793,
0.12196392565965652,
0.006539805792272091,
-0.11059495061635971,
-0.14409470558166504,
-0.027005238458514214,
-0.03551199659705162,
0.1538730263710022,
-0.025064995512366295,
-0.001562215038575232,
-0.06940685212612152,
0.15811333060264587,
0.05910975858569145,
-0.04049052298069,
-0.08875393867492676,
0.02463158592581749,
0.024762842804193497,
0.09956438839435577,
-0.04225229471921921,
-0.049124013632535934,
0.002257999964058399,
0.035572540014982224,
0.0744512751698494,
-0.07388351112604141,
-0.004435575567185879,
-0.043466031551361084,
0.0839357003569603,
-0.02680456079542637,
0.0647156611084938,
-0.04076450318098068,
-0.05382281541824341,
-0.08102378994226456,
-0.003210409078747034,
0.12122425436973572,
0.05055805295705795,
0.10437069088220596,
-0.10817497223615646,
0.019461076706647873,
-0.10245491564273834,
-0.06412477791309357,
-0.03568374365568161,
-0.05880965292453766,
0.028168095275759697,
0.10483749210834503,
0.08230388909578323,
-0.14047519862651825,
0.0687602311372757,
0.11449714004993439,
-0.0853850468993187,
0.050805289298295975,
0.06431783735752106,
0.003938168752938509,
0.08253483474254608,
0.03871443122625351,
-0.01803617738187313,
-0.008819367736577988,
0.05388985201716423,
0.015390482731163502,
-0.0559665746986866,
0.002408711239695549,
0.013646505773067474,
-0.1427820920944214,
0.015211686491966248,
0.037150077521800995,
0.13400578498840332,
0.00658954493701458,
0.04841287434101105,
0.03252217918634415,
0.02562626451253891,
0.17207804322242737,
0.029288558289408684,
-0.0029730359092354774,
-0.02573099173605442,
-0.014820235781371593,
-0.04491903632879257,
0.041220661252737045,
-0.10889700055122375,
0.0031714155338704586,
0.031193725764751434,
0.012164357118308544,
0.017852136865258217,
-0.09065146744251251,
-0.07695957273244858,
-0.012945735827088356,
0.03183868154883385,
0.03946252912282944,
0.10097306966781616,
0.01237888541072607,
-0.008361270651221275,
-0.09183105826377869,
-0.05867232009768486,
-0.06950502842664719,
-0.05365058034658432,
-0.018440624698996544,
0.09993549436330795,
-0.23669518530368805,
-0.29223376512527466,
-0.07767502218484879,
-0.2350168377161026,
-0.010042339563369751,
0.10128766298294067,
0.10854319483041763,
-0.13289636373519897,
-0.059031806886196136,
-0.0020262831822037697,
0.008818189613521099,
-0.17020805180072784,
-0.06285272538661957,
-0.18612197041511536,
0.01669209636747837,
-0.08677129447460175,
-0.054826997220516205,
-0.03576773405075073,
0.05024665966629982,
0.115091472864151,
0.16202683746814728,
0.00463097682222724,
0.16779087483882904,
0.15043987333774567,
0.003735675010830164,
-0.009936853311955929,
0.0312972217798233,
0.14328311383724213,
-0.10409346967935562,
0.03756818547844887,
0.1162688285112381,
0.016386669129133224,
0.15327908098697662,
0.15270085632801056,
-0.014265387319028378,
-0.13317367434501648,
0.08522101491689682,
-0.00443862471729517,
0.006334707140922546,
-0.13488242030143738,
-0.12739059329032898,
-0.08858973532915115,
0.016643362119793892,
-0.009274779818952084,
0.08247798681259155,
0.02973197214305401,
0.05277206003665924,
0.05911652743816376,
-0.011509944684803486,
-0.10126907378435135,
0.14584015309810638,
0.045475900173187256,
-0.058478109538555145,
0.04358851537108421,
-0.013052468188107014,
0.0008471125620417297,
0.13503926992416382,
-0.032625071704387665,
0.10616768151521683,
0.06903840601444244,
0.02731309086084366,
0.06374798715114594,
0.1285437047481537,
0.07061322778463364,
-0.025189949199557304,
-0.021886054426431656,
-0.02009975165128708,
-0.05584510788321495,
0.01638147234916687,
-0.07349645346403122,
0.02099211886525154,
0.07410988956689835,
-0.09670253843069077,
0.02178051508963108,
-0.06738714873790741,
0.02255692146718502,
-0.027056798338890076,
-0.017572494223713875,
-0.19029125571250916,
0.15189503133296967,
0.07400965690612793,
0.055553313344717026,
-0.21273410320281982,
0.03827700391411781,
0.150101438164711,
-0.03604356199502945,
0.0005624376935884356,
0.03328721970319748,
0.07599321007728577,
0.004403860308229923,
0.0008259027381427586,
0.007654800079762936,
-0.005318471696227789,
0.01234047207981348,
0.11910387873649597,
-0.07329130172729492,
-0.04790380224585533,
-0.04506494477391243,
-0.06506969779729843,
0.013117244467139244,
-0.038542769849300385,
0.04392724484205246,
0.20034819841384888,
-0.0011285158107057214,
0.05887357145547867,
-0.14701403677463531,
-0.1404181569814682,
-0.024489426985383034,
-0.020559683442115784,
0.17528943717479706,
-0.07097800076007843,
-0.014746516942977905,
-0.023646557703614235,
-0.01572227291762829,
-0.027942903339862823,
-0.024369364604353905,
-0.039796382188797,
-0.1162823885679245,
0.020172417163848877,
0.028572993353009224,
0.07727652043104172,
-0.06295350193977356,
0.05682177096605301,
0.09931579232215881,
0.06250707805156708,
0.05664346367120743,
-0.006617766804993153,
-0.07876342535018921,
-0.21316707134246826,
0.05765118822455406,
-0.029179196804761887,
0.06623559445142746,
-0.030241362750530243,
0.1133892610669136,
0.05920267105102539,
-0.0069287861697375774,
0.0731038972735405,
-0.061963118612766266,
0.10461066663265228,
-0.16512468457221985,
0.01202385127544403,
-0.057918477803468704,
-0.02509060874581337,
-0.009524578228592873,
0.08361580967903137,
-0.1398026943206787,
-0.18520350754261017,
0.04275389015674591,
0.11865899711847305,
0.06280620396137238,
-0.012177489697933197,
0.019056206569075584,
0.06247711181640625,
0.014493336901068687,
-0.0042215450666844845,
0.09111209958791733,
0.14739377796649933,
-0.10563776642084122,
0.07295828312635422,
-0.01833450049161911,
-0.004809722304344177,
-0.08757287263870239,
0.021291928365826607,
0.0013029536930844188,
0.0011449779849499464,
0.012498855590820312,
-0.18699559569358826,
0.05537170544266701,
-0.0009447935153730214,
0.008364152163267136,
0.18837392330169678,
-0.19602075219154358,
-0.03156653046607971,
0.06384098529815674,
0.06108572334051132,
-0.03386125713586807,
-0.14102931320667267,
-0.08585206419229507,
-0.09431608766317368,
-0.07165680825710297,
0.13681180775165558,
0.012922383844852448,
0.058692775666713715,
-0.027316145598888397,
0.1039409339427948,
0.036210618913173676,
-0.07349187880754471,
0.14185373485088348,
-0.10979483276605606,
0.13107571005821228,
-0.10751990228891373,
-0.02675478719174862,
0.045766498893499374,
-0.04368895664811134,
0.0947076603770256,
-0.08095192909240723,
0.06465413421392441,
-0.22834104299545288,
0.009751265868544579,
-0.026712168008089066,
0.0091514578089118,
0.046295225620269775,
-0.059246670454740524,
-0.13940148055553436,
-0.03948559984564781,
-0.031845442950725555,
0.024656113237142563,
-0.1349220722913742,
-0.007944757118821144,
-0.17709095776081085,
-0.10202281922101974,
-0.15125639736652374,
0.005284594837576151,
-0.20035213232040405,
-0.02927933633327484,
-0.012690916657447815,
0.03775034472346306,
-0.18136785924434662,
-0.05521941930055618,
0.0005240063183009624,
-0.026781493797898293,
0.11785417795181274,
0.0002611283853184432,
-0.037719544023275375,
0.04082370549440384,
0.1279182881116867,
-0.11331592500209808,
0.05705087631940842,
-0.0281518567353487,
0.23065395653247833,
0.05670473724603653,
-0.11210328340530396,
-0.034284558147192,
0.09730003029108047,
-0.03579850122332573,
-0.026388317346572876,
0.04405829310417175,
0.06826426088809967,
-0.054665710777044296,
0.07633157074451447,
-0.022344401106238365,
-0.08796016126871109,
-0.03323567658662796,
0.05189550668001175,
-0.07153546810150146,
0.0032107937149703503,
0.07538849115371704,
-0.05994279682636261,
-0.04296112060546875,
0.1613233983516693,
0.19200901687145233,
0.09574057906866074,
-0.031119462102651596,
0.08986972272396088,
0.023326758295297623,
-0.014510304667055607,
0.006008524913340807,
0.09011256694793701,
0.011483064852654934,
-0.058724965900182724,
-0.03716714680194855,
0.02160726673901081,
0.04101927950978279,
-0.03351239114999771,
0.0685417428612709,
-0.11602627485990524,
-0.15951412916183472,
0.05390245094895363,
-0.01974547654390335,
-0.009836999699473381,
-0.060851261019706726,
-0.05614827945828438,
-0.047993242740631104,
-0.03567007556557655,
0.1280069649219513,
0.13481852412223816,
0.03739941120147705,
0.03387222811579704,
0.00014759371697437018,
-0.03762365132570267,
-0.05226732790470123,
-0.092371366918087,
-0.061642132699489594,
-0.011040770448744297,
0.05255268141627312,
-0.05723946541547775,
-0.07527677714824677,
0.19908812642097473,
-0.025185832753777504,
-0.04321778193116188,
0.015697335824370384,
0.00015147462545428425,
0.020724214613437653,
-0.1619454175233841,
-0.12635044753551483,
0.14768730103969574,
0.013012824580073357,
0.002924853703007102,
-0.04837746545672417,
0.01964457333087921,
-0.027109554037451744,
0.017084037885069847,
-0.0707034021615982,
0.0325155146420002,
-0.14941070973873138,
0.01593855395913124,
-0.026628779247403145,
-0.29082638025283813,
-0.07082834094762802,
-0.08563224971294403,
-0.04882572218775749,
0.08753105252981186,
0.1517496109008789,
0.06730319559574127,
0.04805949702858925,
0.029523149132728577,
-0.02579267881810665,
0.02712102234363556,
0.003392697311937809,
0.08496684581041336,
0.008310085162520409,
-0.0039947759360075,
-0.04354275390505791,
0.0742996484041214,
0.11916384100914001,
-0.10617011785507202,
-0.09989979863166809,
0.2116405963897705,
0.025679869577288628,
-0.004878422245383263,
0.13318637013435364,
0.0010201946133747697,
0.031187864020466805,
0.09848090261220932,
0.04509368911385536,
0.09497670084238052,
-0.003872771980240941,
-0.02210078574717045,
0.09069087356328964,
0.05362250655889511,
-0.06434976309537888,
-0.14123712480068207,
0.047980014234781265,
-0.2807200849056244,
-0.11014987528324127,
0.032278914004564285,
0.05536000430583954,
0.006106482818722725,
0.3163933753967285,
0.10956456512212753,
-0.07108898460865021,
0.06183304637670517,
0.04178810492157936,
-0.05300329625606537,
-0.07445839792490005,
-0.15380394458770752,
-0.030096594244241714,
-0.1793351173400879,
0.01875491626560688,
-0.08683933317661285,
0.06769977509975433,
0.026685062795877457,
-0.017454402521252632,
-0.011011066846549511,
0.04851324111223221,
-0.016219984740018845,
-0.13590107858181,
-0.025460073724389076,
-0.051424790173769,
-0.0738067477941513,
0.05013522133231163,
0.036237895488739014,
-0.020835891366004944,
-0.03704894334077835,
0.10319006443023682,
0.040597595274448395,
-0.06420111656188965,
-0.005961032118648291,
-0.16993358731269836,
-0.032458193600177765,
0.030575638636946678,
0.004156236071139574,
-0.09915613383054733,
0.07627253979444504,
0.03702806681394577,
0.0030930060893297195,
0.007579703349620104,
0.341336727142334,
0.0010511507280170918,
0.056300852447748184,
0.034168560057878494,
-0.1680261641740799,
-0.01094439160078764,
0.06313455104827881,
-0.09105530381202698,
-0.13442324101924896,
-0.13030940294265747,
0.21987715363502502,
-0.00474208127707243,
0.025522569194436073,
-0.023398859426379204,
0.04957011714577675,
0.0185542069375515,
0.02020305022597313,
0.14346154034137726,
0.050689488649368286,
0.23123173415660858,
-0.028310496360063553,
0.020859817042946815,
-0.04282950237393379,
-0.035982754081487656,
-0.1447831243276596,
-0.1397867053747177,
-0.009408324956893921,
-0.11481086909770966,
-0.06827494502067566,
0.12463181465864182,
-0.018149496987462044,
0.19553245604038239,
-0.011075558140873909,
0.045681316405534744,
-0.021241020411252975,
0.0766785517334938,
0.23116838932037354,
-0.020280327647924423,
0.10351812094449997,
0.0020942972041666508,
-0.09484588354825974,
0.02137063443660736,
-0.013896452262997627,
-0.19333350658416748,
-0.0254929568618536,
-0.007535372860729694,
-0.12604451179504395,
0.17077305912971497,
-0.003205350134521723,
0.023096315562725067,
0.049887191504240036,
0.061775434762239456,
-0.05507729947566986,
0.06859990209341049,
-0.03290253132581711,
-0.04155586659908295,
0.018864303827285767,
0.1685529500246048,
0.0009372899075970054,
-0.09594831615686417,
0.07331565767526627,
0.001893400214612484,
0.027710873633623123,
-0.06934823840856552,
0.1926528513431549,
-0.11165662109851837,
0.04455330967903137,
-0.19357822835445404,
-0.016511278226971626,
-0.015728335827589035,
0.008273706771433353,
-0.005304744467139244,
-0.03291423246264458,
0.01406734250485897,
-0.02179795689880848,
-0.09619352966547012,
-0.04117857292294502,
0.0313086211681366,
-0.07025980949401855,
0.2521654963493347,
0.018301961943507195,
-0.05332426354289055,
-0.013115817680954933,
-0.06896195560693741,
0.05001494288444519,
-0.08816885948181152,
0.07203519344329834,
-0.012550493702292442,
0.005043806973844767,
-0.0587657168507576,
-0.12346551567316055,
0.0203498937189579,
0.08393912762403488,
-0.10288088023662567,
-0.10387211292982101
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
#
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the NBAILAB/NPSC - 16K_MP3 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1957
- Wer: 0.1697
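The word error rate above is the standard edit-distance metric over words; as a reference point, a hedged sketch of how WER is computed on a transcript pair using the `jiwer` package (the actual evaluation script is not part of this card):

    from jiwer import wer

    # WER = (substitutions + deletions + insertions) / reference word count.
    reference = "dette er et eksempel"       # ground-truth transcript
    hypothesis = "dette er et en eksempel"   # model output with one insertion
    print(wer(reference, hypothesis))        # 1 error / 4 words = 0.25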
## Model description
More information needed
## Intended uses & limitations
More information needed
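Pending official guidance, a minimal transcription sketch (the repo id is taken from this card's metadata and `sample.wav` is a placeholder path; verify both before use):

    import soundfile as sf
    import torch
    from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

    model_id = "Rolv-Arild/xls-r-300m-npsc-4"
    processor = Wav2Vec2Processor.from_pretrained(model_id)
    model = Wav2Vec2ForCTC.from_pretrained(model_id)

    # The model expects 16 kHz mono audio, matching the 16K_MP3 training data.
    speech, rate = sf.read("sample.wav")
    inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

    with torch.no_grad():
        logits = model(inputs.input_values).logits

    # Greedy CTC decoding: take the most likely token at each frame.
    ids = torch.argmax(logits, dim=-1)
    print(processor.batch_decode(ids)[0])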
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 20.0
- mixed_precision_training: Native AMP
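For readers approximating this run, a sketch of how the logged values map onto `TrainingArguments` (the output directory is a placeholder; the Adam betas and epsilon equal the Transformers defaults, so they are not set explicitly):

    from transformers import TrainingArguments

    args = TrainingArguments(
        output_dir="./xls-r-300m-npsc",     # placeholder path
        learning_rate=7.5e-5,
        per_device_train_batch_size=16,
        per_device_eval_batch_size=16,
        seed=42,
        gradient_accumulation_steps=4,      # 16 x 4 = 64 effective batch size
        lr_scheduler_type="linear",
        warmup_steps=2000,
        num_train_epochs=20.0,
        fp16=True,                          # "Native AMP" mixed precision
    )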
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 4.4527 | 0.28 | 250 | 4.0144 | 1.0 |
| 3.1828 | 0.56 | 500 | 3.1369 | 1.0 |
| 2.9927 | 0.85 | 750 | 3.0183 | 1.0 |
| 2.9591 | 1.13 | 1000 | 2.9991 | 1.0 |
| 2.8989 | 1.41 | 1250 | 2.9000 | 1.0000 |
| 2.4286 | 1.69 | 1500 | 1.7688 | 0.9550 |
| 1.6765 | 1.98 | 1750 | 0.6842 | 0.4855 |
| 1.4521 | 2.26 | 2000 | 0.5096 | 0.3736 |
| 1.3589 | 2.54 | 2250 | 0.4479 | 0.3335 |
| 1.3136 | 2.82 | 2500 | 0.4056 | 0.3123 |
| 1.2856 | 3.11 | 2750 | 0.3870 | 0.2987 |
| 1.2283 | 3.39 | 3000 | 0.3646 | 0.2828 |
| 1.2053 | 3.67 | 3250 | 0.3499 | 0.2748 |
| 1.2087 | 3.95 | 3500 | 0.3345 | 0.2603 |
| 1.2002 | 4.24 | 3750 | 0.3320 | 0.2523 |
| 1.1383 | 4.52 | 4000 | 0.3117 | 0.2439 |
| 1.1364 | 4.8 | 4250 | 0.3198 | 0.2383 |
| 1.158 | 5.08 | 4500 | 0.3071 | 0.2342 |
| 1.108 | 5.37 | 4750 | 0.3011 | 0.2314 |
| 1.1025 | 5.65 | 5000 | 0.2875 | 0.2289 |
| 1.0697 | 5.93 | 5250 | 0.2926 | 0.2256 |
| 1.0904 | 6.21 | 5500 | 0.2695 | 0.2245 |
| 1.0802 | 6.5 | 5750 | 0.2602 | 0.2189 |
| 1.0882 | 6.78 | 6000 | 0.2603 | 0.2168 |
| 1.0881 | 7.06 | 6250 | 0.2540 | 0.2293 |
| 1.0378 | 7.34 | 6500 | 0.2614 | 0.2193 |
| 1.0397 | 7.63 | 6750 | 0.2707 | 0.2104 |
| 1.0296 | 7.91 | 7000 | 0.2483 | 0.2119 |
| 1.0249 | 8.19 | 7250 | 0.2483 | 0.2047 |
| 1.013 | 8.47 | 7500 | 0.2487 | 0.2042 |
| 1.0064 | 8.76 | 7750 | 0.2456 | 0.2016 |
| 1.0668 | 9.04 | 8000 | 0.2397 | 0.1995 |
| 1.0129 | 9.32 | 8250 | 0.2374 | 0.1994 |
| 1.0164 | 9.6 | 8500 | 0.2206 | 0.1992 |
| 0.975 | 9.89 | 8750 | 0.2247 | 0.1973 |
| 0.9849 | 10.17 | 9000 | 0.2325 | 0.1953 |
| 0.9826 | 10.45 | 9250 | 0.2301 | 0.1934 |
| 0.9835 | 10.73 | 9500 | 0.2192 | 0.1942 |
| 0.9676 | 11.02 | 9750 | 0.2266 | 0.1913 |
| 0.9627 | 11.3 | 10000 | 0.2193 | 0.1921 |
| 0.976 | 11.58 | 10250 | 0.2309 | 0.1882 |
| 0.969 | 11.86 | 10500 | 0.2268 | 0.1886 |
| 0.9611 | 12.15 | 10750 | 0.2322 | 0.1863 |
| 0.9397 | 12.43 | 11000 | 0.2197 | 0.1844 |
| 0.9601 | 12.71 | 11250 | 0.2211 | 0.1871 |
| 0.9718 | 12.99 | 11500 | 0.2079 | 0.1898 |
| 0.9347 | 13.28 | 11750 | 0.2054 | 0.1843 |
| 0.9377 | 13.56 | 12000 | 0.2031 | 0.1842 |
| 0.934 | 13.84 | 12250 | 0.2059 | 0.1806 |
| 0.9295 | 14.12 | 12500 | 0.2122 | 0.1861 |
| 0.935 | 14.41 | 12750 | 0.2072 | 0.1787 |
| 0.9021 | 14.69 | 13000 | 0.2105 | 0.1781 |
| 0.9193 | 14.97 | 13250 | 0.2035 | 0.1786 |
| 0.9214 | 15.25 | 13500 | 0.2035 | 0.1766 |
| 0.9048 | 15.54 | 13750 | 0.1964 | 0.1758 |
| 0.9006 | 15.82 | 14000 | 0.1984 | 0.1757 |
| 0.9027 | 16.1 | 14250 | 0.2022 | 0.1743 |
| 0.9083 | 16.38 | 14500 | 0.1969 | 0.1744 |
| 0.9761 | 16.67 | 14750 | 0.1963 | 0.1728 |
| 0.9311 | 16.95 | 15000 | 0.1960 | 0.1737 |
| 0.886 | 17.23 | 15250 | 0.1929 | 0.1726 |
| 0.8969 | 17.51 | 15500 | 0.1928 | 0.1734 |
| 0.9084 | 17.8 | 15750 | 0.1937 | 0.1713 |
| 0.8795 | 18.08 | 16000 | 0.1978 | 0.1709 |
| 0.8883 | 18.36 | 16250 | 0.1956 | 0.1703 |
| 0.8901 | 18.64 | 16500 | 0.1933 | 0.1705 |
| 0.8922 | 18.93 | 16750 | 0.1962 | 0.1711 |
| 0.8765 | 19.21 | 17000 | 0.1962 | 0.1711 |
| 0.8992 | 19.49 | 17250 | 0.1965 | 0.1703 |
| 0.8778 | 19.77 | 17500 | 0.1957 | 0.1699 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.0+cu113
- Datasets 1.18.1
- Tokenizers 0.11.0
| {"license": "apache-2.0", "tags": ["automatic-speech-recognition", "NbAiLab/NPSC", "generated_from_trainer"], "model-index": [{"name": "", "results": []}]} | automatic-speech-recognition | Rolv-Arild/xls-r-300m-npsc-4 | [
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"NbAiLab/NPSC",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #NbAiLab/NPSC #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us
|
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the NBAILAB/NPSC - 16K\_MP3 dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1957
* Wer: 0.1697
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 64
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 2000
* num\_epochs: 20.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.0+cu113
* Datasets 1.18.1
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 20.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.0+cu113\n* Datasets 1.18.1\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #NbAiLab/NPSC #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 20.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.0+cu113\n* Datasets 1.18.1\n* Tokenizers 0.11.0"
] | [
64,
160,
4,
36
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #NbAiLab/NPSC #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 20.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.0+cu113\n* Datasets 1.18.1\n* Tokenizers 0.11.0"
] | [
-0.12113774567842484,
0.0823439285159111,
-0.0031904191710054874,
0.056994952261447906,
0.12106344848871231,
0.012121351435780525,
0.09506338089704514,
0.1493162214756012,
-0.08382627367973328,
0.08963130414485931,
0.11039691418409348,
0.10493055731058121,
0.06186938285827637,
0.10902390629053116,
-0.015002072788774967,
-0.31186333298683167,
0.01264359150081873,
0.023525463417172432,
-0.10383190959692001,
0.11922507733106613,
0.08926782011985779,
-0.1211492195725441,
0.02728070504963398,
0.026982557028532028,
-0.12361420691013336,
0.006882528308779001,
-0.024661701172590256,
-0.0724065750837326,
0.12662886083126068,
0.02831326238811016,
0.10600724816322327,
0.024107977747917175,
0.09075609594583511,
-0.252449095249176,
0.014420678839087486,
0.06866355985403061,
0.04299943894147873,
0.08256805688142776,
0.11434557288885117,
-0.00409104535356164,
0.14613451063632965,
-0.08093045651912689,
0.07616595178842545,
0.03088505193591118,
-0.10671869665384293,
-0.32280194759368896,
-0.08022993803024292,
0.03667252138257027,
0.10158152133226395,
0.09439966082572937,
-0.022746657952666283,
0.0899679958820343,
-0.07597064226865768,
0.0884474590420723,
0.2188301682472229,
-0.2481820285320282,
-0.08081461489200592,
-0.0195327065885067,
0.04865805059671402,
0.05061081424355507,
-0.1233319416642189,
-0.03167622908949852,
0.01992643065750599,
0.03560001403093338,
0.119182288646698,
0.009926104918122292,
-0.025543371215462685,
0.02438489720225334,
-0.14335493743419647,
-0.04991813004016876,
0.0925280824303627,
0.06884835660457611,
-0.02150115557014942,
-0.08346490561962128,
-0.02954603172838688,
-0.20668244361877441,
-0.04875862970948219,
0.007542604114860296,
0.02721443586051464,
-0.04152678698301315,
-0.10195092856884003,
0.009274984709918499,
-0.08166304975748062,
-0.08771184831857681,
0.00819939561188221,
0.1444910168647766,
0.05250297859311104,
-0.01979217864573002,
0.004309153184294701,
0.10778987407684326,
0.04866497963666916,
-0.14037664234638214,
0.007050710264593363,
0.04180130735039711,
-0.10175306349992752,
-0.013740887865424156,
-0.03750891610980034,
-0.02496304363012314,
-0.002655965508893132,
0.12136273086071014,
-0.04011581465601921,
0.08696188777685165,
0.02448096312582493,
0.03171210363507271,
-0.10044985264539719,
0.179442897439003,
-0.07119502872228622,
-0.01443582121282816,
-0.04999251291155815,
0.09898312389850616,
-0.023084048181772232,
-0.013083121739327908,
-0.05906476452946663,
0.0210319422185421,
0.10726560652256012,
0.03331432864069939,
-0.02559451013803482,
0.02678908035159111,
-0.057414308190345764,
-0.025325026363134384,
-0.017239773645997047,
-0.10345982760190964,
0.0447990745306015,
0.023797281086444855,
-0.08789769560098648,
0.017866482958197594,
0.007342089433223009,
0.007826129905879498,
-0.0285433828830719,
0.1367064118385315,
-0.07054563611745834,
0.010169215500354767,
-0.10121354460716248,
-0.10116605460643768,
0.026831701397895813,
-0.04975994676351547,
0.005839263554662466,
-0.06944354623556137,
-0.1176958754658699,
-0.046990733593702316,
0.06412146985530853,
-0.0501394160091877,
-0.059061117470264435,
-0.058085910975933075,
-0.06620777398347855,
0.05227213725447655,
-0.030300768092274666,
0.18005536496639252,
-0.061285026371479034,
0.11552876979112625,
0.015247934497892857,
0.04575706273317337,
0.03698418661952019,
0.0690072774887085,
-0.04828434810042381,
0.03650374338030815,
-0.13088113069534302,
0.06439842283725739,
-0.08698756992816925,
0.051752761006355286,
-0.14129488170146942,
-0.12878338992595673,
-0.029687311500310898,
0.0008196525741368532,
0.10439068078994751,
0.08089505136013031,
-0.17687149345874786,
-0.0912129282951355,
0.17943133413791656,
-0.06999905407428741,
-0.08591285347938538,
0.12918177247047424,
-0.03188654035329819,
-0.016514834016561508,
0.036905884742736816,
0.16921919584274292,
0.07862558215856552,
-0.08487432450056076,
0.022411387413740158,
-0.04608399048447609,
0.11646310985088348,
0.011391767300665379,
0.09310294687747955,
-0.02800544537603855,
0.030663304030895233,
-0.0017650977242738008,
-0.032085318118333817,
0.07728360593318939,
-0.09270143508911133,
-0.08426504582166672,
-0.029076507315039635,
-0.07205532491207123,
0.013815772719681263,
0.07094129920005798,
0.0424388162791729,
-0.09062837809324265,
-0.13499049842357635,
0.030590921640396118,
0.10870979726314545,
-0.10527973622083664,
0.02569742687046528,
-0.07355234771966934,
0.04330984130501747,
-0.020324300974607468,
-0.011962750926613808,
-0.1751294881105423,
-0.01122658234089613,
0.02694176882505417,
-0.04851686209440231,
0.02355234883725643,
0.0005415093037299812,
0.08674658089876175,
0.0474834218621254,
-0.047144271433353424,
-0.07576386630535126,
-0.0735124871134758,
-0.008587894961237907,
-0.07756908237934113,
-0.22242508828639984,
-0.07245989143848419,
-0.029876092448830605,
0.1358700841665268,
-0.23130255937576294,
0.007792108226567507,
0.01367965992540121,
0.11113856732845306,
0.030479811131954193,
-0.043768659234046936,
-0.01640632376074791,
0.08309409767389297,
-0.01603527180850506,
-0.06825926899909973,
0.04530467465519905,
-0.003466870402917266,
-0.11576185375452042,
0.011323395185172558,
-0.11139357089996338,
0.09781478345394135,
0.11151932179927826,
-0.038415972143411636,
-0.07982413470745087,
-0.056649498641490936,
-0.06808337569236755,
-0.06625282019376755,
-0.01927930861711502,
0.011792114935815334,
0.21593482792377472,
0.03214920684695244,
0.12130951136350632,
-0.07992726564407349,
-0.04666668549180031,
0.026121603325009346,
0.013593851588666439,
-0.005898343864828348,
0.14208272099494934,
0.07959815859794617,
-0.049865443259477615,
0.10044191032648087,
0.09709527343511581,
-0.08970457315444946,
0.1402464210987091,
-0.06891298294067383,
-0.13388821482658386,
-0.011320141144096851,
0.026441054418683052,
0.02676803059875965,
0.1162647232413292,
-0.14132988452911377,
-0.0023958655074238777,
0.02341879904270172,
0.024407092481851578,
0.028669482097029686,
-0.21902722120285034,
-0.012171715497970581,
0.03914802148938179,
-0.059608448296785355,
-0.02672852948307991,
-0.02188093587756157,
0.00825827568769455,
0.09466614574193954,
0.006551126483827829,
-0.06962994486093521,
-0.010807493701577187,
-0.019409649074077606,
-0.07315412163734436,
0.1927829086780548,
-0.09897150844335556,
-0.14540515840053558,
-0.12794840335845947,
-0.034218139946460724,
0.0015010465867817402,
-0.020962757989764214,
0.03780027851462364,
-0.10953259468078613,
-0.03986363857984543,
-0.055189330130815506,
0.03763550892472267,
-0.06426150351762772,
0.028194565325975418,
-0.006047305651009083,
0.005710993427783251,
0.08018191158771515,
-0.10056623816490173,
0.023490730673074722,
-0.022691940888762474,
-0.03919511288404465,
0.03869657963514328,
0.04177090898156166,
0.09369944036006927,
0.16755065321922302,
0.02353285625576973,
0.018738726153969765,
-0.03309944272041321,
0.14663057029247284,
-0.09653877466917038,
-0.02734220027923584,
0.113962322473526,
-0.004926162771880627,
0.0466005839407444,
0.10553412139415741,
0.06322237849235535,
-0.0807587131857872,
0.01681952364742756,
0.04203563183546066,
-0.015755020081996918,
-0.24212141335010529,
-0.030374426394701004,
-0.057638660073280334,
-0.035012807697057724,
0.13084818422794342,
0.036773357540369034,
-0.009178309701383114,
0.03759375959634781,
-0.0006845543975941837,
0.006277440115809441,
-0.010883443057537079,
0.06596055626869202,
0.07647673785686493,
0.03914739936590195,
…768-dimensional embedding vector elided…
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
#
This model was trained from scratch on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2965
- Wer: 0.3144
## Model description
More information needed
## Intended uses & limitations
More information needed
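
While the card provides no usage details, here is a minimal inference sketch, assuming the checkpoint loads through the standard Hugging Face `automatic-speech-recognition` pipeline (an untested assumption; the audio path is a placeholder):

```python
# Minimal inference sketch, assuming the checkpoint works with the standard
# automatic-speech-recognition pipeline; "sample.wav" is a placeholder path.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="Rolv-Arild/xls-r-300m-npsc-seq2seq",
)
print(asr("sample.wav")["text"])
```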
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 20.0
- mixed_precision_training: Native AMP
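
As an illustration, the hyperparameters above could be expressed with Hugging Face `Seq2SeqTrainingArguments` roughly as follows. This is a hedged sketch, not the original training script; the output directory is a placeholder.

```python
# Hedged sketch of the listed hyperparameters as Trainer arguments;
# not the original training script.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./xls-r-300m-npsc-seq2seq",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=8,  # total train batch size: 8 * 8 = 64
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=20.0,
    fp16=True,  # mixed precision (native AMP)
)
```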
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 2.888 | 0.51 | 400 | 3.7320 | 0.9440 |
| 3.1636 | 1.02 | 800 | 2.9188 | 1.1916 |
| 2.773 | 1.53 | 1200 | 2.3347 | 1.0134 |
| 0.7198 | 2.04 | 1600 | 0.6678 | 0.4826 |
| 0.5255 | 2.55 | 2000 | 0.4605 | 0.4135 |
| 0.3961 | 3.06 | 2400 | 0.4266 | 0.3955 |
| 0.3424 | 3.57 | 2800 | 0.3786 | 0.3741 |
| 0.3858 | 4.08 | 3200 | 0.3161 | 0.3552 |
| 0.3218 | 4.59 | 3600 | 0.3029 | 0.3510 |
| 0.199 | 5.1 | 4000 | 0.2988 | 0.3418 |
| 0.2054 | 5.61 | 4400 | 0.2873 | 0.3434 |
| 0.1704 | 6.12 | 4800 | 0.3129 | 0.3432 |
| 0.1805 | 6.63 | 5200 | 0.2963 | 0.3413 |
| 0.2091 | 7.14 | 5600 | 0.2755 | 0.3329 |
| 0.1971 | 7.65 | 6000 | 0.2706 | 0.3309 |
| 0.1237 | 8.16 | 6400 | 0.2823 | 0.3270 |
| 0.123 | 8.67 | 6800 | 0.2754 | 0.3246 |
| 0.103 | 9.18 | 7200 | 0.2917 | 0.3272 |
| 0.1143 | 9.69 | 7600 | 0.2885 | 0.3305 |
| 0.156 | 10.2 | 8000 | 0.2810 | 0.3288 |
| 0.167 | 10.71 | 8400 | 0.2689 | 0.3232 |
| 0.0815 | 11.22 | 8800 | 0.2899 | 0.3236 |
| 0.0844 | 11.73 | 9200 | 0.2798 | 0.3225 |
| 0.0775 | 12.24 | 9600 | 0.2894 | 0.3224 |
| 0.0677 | 12.75 | 10000 | 0.2838 | 0.3204 |
| 0.1383 | 13.27 | 10400 | 0.2959 | 0.3211 |
| 0.1233 | 13.77 | 10800 | 0.2922 | 0.3213 |
| 0.0688 | 14.29 | 11200 | 0.2903 | 0.3209 |
| 0.0655 | 14.8 | 11600 | 0.2868 | 0.3182 |
| 0.0449 | 15.31 | 12000 | 0.2959 | 0.3172 |
| 0.0421 | 15.82 | 12400 | 0.2966 | 0.3180 |
| 0.0858 | 16.33 | 12800 | 0.2941 | 0.3164 |
| 0.0859 | 16.84 | 13200 | 0.2980 | 0.3165 |
| 0.0561 | 17.35 | 13600 | 0.2965 | 0.3165 |
| 0.0506 | 17.86 | 14000 | 0.2935 | 0.3148 |
| 0.0312 | 18.37 | 14400 | 0.2964 | 0.3154 |
| 0.0403 | 18.88 | 14800 | 0.2967 | 0.3160 |
| 0.0924 | 19.39 | 15200 | 0.2955 | 0.3147 |
| 0.0585 | 19.9 | 15600 | 0.2965 | 0.3144 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.0+cu113
- Datasets 1.18.1
- Tokenizers 0.11.0
| {"tags": ["generated_from_trainer"], "model-index": [{"name": "", "results": []}]} | automatic-speech-recognition | Rolv-Arild/xls-r-300m-npsc-seq2seq | [
"transformers",
"pytorch",
"tensorboard",
"speech-encoder-decoder",
"automatic-speech-recognition",
"generated_from_trainer",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #speech-encoder-decoder #automatic-speech-recognition #generated_from_trainer #endpoints_compatible #region-us
|
This model was trained from scratch on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2965
* Wer: 0.3144
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0001
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 8
* total\_train\_batch\_size: 64
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1000
* num\_epochs: 20.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.0+cu113
* Datasets 1.18.1
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 20.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.0+cu113\n* Datasets 1.18.1\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #tensorboard #speech-encoder-decoder #automatic-speech-recognition #generated_from_trainer #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 20.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.0+cu113\n* Datasets 1.18.1\n* Tokenizers 0.11.0"
] | [
51,
159,
4,
36
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #speech-encoder-decoder #automatic-speech-recognition #generated_from_trainer #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 8\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 20.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.0+cu113\n* Datasets 1.18.1\n* Tokenizers 0.11.0"
] | [
…768-dimensional embedding vector elided…
] |
null | null | transformers |
# ProtBert model
Pretrained model on protein sequences using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://doi.org/10.1101/2020.07.12.199554) and first released in
[this repository](https://github.com/agemagician/ProtTrans). This model is trained on uppercase amino acids: it only works with capital letter amino acids.
## Model description
ProtBert is based on the BERT model and was pretrained on a large corpus of protein sequences in a self-supervised fashion.
This means it was pretrained on raw protein sequences only, with no human labelling of any kind (which is why it can use lots of
publicly available data), using an automatic process to generate inputs and labels from those protein sequences.
One important difference between our model and the original BERT version is the handling of sequences as separate documents:
next-sentence prediction is not used, since each sequence is treated as a complete document.
The masking follows the original BERT training procedure, randomly masking 15% of the amino acids in the input.
In the end, the features extracted from this model revealed that the LM embeddings learned from unlabeled data (protein sequences alone) capture important biophysical properties governing protein
shape.
This implied learning some of the grammar of the language of life realized in protein sequences.
## Intended uses & limitations
The model can be used for protein feature extraction or fine-tuned on downstream tasks.
We have noticed that on some tasks you can gain more accuracy by fine-tuning the model rather than using it as a feature extractor.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import BertForMaskedLM, BertTokenizer, pipeline
>>> tokenizer = BertTokenizer.from_pretrained("Rostlab/prot_bert", do_lower_case=False)
>>> model = BertForMaskedLM.from_pretrained("Rostlab/prot_bert")
>>> unmasker = pipeline('fill-mask', model=model, tokenizer=tokenizer)
>>> unmasker('D L I P T S S K L V V [MASK] D T S L Q V K K A F F A L V T')
[{'score': 0.11088453233242035,
'sequence': '[CLS] D L I P T S S K L V V L D T S L Q V K K A F F A L V T [SEP]',
'token': 5,
'token_str': 'L'},
{'score': 0.08402521163225174,
'sequence': '[CLS] D L I P T S S K L V V S D T S L Q V K K A F F A L V T [SEP]',
'token': 10,
'token_str': 'S'},
{'score': 0.07328339666128159,
'sequence': '[CLS] D L I P T S S K L V V V D T S L Q V K K A F F A L V T [SEP]',
'token': 8,
'token_str': 'V'},
{'score': 0.06921856850385666,
'sequence': '[CLS] D L I P T S S K L V V K D T S L Q V K K A F F A L V T [SEP]',
'token': 12,
'token_str': 'K'},
{'score': 0.06382402777671814,
'sequence': '[CLS] D L I P T S S K L V V I D T S L Q V K K A F F A L V T [SEP]',
'token': 11,
'token_str': 'I'}]
```
Here is how to use this model to get the features of a given protein sequence in PyTorch:
```python
from transformers import BertModel, BertTokenizer
import re
tokenizer = BertTokenizer.from_pretrained("Rostlab/prot_bert", do_lower_case=False)
model = BertModel.from_pretrained("Rostlab/prot_bert")
sequence_Example = "A E T C Z A O"
sequence_Example = re.sub(r"[UZOB]", "X", sequence_Example)
encoded_input = tokenizer(sequence_Example, return_tensors='pt')
output = model(**encoded_input)
```
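The `output` above contains per-residue representations in `output.last_hidden_state`; a common follow-up (a sketch, not part of the original example) is to mean-pool them into a single per-protein vector:

```python
# Sketch: per-residue features and a mean-pooled per-protein embedding.
# Note: the pooled mean includes the [CLS]/[SEP] positions; slice them
# off first if that is undesired.
residue_embeddings = output.last_hidden_state       # (batch, seq_len, hidden_size)
protein_embedding = residue_embeddings.mean(dim=1)  # (batch, hidden_size)
```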
## Training data
The ProtBert model was pretrained on [Uniref100](https://www.uniprot.org/downloads), a dataset consisting of 217 million protein sequences.
## Training procedure
### Preprocessing
The protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids "U,Z,O,B" were mapped to "X".
The inputs of the model are then of the form:
```
[CLS] Protein Sequence A [SEP] Protein Sequence B [SEP]
```
Furthermore, each protein sequence was treated as a separate document.
The preprocessing step was performed twice, once for a combined length (2 sequences) of less than 512 amino acids, and another time using a combined length (2 sequences) of less than 2048 amino acids.
The details of the masking procedure for each sequence followed the original BERT model, as follows:
- 15% of the amino acids are masked.
- In 80% of the cases, the masked amino acids are replaced by `[MASK]`.
- In 10% of the cases, the masked amino acids are replaced by a random amino acid different from the one they replace.
- In the 10% remaining cases, the masked amino acids are left as is.
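
For illustration only, this scheme can be sketched in a few lines (a simplified stand-in, not the original preprocessing code):

```python
# Simplified sketch of the 15% / 80-10-10 masking scheme described above;
# illustrative only, not the original preprocessing code.
import random

AMINO_ACIDS = list("ACDEFGHIKLMNPQRSTVWYX")

def mask_tokens(tokens, mask_prob=0.15):
    masked = list(tokens)
    for i, tok in enumerate(tokens):
        if random.random() < mask_prob:   # select 15% of the amino acids
            r = random.random()
            if r < 0.8:                   # 80%: replace with [MASK]
                masked[i] = "[MASK]"
            elif r < 0.9:                 # 10%: replace with a different random amino acid
                masked[i] = random.choice([a for a in AMINO_ACIDS if a != tok])
            # remaining 10%: leave the amino acid as is
    return masked

print(mask_tokens(list("DLIPTSSKLVV")))
```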
### Pretraining
The model was trained on a single TPU Pod V3-512 for 400k steps in total:
300k steps using sequence length 512 (batch size 15k), and 100k steps using sequence length 2048 (batch size 2.5k).
The optimizer used was LAMB with a learning rate of 0.002, a weight decay of 0.01, learning-rate warmup for 40k steps, and linear decay of the learning rate afterwards.
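
For illustration, the described schedule corresponds to the following shape (a sketch inferred from the text; that the decay reaches zero exactly at step 400k is an assumption):

```python
# Sketch of the described schedule: linear warmup to 0.002 over 40k steps,
# then linear decay (assumed to reach zero at the final step).
PEAK_LR, WARMUP_STEPS, TOTAL_STEPS = 0.002, 40_000, 400_000

def lr_at(step):
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    return PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)

print(lr_at(40_000))  # 0.002 at the end of warmup
```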
## Evaluation results
When fine-tuned on downstream tasks, this model achieves the following results:
Test results :
| Task/Dataset | secondary structure (3-states) | secondary structure (8-states) | Localization | Membrane |
|:-----:|:-----:|:-----:|:-----:|:-----:|
| CASP12 | 75 | 63 | | |
| TS115 | 83 | 72 | | |
| CB513 | 81 | 66 | | |
| DeepLoc | | | 79 | 91 |
### BibTeX entry and citation info
```bibtex
@article {Elnaggar2020.07.12.199554,
author = {Elnaggar, Ahmed and Heinzinger, Michael and Dallago, Christian and Rehawi, Ghalia and Wang, Yu and Jones, Llion and Gibbs, Tom and Feher, Tamas and Angerer, Christoph and Steinegger, Martin and BHOWMIK, DEBSINDHU and Rost, Burkhard},
title = {ProtTrans: Towards Cracking the Language of Life{\textquoteright}s Code Through Self-Supervised Deep Learning and High Performance Computing},
elocation-id = {2020.07.12.199554},
year = {2020},
doi = {10.1101/2020.07.12.199554},
publisher = {Cold Spring Harbor Laboratory},
abstract = {Computational biology and bioinformatics provide vast data gold-mines from protein sequences, ideal for Language Models (LMs) taken from Natural Language Processing (NLP). These LMs reach for new prediction frontiers at low inference costs. Here, we trained two auto-regressive language models (Transformer-XL, XLNet) and two auto-encoder models (Bert, Albert) on data from UniRef and BFD containing up to 393 billion amino acids (words) from 2.1 billion protein sequences (22- and 112 times the entire English Wikipedia). The LMs were trained on the Summit supercomputer at Oak Ridge National Laboratory (ORNL), using 936 nodes (total 5616 GPUs) and one TPU Pod (V3-512 or V3-1024). We validated the advantage of up-scaling LMs to larger models supported by bigger data by predicting secondary structure (3-states: Q3=76-84, 8 states: Q8=65-73), sub-cellular localization for 10 cellular compartments (Q10=74) and whether a protein is membrane-bound or water-soluble (Q2=89). Dimensionality reduction revealed that the LM-embeddings from unlabeled data (only protein sequences) captured important biophysical properties governing protein shape. This implied learning some of the grammar of the language of life realized in protein sequences. The successful up-scaling of protein LMs through HPC to larger data sets slightly reduced the gap between models trained on evolutionary information and LMs. Availability ProtTrans: \<a href="https://github.com/agemagician/ProtTrans"\>https://github.com/agemagician/ProtTrans\</a\>Competing Interest StatementThe authors have declared no competing interest.},
URL = {https://www.biorxiv.org/content/early/2020/07/21/2020.07.12.199554},
eprint = {https://www.biorxiv.org/content/early/2020/07/21/2020.07.12.199554.full.pdf},
journal = {bioRxiv}
}
```
> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
| {"tags": ["protein language model", "protein"], "datasets": ["Uniref100"]} | fill-mask | Rostlab/prot_bert | [
"transformers",
"pytorch",
"fill-mask",
"protein language model",
"protein",
"dataset:Uniref100",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #fill-mask #protein language model #protein #dataset-Uniref100 #autotrain_compatible #endpoints_compatible #has_space #region-us
| ProtBert model
==============
Pretrained model on protein sequences using a masked language modeling (MLM) objective. It was introduced in
this paper and first released in
this repository. This model is trained on uppercase amino acids: it only works with capital letter amino acids.
Model description
-----------------
ProtBert is based on the BERT model and was pretrained on a large corpus of protein sequences in a self-supervised fashion.
This means it was pretrained on raw protein sequences only, with no human labelling of any kind (which is why it can use lots of
publicly available data), using an automatic process to generate inputs and labels from those protein sequences.
One important difference between our model and the original BERT version is the handling of sequences as separate documents:
next-sentence prediction is not used, since each sequence is treated as a complete document.
The masking follows the original BERT training procedure, randomly masking 15% of the amino acids in the input.
In the end, the features extracted from this model revealed that the LM embeddings learned from unlabeled data (protein sequences alone) capture important biophysical properties governing protein
shape.
This implied learning some of the grammar of the language of life realized in protein sequences.
Intended uses & limitations
---------------------------
The model can be used for protein feature extraction or fine-tuned on downstream tasks.
We have noticed that on some tasks you can gain more accuracy by fine-tuning the model rather than using it as a feature extractor.
### How to use
You can use this model directly with a pipeline for masked language modeling:
Here is how to use this model to get the features of a given protein sequence in PyTorch:
Training data
-------------
The ProtBert model was pretrained on Uniref100, a dataset consisting of 217 million protein sequences.
Training procedure
------------------
### Preprocessing
The protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids "U,Z,O,B" were mapped to "X".
The inputs of the model are then of the form:
Furthermore, each protein sequence was treated as a separate document.
The preprocessing step was performed twice, once for a combined length (2 sequences) of less than 512 amino acids, and another time using a combined length (2 sequences) of less than 2048 amino acids.
The details of the masking procedure for each sequence followed the original BERT model, as follows:
* 15% of the amino acids are masked.
* In 80% of the cases, the masked amino acids are replaced by '[MASK]'.
* In 10% of the cases, the masked amino acids are replaced by a random amino acid different from the one they replace.
* In the 10% remaining cases, the masked amino acids are left as is.
### Pretraining
The model was trained on a single TPU Pod V3-512 for 400k steps in total:
300k steps using sequence length 512 (batch size 15k), and 100k steps using sequence length 2048 (batch size 2.5k).
The optimizer used was LAMB with a learning rate of 0.002, a weight decay of 0.01, learning-rate warmup for 40k steps, and linear decay of the learning rate afterwards.
Evaluation results
------------------
When fine-tuned on downstream tasks, this model achieves the following results:
Test results :
### BibTeX entry and citation info
>
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
>
>
>
| [
"### How to use\n\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\nHere is how to use this model to get the features of a given protein sequence in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe ProtBert model was pretrained on Uniref100, a dataset consisting of 217 million protein sequences.\n\n\nTraining procedure\n------------------",
"### Preprocessing\n\n\nThe protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids \"U,Z,O,B\" were mapped to \"X\".\nThe inputs of the model are then of the form:\n\n\nFurthermore, each protein sequence was treated as a separate document.\nThe preprocessing step was performed twice, once for a combined length (2 sequences) of less than 512 amino acids, and another time using a combined length (2 sequences) of less than 2048 amino acids.\n\n\nThe details of the masking procedure for each sequence followed the original Bert model as following:\n\n\n* 15% of the amino acids are masked.\n* In 80% of the cases, the masked amino acids are replaced by '[MASK]'.\n* In 10% of the cases, the masked amino acids are replaced by a random amino acid (different) from the one they replace.\n* In the 10% remaining cases, the masked amino acids are left as is.",
"### Pretraining\n\n\nThe model was trained on a single TPU Pod V3-512 for 400k steps in total.\n300K steps using sequence length 512 (batch size 15k), and 100K steps using sequence length 2048 (batch size 2.5k).\nThe optimizer used is Lamb with a learning rate of 0.002, a weight decay of 0.01, learning rate warmup for 40k steps and linear decay of the learning rate after.\n\n\nEvaluation results\n------------------\n\n\nWhen fine-tuned on downstream tasks, this model achieves the following results:\n\n\nTest results :",
"### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #fill-mask #protein language model #protein #dataset-Uniref100 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"### How to use\n\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\nHere is how to use this model to get the features of a given protein sequence in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe ProtBert model was pretrained on Uniref100, a dataset consisting of 217 million protein sequences.\n\n\nTraining procedure\n------------------",
"### Preprocessing\n\n\nThe protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids \"U,Z,O,B\" were mapped to \"X\".\nThe inputs of the model are then of the form:\n\n\nFurthermore, each protein sequence was treated as a separate document.\nThe preprocessing step was performed twice, once for a combined length (2 sequences) of less than 512 amino acids, and another time using a combined length (2 sequences) of less than 2048 amino acids.\n\n\nThe details of the masking procedure for each sequence followed the original Bert model as following:\n\n\n* 15% of the amino acids are masked.\n* In 80% of the cases, the masked amino acids are replaced by '[MASK]'.\n* In 10% of the cases, the masked amino acids are replaced by a random amino acid (different) from the one they replace.\n* In the 10% remaining cases, the masked amino acids are left as is.",
"### Pretraining\n\n\nThe model was trained on a single TPU Pod V3-512 for 400k steps in total.\n300K steps using sequence length 512 (batch size 15k), and 100K steps using sequence length 2048 (batch size 2.5k).\nThe optimizer used is Lamb with a learning rate of 0.002, a weight decay of 0.01, learning rate warmup for 40k steps and linear decay of the learning rate after.\n\n\nEvaluation results\n------------------\n\n\nWhen fine-tuned on downstream tasks, this model achieves the following results:\n\n\nTest results :",
"### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>"
] | [
53,
81,
234,
128,
34
] | [
"passage: TAGS\n#transformers #pytorch #fill-mask #protein language model #protein #dataset-Uniref100 #autotrain_compatible #endpoints_compatible #has_space #region-us \n### How to use\n\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\nHere is how to use this model to get the features of a given protein sequence in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe ProtBert model was pretrained on Uniref100, a dataset consisting of 217 million protein sequences.\n\n\nTraining procedure\n------------------### Preprocessing\n\n\nThe protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids \"U,Z,O,B\" were mapped to \"X\".\nThe inputs of the model are then of the form:\n\n\nFurthermore, each protein sequence was treated as a separate document.\nThe preprocessing step was performed twice, once for a combined length (2 sequences) of less than 512 amino acids, and another time using a combined length (2 sequences) of less than 2048 amino acids.\n\n\nThe details of the masking procedure for each sequence followed the original Bert model as following:\n\n\n* 15% of the amino acids are masked.\n* In 80% of the cases, the masked amino acids are replaced by '[MASK]'.\n* In 10% of the cases, the masked amino acids are replaced by a random amino acid (different) from the one they replace.\n* In the 10% remaining cases, the masked amino acids are left as is.### Pretraining\n\n\nThe model was trained on a single TPU Pod V3-512 for 400k steps in total.\n300K steps using sequence length 512 (batch size 15k), and 100K steps using sequence length 2048 (batch size 2.5k).\nThe optimizer used is Lamb with a learning rate of 0.002, a weight decay of 0.01, learning rate warmup for 40k steps and linear decay of the learning rate after.\n\n\nEvaluation results\n------------------\n\n\nWhen fine-tuned on downstream tasks, this model achieves the following results:\n\n\nTest results :"
] | [
…768-dimensional embedding vector elided…
0.010252170264720917,
-0.11867336183786392,
0.04020744562149048,
0.009071217849850655,
0.03673403337597847,
0.07123342156410217,
-0.17799417674541473,
0.13985425233840942,
0.01995917782187462,
-0.08525365591049194,
-0.1261877417564392,
-0.09330833703279495,
-0.06583726406097412,
0.0034301753621548414,
-0.04144933074712753,
-0.10418390482664108,
0.056813742965459824,
0.16812406480312347,
-0.01910712569952011,
0.01972406730055809,
0.0919310450553894,
-0.11123564094305038,
-0.1087668389081955,
0.07176108658313751,
0.03871253505349159,
0.0575585663318634,
0.10953356325626373,
-0.03601250424981117,
0.061373546719551086,
-0.027706338092684746,
0.10309193283319473,
-0.026973946020007133,
0.0799993947148323,
0.08343911916017532,
-0.00035453488817438483,
-0.04075346514582634,
-0.012720015831291676,
-0.0036792864557355642,
0.11499432474374771,
0.1418442577123642,
0.03633342683315277,
-0.061055418103933334,
-0.0018758362857624888,
0.17726178467273712,
-0.044708941131830215,
0.016941068693995476,
-0.1685720831155777,
0.3446650505065918,
0.1191728338599205,
0.038727037608623505,
0.018912119790911674,
-0.1031046137213707,
-0.008232804015278816,
0.19880232214927673,
0.04342161864042282,
0.0018499278230592608,
-0.01996784843504429,
-0.01937408745288849,
-0.0010374576086178422,
0.0411221943795681,
0.14297500252723694,
0.03864726796746254,
0.16541853547096252,
-0.022663533687591553,
0.07194458693265915,
-0.008355247788131237,
-0.0511120930314064,
-0.19181737303733826,
0.1322956532239914,
-0.004657169803977013,
0.038867849856615067,
-0.0880153626203537,
0.02702897973358631,
0.07718639820814133,
-0.3012603521347046,
0.010630256496369839,
-0.0449601374566555,
-0.11889731138944626,
-0.03679420426487923,
-0.07450620085000992,
-0.006804438307881355,
0.06022646278142929,
0.014036432839930058,
0.07171730697154999,
0.15745292603969574,
0.05802702158689499,
-0.00403905613347888,
-0.08716125786304474,
0.07702286541461945,
-0.04460606351494789,
0.20861732959747314,
0.037445034831762314,
-0.03544747829437256,
0.056453898549079895,
-0.006692178547382355,
-0.10662835836410522,
0.037967585027217865,
0.012707427144050598,
-0.06449481099843979,
0.05659729614853859,
0.19162039458751678,
-0.037212174385786057,
0.02494530938565731,
0.009372209198772907,
-0.05168024078011513,
0.056968435645103455,
-0.03648250922560692,
-0.05980907753109932,
-0.09117812663316727,
0.06441669166088104,
-0.07222232967615128,
0.14172670245170593,
0.2314234972000122,
-0.028566719964146614,
0.04472921043634415,
-0.03621656820178032,
-0.029375221580266953,
0.002653656993061304,
0.10926675796508789,
-0.034620679914951324,
-0.1260867863893509,
0.022294526919722557,
-0.15010546147823334,
0.044436000287532806,
-0.23859402537345886,
-0.06348317861557007,
0.05776021257042885,
-0.051632195711135864,
-0.023273853585124016,
0.11279182136058807,
-0.009782516397535801,
-0.03154107928276062,
-0.012262959033250809,
-0.095638707280159,
-0.012767912819981575,
0.058445606380701065,
-0.12986934185028076,
-0.10371333360671997
] |
null | null | transformers |
# ProtBert-BFD model
Pretrained model on protein sequences using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://doi.org/10.1101/2020.07.12.199554) and first released in
[this repository](https://github.com/agemagician/ProtTrans). This model is trained on uppercase amino acids: it only works with capital letter amino acids.
## Model description
ProtBert-BFD is based on the Bert model, which was pretrained on a large corpus of protein sequences in a self-supervised fashion.
This means it was pretrained on raw protein sequences only, with no human labelling of any kind (which is why it can use lots of
publicly available data), using an automatic process to generate inputs and labels from those protein sequences.
One important difference between our Bert model and the original Bert version is the way sequences are handled: each sequence is treated as a complete, separate document, so the next-sentence-prediction objective is not used.
The masking follows the original Bert training, randomly masking 15% of the amino acids in the input.
In the end, the features extracted from this model revealed that the LM-embeddings from unlabeled data (only protein sequences) captured important biophysical properties governing protein shape.
This implies that the model learned some of the grammar of the language of life realized in protein sequences.
## Intended uses & limitations
The model can be used for protein feature extraction or fine-tuned on downstream tasks.
We have noticed that for some tasks you can gain more accuracy by fine-tuning the model rather than using it as a feature extractor.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import BertForMaskedLM, BertTokenizer, pipeline
>>> tokenizer = BertTokenizer.from_pretrained('Rostlab/prot_bert_bfd', do_lower_case=False )
>>> model = BertForMaskedLM.from_pretrained("Rostlab/prot_bert_bfd")
>>> unmasker = pipeline('fill-mask', model=model, tokenizer=tokenizer)
>>> unmasker('D L I P T S S K L V V [MASK] D T S L Q V K K A F F A L V T')
[{'score': 0.1165614128112793,
'sequence': '[CLS] D L I P T S S K L V V L D T S L Q V K K A F F A L V T [SEP]',
'token': 5,
'token_str': 'L'},
{'score': 0.08976086974143982,
'sequence': '[CLS] D L I P T S S K L V V V D T S L Q V K K A F F A L V T [SEP]',
'token': 8,
'token_str': 'V'},
{'score': 0.08864385634660721,
'sequence': '[CLS] D L I P T S S K L V V S D T S L Q V K K A F F A L V T [SEP]',
'token': 10,
'token_str': 'S'},
{'score': 0.06227643042802811,
'sequence': '[CLS] D L I P T S S K L V V A D T S L Q V K K A F F A L V T [SEP]',
'token': 6,
'token_str': 'A'},
{'score': 0.06194969266653061,
'sequence': '[CLS] D L I P T S S K L V V T D T S L Q V K K A F F A L V T [SEP]',
'token': 15,
'token_str': 'T'}]
```
Here is how to use this model to get the features of a given protein sequence in PyTorch:
```python
from transformers import BertModel, BertTokenizer
import re

# The vocabulary is uppercase-only, so do_lower_case must stay False.
tokenizer = BertTokenizer.from_pretrained('Rostlab/prot_bert_bfd', do_lower_case=False)
model = BertModel.from_pretrained("Rostlab/prot_bert_bfd")

# Amino acids are given as single, space-separated uppercase letters.
sequence_Example = "A E T C Z A O"
# Map the rare/ambiguous amino acids U, Z, O and B to the unknown token X.
sequence_Example = re.sub(r"[UZOB]", "X", sequence_Example)

encoded_input = tokenizer(sequence_Example, return_tensors='pt')
output = model(**encoded_input)
```
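If you need one fixed-size vector per protein rather than per-residue embeddings, a common choice is to average the residue embeddings. The following sketch continues from the snippet above (`output` and `encoded_input` are assumed to exist); the mean-pooling step is our illustration and not part of the original model card:
```python
# Mean-pool the per-residue embeddings into a single vector per protein.
# The attention mask zeroes out padding positions (special tokens such as
# [CLS] and [SEP] are still included here).
hidden = output.last_hidden_state                     # (batch, seq_len, 1024)
mask = encoded_input['attention_mask'].unsqueeze(-1)  # (batch, seq_len, 1)
per_protein = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(per_protein.shape)                              # torch.Size([1, 1024])
```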
## Training data
The ProtBert-BFD model was pretrained on [BFD](https://bfd.mmseqs.com/), a dataset consisting of 2.1 billion protein sequences.
## Training procedure
### Preprocessing
The protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21.
The inputs of the model are then of the form:
```
[CLS] Protein Sequence A [SEP] Protein Sequence B [SEP]
```
Furthermore, each protein sequence was treated as a separate document.
The preprocessing step was performed twice, once for a combined length (2 sequences) of less than 512 amino acids, and another time using a combined length (2 sequences) of less than 2048 amino acids.
The details of the masking procedure for each sequence followed the original Bert model, as follows (a simplified sketch follows the list):
- 15% of the amino acids are masked.
- In 80% of the cases, the masked amino acids are replaced by `[MASK]`.
- In 10% of the cases, the masked amino acids are replaced by a random amino acid different from the one they replace.
- In the remaining 10% of cases, the masked amino acids are left as is.
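Here is a simplified sketch of this 80/10/10 scheme on a toy amino-acid sequence. It is our illustration of the procedure described above, not the actual pretraining code:
```python
import random

AMINO_ACIDS = list("ACDEFGHIKLMNPQRSTVWY")  # the 20 standard amino acids

def mask_sequence(tokens, mask_prob=0.15):
    """Mask ~15% of positions; of those, 80% become [MASK],
    10% a random different amino acid, and 10% stay unchanged."""
    masked = list(tokens)
    for i, token in enumerate(tokens):
        if random.random() < mask_prob:
            roll = random.random()
            if roll < 0.8:
                masked[i] = "[MASK]"
            elif roll < 0.9:
                masked[i] = random.choice([aa for aa in AMINO_ACIDS if aa != token])
            # else: leave the amino acid as is
    return masked

print(mask_sequence(list("METKVLILG")))
```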
### Pretraining
The model was trained on a single TPU Pod V3-1024 for one million steps in total.
800k steps with sequence length 512 (batch size 32k), and 200k steps with sequence length 2048 (batch size 6k).
The optimizer used was Lamb with a learning rate of 0.002, a weight decay of 0.01, learning-rate warmup for 140k steps, and linear decay of the learning rate afterwards.
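Written out, that schedule could look like the small function below. The numbers are taken from the description above, while the decay-to-zero endpoint is our assumption:
```python
def learning_rate(step, peak_lr=0.002, warmup_steps=140_000, total_steps=1_000_000):
    """Linear warmup to peak_lr, then linear decay (assumed to reach zero)."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

print(learning_rate(70_000))     # mid-warmup: 0.001
print(learning_rate(140_000))    # peak: 0.002
print(learning_rate(1_000_000))  # end of training: 0.0
```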
## Evaluation results
When fine-tuned on downstream tasks, this model achieves the following results:
Test results:
| Task/Dataset | secondary structure (3-states) | secondary structure (8-states) | Localization | Membrane |
|:-----:|:-----:|:-----:|:-----:|:-----:|
| CASP12 | 76 | 65 | | |
| TS115 | 84 | 73 | | |
| CB513 | 83 | 70 | | |
| DeepLoc | | | 78 | 91 |
### BibTeX entry and citation info
```bibtex
@article {Elnaggar2020.07.12.199554,
author = {Elnaggar, Ahmed and Heinzinger, Michael and Dallago, Christian and Rehawi, Ghalia and Wang, Yu and Jones, Llion and Gibbs, Tom and Feher, Tamas and Angerer, Christoph and Steinegger, Martin and BHOWMIK, DEBSINDHU and Rost, Burkhard},
title = {ProtTrans: Towards Cracking the Language of Life{\textquoteright}s Code Through Self-Supervised Deep Learning and High Performance Computing},
elocation-id = {2020.07.12.199554},
year = {2020},
doi = {10.1101/2020.07.12.199554},
publisher = {Cold Spring Harbor Laboratory},
abstract = {Computational biology and bioinformatics provide vast data gold-mines from protein sequences, ideal for Language Models (LMs) taken from Natural Language Processing (NLP). These LMs reach for new prediction frontiers at low inference costs. Here, we trained two auto-regressive language models (Transformer-XL, XLNet) and two auto-encoder models (Bert, Albert) on data from UniRef and BFD containing up to 393 billion amino acids (words) from 2.1 billion protein sequences (22- and 112 times the entire English Wikipedia). The LMs were trained on the Summit supercomputer at Oak Ridge National Laboratory (ORNL), using 936 nodes (total 5616 GPUs) and one TPU Pod (V3-512 or V3-1024). We validated the advantage of up-scaling LMs to larger models supported by bigger data by predicting secondary structure (3-states: Q3=76-84, 8 states: Q8=65-73), sub-cellular localization for 10 cellular compartments (Q10=74) and whether a protein is membrane-bound or water-soluble (Q2=89). Dimensionality reduction revealed that the LM-embeddings from unlabeled data (only protein sequences) captured important biophysical properties governing protein shape. This implied learning some of the grammar of the language of life realized in protein sequences. The successful up-scaling of protein LMs through HPC to larger data sets slightly reduced the gap between models trained on evolutionary information and LMs. Availability ProtTrans: \<a href="https://github.com/agemagician/ProtTrans"\>https://github.com/agemagician/ProtTrans\</a\>Competing Interest StatementThe authors have declared no competing interest.},
URL = {https://www.biorxiv.org/content/early/2020/07/21/2020.07.12.199554},
eprint = {https://www.biorxiv.org/content/early/2020/07/21/2020.07.12.199554.full.pdf},
journal = {bioRxiv}
}
```
> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
| {"language": "protein", "tags": ["protein language model"], "datasets": ["BFD"]} | fill-mask | Rostlab/prot_bert_bfd | [
"transformers",
"pytorch",
"tf",
"fill-mask",
"protein language model",
"dataset:BFD",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"protein"
] | TAGS
#transformers #pytorch #tf #fill-mask #protein language model #dataset-BFD #autotrain_compatible #endpoints_compatible #has_space #region-us
| ProtBert-BFD model
==================
Pretrained model on protein sequences using a masked language modeling (MLM) objective. It was introduced in
this paper and first released in
this repository. This model is trained on uppercase amino acids: it only works with capital letter amino acids.
Model description
-----------------
ProtBert-BFD is based on the Bert model, which was pretrained on a large corpus of protein sequences in a self-supervised fashion.
This means it was pretrained on raw protein sequences only, with no human labelling of any kind (which is why it can use lots of
publicly available data), using an automatic process to generate inputs and labels from those protein sequences.
One important difference between our Bert model and the original Bert version is the way sequences are handled: each sequence is treated as a complete, separate document, so the next-sentence-prediction objective is not used.
The masking follows the original Bert training, randomly masking 15% of the amino acids in the input.
In the end, the features extracted from this model revealed that the LM-embeddings from unlabeled data (only protein sequences) captured important biophysical properties governing protein shape.
This implies that the model learned some of the grammar of the language of life realized in protein sequences.
Intended uses & limitations
---------------------------
The model can be used for protein feature extraction or fine-tuned on downstream tasks.
We have noticed that for some tasks you can gain more accuracy by fine-tuning the model rather than using it as a feature extractor.
### How to use
You can use this model directly with a pipeline for masked language modeling:
Here is how to use this model to get the features of a given protein sequence in PyTorch:
Training data
-------------
The ProtBert-BFD model was pretrained on BFD, a dataset consisting of 2.1 billion protein sequences.
Training procedure
------------------
### Preprocessing
The protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21.
The inputs of the model are then of the form:
Furthermore, each protein sequence was treated as a separate document.
The preprocessing step was performed twice, once for a combined length (2 sequences) of less than 512 amino acids, and another time using a combined length (2 sequences) of less than 2048 amino acids.
The details of the masking procedure for each sequence followed the original Bert model, as follows:
* 15% of the amino acids are masked.
* In 80% of the cases, the masked amino acids are replaced by '[MASK]'.
* In 10% of the cases, the masked amino acids are replaced by a random amino acid different from the one they replace.
* In the remaining 10% of cases, the masked amino acids are left as is.
### Pretraining
The model was trained on a single TPU Pod V3-1024 for one million steps in total.
800k steps with sequence length 512 (batch size 32k), and 200k steps with sequence length 2048 (batch size 6k).
The optimizer used was Lamb with a learning rate of 0.002, a weight decay of 0.01, learning-rate warmup for 140k steps, and linear decay of the learning rate afterwards.
Evaluation results
------------------
When fine-tuned on downstream tasks, this model achieves the following results:
Test results:
### BibTeX entry and citation info
>
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
>
>
>
| [
"### How to use\n\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\nHere is how to use this model to get the features of a given protein sequence in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe ProtBert-BFD model was pretrained on BFD, a dataset consisting of 2.1 billion protein sequences.\n\n\nTraining procedure\n------------------",
"### Preprocessing\n\n\nThe protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21.\nThe inputs of the model are then of the form:\n\n\nFurthermore, each protein sequence was treated as a separate document.\nThe preprocessing step was performed twice, once for a combined length (2 sequences) of less than 512 amino acids, and another time using a combined length (2 sequences) of less than 2048 amino acids.\n\n\nThe details of the masking procedure for each sequence followed the original Bert model as following:\n\n\n* 15% of the amino acids are masked.\n* In 80% of the cases, the masked amino acids are replaced by '[MASK]'.\n* In 10% of the cases, the masked amino acids are replaced by a random amino acid (different) from the one they replace.\n* In the 10% remaining cases, the masked amino acids are left as is.",
"### Pretraining\n\n\nThe model was trained on a single TPU Pod V3-1024 for one million steps in total.\n800k steps using sequence length 512 (batch size 32k), and 200K steps using sequence length 2048 (batch size 6k).\nThe optimizer used is Lamb with a learning rate of 0.002, a weight decay of 0.01, learning rate warmup for 140k steps and linear decay of the learning rate after.\n\n\nEvaluation results\n------------------\n\n\nWhen fine-tuned on downstream tasks, this model achieves the following results:\n\n\nTest results :",
"### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #tf #fill-mask #protein language model #dataset-BFD #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"### How to use\n\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\nHere is how to use this model to get the features of a given protein sequence in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe ProtBert-BFD model was pretrained on BFD, a dataset consisting of 2.1 billion protein sequences.\n\n\nTraining procedure\n------------------",
"### Preprocessing\n\n\nThe protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21.\nThe inputs of the model are then of the form:\n\n\nFurthermore, each protein sequence was treated as a separate document.\nThe preprocessing step was performed twice, once for a combined length (2 sequences) of less than 512 amino acids, and another time using a combined length (2 sequences) of less than 2048 amino acids.\n\n\nThe details of the masking procedure for each sequence followed the original Bert model as following:\n\n\n* 15% of the amino acids are masked.\n* In 80% of the cases, the masked amino acids are replaced by '[MASK]'.\n* In 10% of the cases, the masked amino acids are replaced by a random amino acid (different) from the one they replace.\n* In the 10% remaining cases, the masked amino acids are left as is.",
"### Pretraining\n\n\nThe model was trained on a single TPU Pod V3-1024 for one million steps in total.\n800k steps using sequence length 512 (batch size 32k), and 200K steps using sequence length 2048 (batch size 6k).\nThe optimizer used is Lamb with a learning rate of 0.002, a weight decay of 0.01, learning rate warmup for 140k steps and linear decay of the learning rate after.\n\n\nEvaluation results\n------------------\n\n\nWhen fine-tuned on downstream tasks, this model achieves the following results:\n\n\nTest results :",
"### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>"
] | [
52,
83,
213,
129,
34
] | [
"passage: TAGS\n#transformers #pytorch #tf #fill-mask #protein language model #dataset-BFD #autotrain_compatible #endpoints_compatible #has_space #region-us \n### How to use\n\n\nYou can use this model directly with a pipeline for masked language modeling:\n\n\nHere is how to use this model to get the features of a given protein sequence in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe ProtBert-BFD model was pretrained on BFD, a dataset consisting of 2.1 billion protein sequences.\n\n\nTraining procedure\n------------------### Preprocessing\n\n\nThe protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21.\nThe inputs of the model are then of the form:\n\n\nFurthermore, each protein sequence was treated as a separate document.\nThe preprocessing step was performed twice, once for a combined length (2 sequences) of less than 512 amino acids, and another time using a combined length (2 sequences) of less than 2048 amino acids.\n\n\nThe details of the masking procedure for each sequence followed the original Bert model as following:\n\n\n* 15% of the amino acids are masked.\n* In 80% of the cases, the masked amino acids are replaced by '[MASK]'.\n* In 10% of the cases, the masked amino acids are replaced by a random amino acid (different) from the one they replace.\n* In the 10% remaining cases, the masked amino acids are left as is.### Pretraining\n\n\nThe model was trained on a single TPU Pod V3-1024 for one million steps in total.\n800k steps using sequence length 512 (batch size 32k), and 200K steps using sequence length 2048 (batch size 6k).\nThe optimizer used is Lamb with a learning rate of 0.002, a weight decay of 0.01, learning rate warmup for 140k steps and linear decay of the learning rate after.\n\n\nEvaluation results\n------------------\n\n\nWhen fine-tuned on downstream tasks, this model achieves the following results:\n\n\nTest results :"
] | [
-0.01156269945204258,
0.08984851092100143,
-0.0023207135964185,
0.018617898225784302,
0.07892068475484848,
-0.006360352039337158,
-0.029230792075395584,
0.09339917451143265,
-0.04678703099489212,
0.08648265898227692,
0.03561675548553467,
-0.03500574454665184,
0.098223477602005,
0.18664278090000153,
0.09612672030925751,
-0.24025966227054596,
0.0403016060590744,
-0.037797536700963974,
-0.002217791508883238,
0.10385067760944366,
0.08497177809476852,
-0.07040588557720184,
0.03678419813513756,
0.0003578548494260758,
0.04656415805220604,
0.018595729023218155,
0.014471842907369137,
-0.029964301735162735,
0.11345046758651733,
0.0364818200469017,
0.06071886047720909,
0.024417128413915634,
0.07137656956911087,
-0.1314495950937271,
0.030249856412410736,
0.12292620539665222,
0.0176065806299448,
0.030873682349920273,
-0.014257146045565605,
0.13041682541370392,
0.09733482450246811,
-0.048243723809719086,
0.03132699057459831,
0.08396927267313004,
-0.06819196045398712,
-0.13088205456733704,
-0.09183142334222794,
0.0027466313913464546,
0.010281603783369064,
0.05288044735789299,
-0.021001305431127548,
0.03145061433315277,
0.012238812632858753,
0.0679147019982338,
0.19220231473445892,
-0.2804511487483978,
0.016049791127443314,
-0.017963316291570663,
0.015332131646573544,
0.016941191628575325,
-0.006918907165527344,
0.02319052442908287,
0.008563322946429253,
-0.01530170626938343,
0.08804747462272644,
0.009214375168085098,
0.07008098810911179,
-0.06777719408273697,
-0.11469168215990067,
-0.06880444288253784,
0.0039184740744531155,
0.02244601957499981,
-0.15850040316581726,
-0.12801331281661987,
-0.11527050286531448,
-0.15385226905345917,
-0.02930678427219391,
-0.08645861595869064,
0.03577897697687149,
0.03126181662082672,
0.09497224539518356,
-0.08733020722866058,
-0.07990139722824097,
-0.049455560743808746,
-0.05162515118718147,
0.18988196551799774,
0.08819492161273956,
0.013353751040995121,
0.02635604329407215,
0.039253849536180496,
-0.15824884176254272,
-0.04060792177915573,
-0.10395734757184982,
-0.0329236276447773,
-0.16058377921581268,
-0.013954904861748219,
-0.027431869879364967,
-0.3037794530391693,
-0.09201854467391968,
0.14764992892742157,
-0.032333265990018845,
0.0583031140267849,
-0.029903003945946693,
-0.05773981660604477,
-0.02843347378075123,
0.16195286810398102,
-0.08860833197832108,
-0.00821191631257534,
0.01823842152953148,
0.06678533554077148,
-0.027483822777867317,
0.00734127638861537,
0.036407917737960815,
-0.00016807364590931684,
0.023675372824072838,
0.016339410096406937,
-0.04900868609547615,
0.03679248318076134,
-0.062397878617048264,
-0.019684407860040665,
0.11441363394260406,
-0.13320831954479218,
-0.03893079236149788,
-0.013668653555214405,
-0.036491475999355316,
0.05511603504419327,
0.13130876421928406,
-0.11514610052108765,
-0.0859251543879509,
0.03533525764942169,
-0.0730479508638382,
-0.08945644646883011,
-0.0995144471526146,
-0.07988554239273071,
-0.022346381098031998,
-0.056232403963804245,
-0.08014806360006332,
-0.07516580075025558,
-0.14336827397346497,
-0.03944280743598938,
0.024171672761440277,
0.01145444717258215,
-0.06351737678050995,
0.05478152260184288,
0.044039420783519745,
-0.0448732003569603,
-0.008053507655858994,
0.051362622529268265,
0.00985496211796999,
0.06614068150520325,
-0.014260450378060341,
0.09014206379652023,
0.04880793020129204,
0.04866461083292961,
-0.06595022976398468,
0.049959227442741394,
-0.21145308017730713,
0.06574635952711105,
-0.028151115402579308,
-0.1054178774356842,
-0.11379958689212799,
-0.03862529247999191,
-0.14715448021888733,
-0.021853892132639885,
0.03382555767893791,
0.06558219343423843,
-0.1789463311433792,
-0.049445897340774536,
0.241697758436203,
-0.07237536460161209,
0.06689534336328506,
0.13323485851287842,
-0.010249864310026169,
0.041062526404857635,
0.13552071154117584,
0.04910768195986748,
0.07199590653181076,
-0.1622132658958435,
-0.0846245288848877,
-0.009586031548678875,
-0.04553743451833725,
0.15814806520938873,
0.05744823068380356,
-0.05043216049671173,
-0.018910856917500496,
0.05105282738804817,
0.034711096435785294,
-0.030881615355610847,
0.000992273329757154,
-0.04966655373573303,
0.01245166640728712,
-0.03339023143053055,
0.050377871841192245,
-0.03256223723292351,
-0.013619796372950077,
0.006940235383808613,
-0.1313856989145279,
-0.07494072616100311,
0.10370417684316635,
-0.0807812362909317,
0.09203191101551056,
-0.024547874927520752,
0.008717780001461506,
0.018260657787322998,
0.01604262925684452,
-0.17064467072486877,
-0.009470517747104168,
0.06821031868457794,
-0.12821629643440247,
0.040637221187353134,
-0.06931771337985992,
0.026959871873259544,
0.08151938766241074,
-0.00921032764017582,
0.06732555478811264,
-0.08353111892938614,
-0.03276616334915161,
-0.12620003521442413,
-0.0432305634021759,
-0.05674319341778755,
-0.019324466586112976,
0.1114874854683876,
-0.03074478730559349,
0.006637245416641235,
-0.07177864760160446,
0.0781199187040329,
0.021684713661670685,
-0.08383753895759583,
-0.017256662249565125,
0.018109649419784546,
0.007226055953651667,
-0.01764179766178131,
-0.03937605395913124,
-0.008096547797322273,
-0.04873747378587723,
0.0683879628777504,
-0.1220083087682724,
-0.12623271346092224,
0.040809325873851776,
0.18122000992298126,
-0.12022892385721207,
0.042085278779268265,
-0.022283021360635757,
-0.023321721702814102,
-0.06964711844921112,
0.01258031465113163,
0.24052205681800842,
0.000293106772005558,
0.1546049565076828,
-0.06733845919370651,
0.08281267434358597,
0.013916577212512493,
0.06421840935945511,
-0.023994075134396553,
0.013175892643630505,
0.04772839695215225,
-0.1116756722331047,
0.030322499573230743,
-0.04891568794846535,
0.13184231519699097,
0.1198127418756485,
0.0320955328643322,
-0.16234901547431946,
-0.0312558189034462,
-0.012319290079176426,
-0.004057029262185097,
0.08303172141313553,
0.06008022278547287,
0.05869041383266449,
0.02011287584900856,
0.021712977439165115,
0.006702012848109007,
-0.06886004656553268,
0.102275051176548,
0.13140280544757843,
-0.07478123158216476,
0.0359988771378994,
-0.09576036781072617,
-0.010804083198308945,
0.1028563603758812,
0.08588141202926636,
0.06031377986073494,
-0.0633934885263443,
0.014810419641435146,
-0.06363600492477417,
0.20175059139728546,
-0.06741148233413696,
-0.30800941586494446,
-0.1677691489458084,
0.031835027039051056,
0.0069078244268894196,
0.04462984576821327,
-0.008516313508152962,
0.014789878390729427,
-0.096510149538517,
-0.09041008353233337,
0.01608312875032425,
0.009804390370845795,
0.06569699943065643,
0.04278844967484474,
-0.020698463544249535,
0.09570778906345367,
-0.10925976186990738,
0.011930477805435658,
-0.03402173891663551,
-0.022309159860014915,
-0.03264211490750313,
0.009926869533956051,
0.10149277001619339,
0.040028758347034454,
-0.06275370717048645,
-0.031147900968790054,
-0.017705056816339493,
0.09547705948352814,
-0.05606804043054581,
0.06615111976861954,
0.09712997078895569,
-0.06850645691156387,
0.05996919795870781,
0.047317784279584885,
-0.0015851386124268174,
-0.011874382384121418,
0.03229832649230957,
0.1132107824087143,
-0.08571995794773102,
-0.1224365234375,
-0.010019185952842236,
-0.0174395851790905,
0.00035839813062921166,
0.09858353435993195,
0.06647380441427231,
-0.007057383190840483,
-0.02724592387676239,
-0.06410019099712372,
-0.024954989552497864,
0.00715989526361227,
0.038311511278152466,
-0.018114563077688217,
-0.0413055494427681,
0.0780019760131836,
-0.036155689507722855,
0.010436182841658592,
0.11869676411151886,
0.039190568029880524,
0.1626899540424347,
-0.039866771548986435,
0.25571101903915405,
0.024433961138129234,
0.025174196809530258,
0.04207126051187515,
0.05267517268657684,
-0.003816822078078985,
0.019537445157766342,
-0.04281817749142647,
-0.04592154920101166,
-0.05152976140379906,
0.020719358697533607,
-0.0011085112346336246,
-0.006361097097396851,
-0.03968160226941109,
0.08162964880466461,
0.028198353946208954,
0.30318260192871094,
0.027685416862368584,
-0.18460668623447418,
-0.049324095249176025,
-0.0458403155207634,
-0.12230028212070465,
-0.06300459802150726,
-0.009962543845176697,
-0.005725496914237738,
-0.08965174108743668,
0.0517435148358345,
-0.0795605331659317,
0.07668591290712357,
-0.06903073936700821,
0.016056343913078308,
0.03846699371933937,
0.04843488708138466,
-0.02950292080640793,
0.05348048731684685,
-0.18401242792606354,
0.08250246942043304,
0.00899413600564003,
0.08476877212524414,
-0.08287779241800308,
0.054916996508836746,
-0.0236527007073164,
-0.013764206320047379,
0.16963894665241241,
-0.029074663296341896,
-0.07836823910474777,
-0.08202792704105377,
-0.16477081179618835,
0.013740899972617626,
0.0875040739774704,
0.04830826818943024,
0.16516093909740448,
-0.050950825214385986,
0.0043266271241009235,
0.004901907406747341,
0.10542649030685425,
-0.11165251582860947,
-0.13552412390708923,
0.05395248159766197,
-0.07496875524520874,
-0.0005104842712171376,
-0.04039618745446205,
-0.05251818895339966,
-0.0949162170290947,
0.20786450803279877,
-0.060392167419195175,
-0.0581228993833065,
-0.15375539660453796,
0.003956967033445835,
0.08703215420246124,
-0.09767705202102661,
0.07855411618947983,
0.0062630963511765,
0.1060827299952507,
-0.0835823267698288,
-0.16949215531349182,
0.044210318475961685,
0.042047951370477676,
-0.09874582290649414,
-0.02941909059882164,
0.062162332236766815,
0.06611469388008118,
0.06668360531330109,
0.009967278689146042,
0.035850364714860916,
0.03701459616422653,
-0.0364881232380867,
-0.024379104375839233,
0.00486451992765069,
0.10709080100059509,
0.07488378137350082,
-0.1193070188164711,
-0.03126285597681999,
-0.0535365529358387,
0.09122462570667267,
0.08187543600797653,
0.30006805062294006,
-0.09051653742790222,
0.18135981261730194,
0.10067086666822433,
-0.08346014469861984,
-0.20699556171894073,
-0.05192247033119202,
0.0794176459312439,
0.11949338763952255,
0.05330737680196762,
-0.25468721985816956,
-0.016887230798602104,
0.10657765716314316,
-0.04261994734406471,
0.04278479889035225,
-0.22485332190990448,
-0.15831516683101654,
0.03269293159246445,
0.024062683805823326,
0.11649376899003983,
-0.05521710589528084,
0.04677724093198776,
0.01598983071744442,
0.02400522492825985,
0.038955412805080414,
0.10835210233926773,
0.17286111414432526,
-0.0006216324982233346,
-0.1326451450586319,
0.06728353351354599,
-0.03201397880911827,
0.09043089300394058,
-0.07541405409574509,
0.00263786967843771,
-0.005972376558929682,
0.12120852619409561,
0.05113691836595535,
-0.001730276271700859,
0.137107253074646,
0.043029554188251495,
0.023604171350598335,
-0.061904195696115494,
-0.09512083977460861,
-0.05240769684314728,
0.025157609954476357,
-0.025084689259529114,
-0.0389849953353405,
-0.06097262352705002,
0.06384459882974625,
0.11148238182067871,
-0.041052695363759995,
0.010146821849048138,
-0.03848888725042343,
0.006745587103068829,
0.16818080842494965,
-0.012171709910035133,
-0.007519922219216824,
-0.15435226261615753,
0.08917977660894394,
0.0005947841564193368,
0.04177430272102356,
-0.060742657631635666,
0.06372005492448807,
0.054340943694114685,
0.00799813587218523,
0.10903149843215942,
0.003893334185704589,
-0.16270847618579865,
-0.04924063757061958,
0.045161642134189606,
-0.1490447223186493,
-0.19672513008117676,
-0.011707552708685398,
-0.27038535475730896,
-0.15001146495342255,
-0.07206787914037704,
0.08889558911323547,
-0.0647742971777916,
-0.03675718978047371,
-0.004752523731440306,
0.06519085168838501,
0.007662263233214617,
0.12271294742822647,
0.026068931445479393,
-0.004813032690435648,
-0.05404322221875191,
0.1386926919221878,
0.08879461139440536,
-0.14190827310085297,
0.0530727319419384,
0.0776677131652832,
-0.08742182701826096,
0.006090705748647451,
-0.01667429879307747,
-0.044327229261398315,
0.06274436414241791,
-0.002379209967330098,
-0.12735433876514435,
-0.029415994882583618,
0.09731993824243546,
0.04549604654312134,
0.04375843331217766,
0.0501466765999794,
-0.0399969108402729,
0.014315842650830746,
-0.06806828081607819,
0.11056284606456757,
0.0012989247916266322,
0.05578816309571266,
0.004082994535565376,
0.16010817885398865,
-0.036159541457891464,
0.0856601819396019,
-0.04833774268627167,
0.009191452525556087,
-0.020197711884975433,
-0.04285704717040062,
-0.0625082328915596,
-0.051663171499967575,
-0.03644958883523941,
-0.064904123544693,
-0.011100645177066326,
0.04153960198163986,
0.025051945820450783,
0.04976598918437958,
-0.04435319826006889,
-0.07116173952817917,
-0.06129981204867363,
-0.018001902848482132,
-0.10415168106555939,
0.01836416684091091,
0.021100178360939026,
-0.038026049733161926,
0.04903357848525047,
-0.015058706514537334,
-0.00483736302703619,
0.028708694502711296,
0.16178329288959503,
-0.0012202891521155834,
0.03950154781341553,
0.03234439715743065,
-0.029148388653993607,
-0.10178226977586746,
0.04179476201534271,
-0.006827183533459902,
-0.011048202402889729,
-0.05482868477702141,
0.03413832560181618,
-0.11895937472581863,
-0.017233997583389282,
0.003392984624952078,
0.08988714963197708,
-0.012383764609694481,
0.0068330843932926655,
-0.013830093666911125,
0.06184716150164604,
0.09697604179382324,
-0.03532096743583679,
0.000039128022763179615,
-0.1477840542793274,
-0.015793897211551666,
-0.026836059987545013,
-0.058760132640600204,
0.05954231694340706,
-0.06671489030122757,
0.030430998653173447,
0.04238602891564369,
0.07338021695613861,
0.03375890851020813,
0.04708760231733322,
0.05854842811822891,
-0.08164644986391068,
0.009401723742485046,
0.023556625470519066,
0.09561176598072052,
0.007783590350300074,
-0.07340066879987717,
0.10144788026809692,
0.059237830340862274,
0.039756257086992264,
0.11560090631246567,
0.18801100552082062,
0.14656168222427368,
0.10984993726015091,
0.030360128730535507,
-0.08804178237915039,
-0.0619952492415905,
-0.23294644057750702,
-0.02154955454170704,
-0.13340839743614197,
0.062266793102025986,
0.006986054591834545,
0.0020580000709742308,
0.08592012524604797,
-0.1485086977481842,
0.18580621480941772,
0.035669147968292236,
-0.07173333317041397,
-0.13307516276836395,
-0.12508445978164673,
-0.07812300324440002,
0.04271593689918518,
-0.045140400528907776,
-0.10454965382814407,
0.034331247210502625,
0.1185927614569664,
-0.010846421122550964,
0.00885460339486599,
0.16559508442878723,
-0.11742659658193588,
-0.10574612021446228,
0.07752237468957901,
0.029108397662639618,
0.02669115737080574,
0.06957894563674927,
-0.01638580672442913,
0.06177531182765961,
-0.025784054771065712,
0.10009404271841049,
-0.03257731720805168,
0.07889214158058167,
0.0905904546380043,
0.007182667031884193,
-0.04803235083818436,
-0.01902890019118786,
-0.021548505872488022,
0.08139774948358536,
0.12160500138998032,
0.007552308961749077,
-0.06729952991008759,
-0.018888434395194054,
0.1262059062719345,
-0.015295937657356262,
-0.03431857004761696,
-0.1781952679157257,
0.2680184245109558,
0.1449841558933258,
0.007699898444116116,
0.02798532322049141,
-0.08024168759584427,
0.016443172469735146,
0.19398804008960724,
0.05231786519289017,
-0.004563263617455959,
0.015389448031783104,
-0.022254373878240585,
0.002171710366383195,
0.05962440371513367,
0.14419183135032654,
0.013269105926156044,
0.13602185249328613,
-0.0024823753628879786,
0.07720258086919785,
-0.012016123160719872,
-0.04408411309123039,
-0.1764928698539734,
0.14487136900424957,
0.010693148709833622,
0.01156062912195921,
-0.06038088724017143,
-0.027421552687883377,
0.05921454727649689,
-0.269948810338974,
0.014759929850697517,
-0.07899691164493561,
-0.1313123255968094,
-0.04671011492609978,
-0.09217443317174911,
-0.013749762438237667,
0.046103380620479584,
0.021506309509277344,
0.07559118419885635,
0.20072536170482635,
0.05226966366171837,
-0.03931974247097969,
-0.061006247997283936,
0.06615142524242401,
-0.06813761591911316,
0.21732892096042633,
0.048620108515024185,
-0.05649067461490631,
0.04628031328320503,
-0.03290092200040817,
-0.11308667808771133,
0.034937188029289246,
-0.0013769092038273811,
-0.0626072883605957,
0.04602065682411194,
0.17255017161369324,
-0.03637838736176491,
0.0580994188785553,
-0.03168630972504616,
-0.03632034361362457,
0.04544949159026146,
-0.10636793822050095,
-0.056163012981414795,
-0.04640762880444527,
0.06995406001806259,
-0.071384958922863,
0.13614098727703094,
0.22232212126255035,
0.012153004296123981,
0.03089085780084133,
-0.07415134459733963,
-0.008418766781687737,
0.05994832515716553,
0.11350243538618088,
-0.05479724332690239,
-0.12584809958934784,
-0.027598561719059944,
-0.08450943231582642,
0.0341927707195282,
-0.25462064146995544,
-0.08817891776561737,
0.03692420572042465,
-0.07059018313884735,
-0.017088232561945915,
0.10117113590240479,
0.022254562005400658,
-0.022491108626127243,
-0.002242153976112604,
-0.08461793512105942,
0.006515670567750931,
0.04800250008702278,
-0.16465109586715698,
-0.08773943781852722
] |
null | null | transformers | {"tags": ["summarization"], "widget": [{"text": "predict protein ms : Met Gly Leu Pro Val Ser Trp Ala Pro Pro Ala Leu"}]} | summarization | Rostlab/prot_t5_base_mt_uniref50 | [
"transformers",
"pytorch",
"jax",
"t5",
"text2text-generation",
"summarization",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #text2text-generation #summarization #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| [] | [
"TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #summarization #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
55
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #summarization #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] | [
-0.00885684508830309,
0.0366438552737236,
-0.007386410143226385,
-0.003055740613490343,
0.1814001351594925,
0.02287779562175274,
0.07567945867776871,
0.13352173566818237,
-0.04327563941478729,
-0.03615926206111908,
0.11628758907318115,
0.18869814276695251,
0.005391064565628767,
0.08821596205234528,
-0.10431814938783646,
-0.2872014045715332,
0.038051314651966095,
0.05368693172931671,
-0.02907458320260048,
0.12956306338310242,
0.09080680459737778,
-0.06368096172809601,
0.09022143483161926,
-0.046358730643987656,
-0.15574465692043304,
0.052884653210639954,
0.04959751293063164,
-0.13848528265953064,
0.09714929014444351,
0.05436805635690689,
0.08493895083665848,
0.03899681195616722,
-0.05121820420026779,
-0.15080517530441284,
0.038050681352615356,
0.020033882930874825,
-0.07023341208696365,
0.05648427829146385,
0.11998283863067627,
-0.08822090178728104,
0.153213769197464,
0.01453525759279728,
-0.016363533213734627,
0.06299133598804474,
-0.15228042006492615,
-0.0001669172488618642,
-0.0026921776589006186,
-0.013369261287152767,
0.08426505327224731,
0.08311229199171066,
-0.022362152114510536,
0.09492908418178558,
-0.08439143002033234,
0.12984050810337067,
0.10704697668552399,
-0.29728609323501587,
-0.01523333415389061,
0.061825282871723175,
0.05360575392842293,
0.06870020180940628,
-0.027181552723050117,
0.05179692432284355,
0.03325923532247543,
0.018869342282414436,
0.035680629312992096,
-0.07821083068847656,
-0.16661933064460754,
0.04373058304190636,
-0.08571289479732513,
-0.06616699695587158,
0.28041398525238037,
-0.045259494334459305,
0.06799546629190445,
-0.023925846442580223,
-0.12461716681718826,
-0.04113958403468132,
-0.029575755819678307,
0.01196159329265356,
-0.04596603289246559,
0.06887394189834595,
0.0171765498816967,
-0.03376072645187378,
-0.13003811240196228,
-0.006765075959265232,
-0.17553037405014038,
0.10473687946796417,
0.007915402762591839,
0.046713557094335556,
-0.2371778041124344,
0.07822654396295547,
0.03939322754740715,
-0.10687591880559921,
0.074310801923275,
-0.08738629519939423,
-0.009369096718728542,
-0.01906667649745941,
-0.06367124617099762,
-0.19376051425933838,
0.07443945109844208,
0.06703539937734604,
-0.01491136197000742,
0.046592287719249725,
-0.045419927686452866,
0.07944696396589279,
0.06078286096453667,
0.06684102863073349,
-0.010317256674170494,
-0.04960169643163681,
0.03323304280638695,
-0.10671478509902954,
0.0018692502053454518,
-0.06663831323385239,
-0.12212975323200226,
-0.03243971988558769,
0.07321830838918686,
0.0944797545671463,
0.016429534181952477,
0.1007981076836586,
-0.04228510335087776,
-0.04001845046877861,
-0.03327737748622894,
-0.08496276289224625,
-0.02632840722799301,
-0.0008716568117961287,
0.02530737593770027,
0.1396273970603943,
-0.00012885816977359354,
0.002080426551401615,
-0.16593721508979797,
0.07773100584745407,
-0.09181670844554901,
-0.014909065328538418,
-0.02382843755185604,
-0.06725116819143295,
0.022760072723031044,
-0.08758915960788727,
0.013587563298642635,
-0.15148308873176575,
-0.1534709632396698,
0.01380071323364973,
0.014643745496869087,
-0.03060569427907467,
-0.061917971819639206,
-0.05510208383202553,
-0.03876190632581711,
0.05815538018941879,
-0.051538337022066116,
0.04025772586464882,
-0.06449186056852341,
0.10185105353593826,
-0.05131280794739723,
0.06529854983091354,
-0.10736452788114548,
0.08057296276092529,
-0.1326865255832672,
-0.020127009600400925,
-0.06797417253255844,
0.08430767059326172,
0.022369416430592537,
0.12999486923217773,
-0.0364568792283535,
-0.025630049407482147,
-0.09417803585529327,
0.044804200530052185,
-0.028364818543195724,
0.2156178057193756,
-0.14391346275806427,
-0.10312804579734802,
0.2209349423646927,
-0.08481465280056,
-0.14649082720279694,
0.09437653422355652,
0.009394344873726368,
0.06302239000797272,
0.08797243237495422,
0.20587287843227386,
0.02240793965756893,
0.02021733857691288,
0.0803072527050972,
0.12008228898048401,
-0.09437653422355652,
-0.06446485966444016,
0.015015698969364166,
-0.0012401269050315022,
-0.13677416741847992,
0.053223900496959686,
0.12715870141983032,
0.06097990646958351,
-0.05352640151977539,
-0.03980959579348564,
-0.03372880071401596,
0.014481410384178162,
0.059147484600543976,
0.0014127178583294153,
0.12782199680805206,
-0.04820133373141289,
-0.0068925064988434315,
-0.029158946126699448,
-0.020520100370049477,
-0.02729082480072975,
0.041360970586538315,
-0.025169244036078453,
0.11583849042654037,
-0.029906777665019035,
0.06689642369747162,
-0.21490301191806793,
-0.059421125799417496,
-0.010023992508649826,
0.14354479312896729,
-0.009165200404822826,
0.06721485406160355,
0.030399898067116737,
-0.043772801756858826,
-0.024209056049585342,
0.0022827230859547853,
0.15543591976165771,
-0.012907243333756924,
-0.07269752025604248,
-0.08003406226634979,
0.06101473420858383,
-0.06188555434346199,
-0.00918099656701088,
-0.0757790356874466,
0.01630331203341484,
0.046828512102365494,
0.11903370171785355,
0.02741044946014881,
0.04630012810230255,
0.014543166384100914,
0.021485356613993645,
-0.07282950729131699,
0.013228030875325203,
0.10868684947490692,
-0.004010590258985758,
-0.08133392781019211,
0.1996903121471405,
-0.15968745946884155,
0.2144346684217453,
0.18702711164951324,
-0.2789956331253052,
0.011766656301915646,
-0.04732691869139671,
-0.02946089766919613,
0.009907240979373455,
0.03673785552382469,
-0.033642809838056564,
0.06828907877206802,
-0.006844087969511747,
0.2053433209657669,
-0.07292551547288895,
-0.04903656244277954,
-0.002701889956369996,
-0.019038669764995575,
-0.023080525919795036,
0.0839122012257576,
0.11540234088897705,
-0.22249069809913635,
0.15859244763851166,
0.2290647327899933,
0.050982993096113205,
0.19977931678295135,
-0.022952701896429062,
-0.03984607383608818,
0.0870615541934967,
-0.003991079982370138,
-0.04685705155134201,
-0.08904483169317245,
-0.18961551785469055,
-0.01042428333312273,
0.07238123565912247,
0.04010198637843132,
0.10292918235063553,
-0.08897455036640167,
-0.026495572179555893,
-0.015726637095212936,
0.000966463063377887,
-0.04804627224802971,
0.09804097563028336,
0.07672636210918427,
0.14655740559101105,
-0.023051520809531212,
0.005915218032896519,
0.09642703086137772,
-0.006020356900990009,
-0.10184919834136963,
0.20170597732067108,
-0.12592126429080963,
-0.3428530693054199,
-0.1488613784313202,
-0.10579457879066467,
-0.024466020986437798,
0.024677829816937447,
0.10793636739253998,
-0.08800173550844193,
-0.027735648676753044,
-0.024138297885656357,
0.07717590779066086,
-0.10694387555122375,
0.029257260262966156,
-0.0860510990023613,
0.08137315511703491,
-0.06497011333703995,
-0.07838308811187744,
-0.034999363124370575,
-0.021908728405833244,
-0.027432085946202278,
0.1444745510816574,
-0.13158516585826874,
0.062029916793107986,
0.18466304242610931,
-0.005730915814638138,
0.04125441983342171,
-0.04910963773727417,
0.16709963977336884,
-0.07740995287895203,
0.0015752078033983707,
0.1796632707118988,
-0.053762488067150116,
0.06383250653743744,
0.12461172044277191,
-0.01265552919358015,
-0.0839187279343605,
0.04428728669881821,
-0.0100388303399086,
-0.07807334512472153,
-0.26518213748931885,
-0.105221688747406,
-0.1466415673494339,
0.10710249096155167,
0.06443322449922562,
0.056462280452251434,
0.08046701550483704,
0.05456065014004707,
-0.014095601625740528,
0.033481042832136154,
0.025666946545243263,
0.08081959187984467,
0.19546131789684296,
-0.015092683956027031,
0.12192701548337936,
-0.05570659786462784,
-0.1242058053612709,
0.07870502024888992,
0.030435826629400253,
0.12274514138698578,
0.0453457236289978,
0.07258615642786026,
0.004480099305510521,
0.04591178148984909,
0.12799008190631866,
0.15667922794818878,
0.02719089947640896,
-0.001011176616884768,
-0.05672479048371315,
-0.033047184348106384,
-0.04990791156888008,
0.031959060579538345,
0.02852315828204155,
-0.11695997416973114,
-0.10655515640974045,
-0.052021801471710205,
0.08757030218839645,
0.133547842502594,
0.06719835847616196,
-0.20948073267936707,
0.022415613755583763,
0.08956102281808853,
-0.06065613031387329,
-0.12222155183553696,
0.09725290536880493,
0.025816408917307854,
-0.11204691976308823,
0.07995595037937164,
-0.05511505901813507,
0.14026959240436554,
-0.037842996418476105,
0.09889085590839386,
-0.0577358603477478,
-0.0763547495007515,
0.013694165274500847,
0.11438287049531937,
-0.28264468908309937,
0.20326559245586395,
0.005834352225065231,
-0.0677131786942482,
-0.08859919011592865,
-0.0022602968383580446,
0.003716197097674012,
0.10254153609275818,
0.11365391314029694,
-0.0034065591171383858,
-0.0940663143992424,
-0.045518405735492706,
-0.004580033477395773,
0.03922364488244057,
0.1304946094751358,
-0.025354577228426933,
0.0028823099564760923,
-0.061068251729011536,
-0.00738386670127511,
-0.025994108989834785,
-0.024813441559672356,
-0.0018560424214228988,
-0.18784096837043762,
0.06720361113548279,
0.015354257076978683,
0.060203347355127335,
0.015409635379910469,
-0.015143143944442272,
-0.02193106710910797,
0.20049403607845306,
-0.0372215136885643,
-0.0902707651257515,
-0.12633317708969116,
-0.03203250467777252,
0.06858786195516586,
-0.05740797147154808,
0.032117486000061035,
-0.06839781999588013,
0.028322305530309677,
-0.04979604482650757,
-0.2391272783279419,
0.1272689700126648,
-0.0834110826253891,
-0.02836892567574978,
-0.05505044013261795,
0.19139181077480316,
-0.1027015894651413,
0.004776536021381617,
0.009735398925840855,
0.00979637075215578,
-0.07956475764513016,
-0.06500625610351562,
-0.00189224723726511,
-0.013768031261861324,
0.0410882793366909,
0.06857594102621078,
-0.094396211206913,
-0.05948323383927345,
-0.03094465844333172,
-0.013977794907987118,
0.31960809230804443,
0.11230971664190292,
-0.034313395619392395,
0.16429506242275238,
0.10480558127164841,
-0.09528433531522751,
-0.3014781177043915,
-0.09246442466974258,
-0.08899301290512085,
-0.013309202156960964,
-0.015870267525315285,
-0.16556738317012787,
0.06339729577302933,
-0.008961092680692673,
0.019375422969460487,
0.08768292516469955,
-0.2594638466835022,
-0.08448109030723572,
0.1171906441450119,
0.011714508756995201,
0.32450491189956665,
-0.12076475471258163,
-0.09582061320543289,
-0.04573403671383858,
-0.12682710587978363,
0.18270690739154816,
-0.07042720913887024,
0.09528783708810806,
-0.02888043038547039,
0.07287296652793884,
0.05558346211910248,
-0.026862580329179764,
0.06185968220233917,
0.016074758023023605,
0.0006267539574764669,
-0.08578988164663315,
-0.06263881921768188,
0.057814836502075195,
-0.0008509132312610745,
0.02299005538225174,
-0.08451405167579651,
0.04845539107918739,
-0.1260489672422409,
-0.030153527855873108,
-0.0786549523472786,
0.040309783071279526,
0.02821129374206066,
-0.04789082705974579,
0.0231937188655138,
-0.062297821044921875,
0.028930025175213814,
-0.008066940121352673,
0.2589820623397827,
-0.051961030811071396,
0.1632540374994278,
0.1588483601808548,
0.14186622202396393,
-0.13833339512348175,
0.037340156733989716,
-0.06510274857282639,
-0.04957971349358559,
0.0769677683711052,
-0.14205431938171387,
0.0736270323395729,
0.1253831684589386,
-0.02496907114982605,
0.06151547655463219,
0.11367810517549515,
0.026967117562890053,
-0.017746184021234512,
0.14865785837173462,
-0.24367953836917877,
-0.03359508141875267,
-0.09440615028142929,
-0.032016053795814514,
0.02834210731089115,
0.04582155495882034,
0.17334885895252228,
0.02302214689552784,
-0.02441374398767948,
0.006560794543474913,
0.02024407684803009,
-0.033218517899513245,
0.08112181723117828,
-0.013843647204339504,
0.036134570837020874,
-0.13280481100082397,
0.11878669261932373,
0.050327908247709274,
-0.1899414360523224,
0.0038877432234585285,
0.2005687654018402,
-0.13774177432060242,
… (remaining values of this 768-dimensional embedding vector omitted) ] |
||
null | null | transformers |
# ProtT5-XL-BFD model
Pretrained model on protein sequences using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://doi.org/10.1101/2020.07.12.199554) and first released in
[this repository](https://github.com/agemagician/ProtTrans). This model is trained on uppercase amino acids: it only works with capital letter amino acids.
## Model description
ProtT5-XL-BFD is based on the `t5-3b` model and was pretrained on a large corpus of protein sequences in a self-supervised fashion.
This means it was pretrained on the raw protein sequences only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those protein sequences.
One important difference between this T5 model and the original T5 version is the denoising objective.
The original T5-3B model was pretrained using a span denoising objective, while this model was pre-trained with a BART-like MLM denoising objective.
The masking probability is consistent with the original T5 training: 15% of the amino acids in the input are randomly masked.
It has been shown that the features extracted from this self-supervised model (LM-embeddings) captured important biophysical properties governing protein shape.
This implied learning some of the grammar of the language of life realized in protein sequences.
## Intended uses & limitations
The model could be used for protein feature extraction or to be fine-tuned on downstream tasks.
We have noticed that on some tasks one can gain more accuracy by fine-tuning the model rather than using it as a feature extractor.
We have also noticed that for feature extraction it is better to use the features extracted from the encoder rather than from the decoder.
### How to use
Here is how to use this model to extract the features of a given protein sequence in PyTorch:
```python
from transformers import T5Tokenizer, T5Model
import re
import torch

tokenizer = T5Tokenizer.from_pretrained('Rostlab/prot_t5_xl_bfd', do_lower_case=False)
model = T5Model.from_pretrained("Rostlab/prot_t5_xl_bfd")

sequences_Example = ["A E T C Z A O", "S K T Z P"]
# map the rare/ambiguous amino acids U, Z, O, B to X
sequences_Example = [re.sub(r"[UZOB]", "X", sequence) for sequence in sequences_Example]

ids = tokenizer.batch_encode_plus(sequences_Example, add_special_tokens=True, padding=True)
input_ids = torch.tensor(ids['input_ids'])
attention_mask = torch.tensor(ids['attention_mask'])

with torch.no_grad():
    # T5Model also needs decoder inputs; feeding the input ids back in
    # is sufficient when we only want the hidden states
    embedding = model(input_ids=input_ids,
                      attention_mask=attention_mask,
                      decoder_input_ids=input_ids)

# For feature extraction we recommend using the encoder embedding
encoder_embedding = embedding.encoder_last_hidden_state.cpu().numpy()
decoder_embedding = embedding.last_hidden_state.cpu().numpy()
```
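If a single fixed-size vector per protein is needed, one simple option is to mean-pool the encoder states over the non-padding positions. This is a minimal sketch, not part of the original recipe; it reuses `encoder_embedding` and `attention_mask` from the snippet above:
```python
# average encoder states over real tokens only (padding excluded via the attention mask;
# note this simple sketch still includes the EOS position in the average)
mask = attention_mask.unsqueeze(-1).numpy()                              # (batch, seq_len, 1)
per_protein = (encoder_embedding * mask).sum(axis=1) / mask.sum(axis=1)  # (batch, 1024)
```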
## Training data
The ProtT5-XL-BFD model was pretrained on [BFD](https://bfd.mmseqs.com/), a dataset consisting of 2.1 billion protein sequences.
## Training procedure
### Preprocessing
The protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids "U,Z,O,B" were mapped to "X".
The inputs of the model are then of the form:
```
Protein Sequence [EOS]
```
The preprocessing step was performed on the fly, by cutting and padding the protein sequences up to 512 tokens.
The details of the masking procedure for each sequence are as follows (a code sketch is given after this list):
- 15% of the amino acids are masked.
- In 90% of the cases, the masked amino acids are replaced by the `[MASK]` token.
- In 10% of the cases, the masked amino acids are replaced by a random amino acid (different from the one they replace).
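For illustration only (this is not the original training code), the corruption step described above can be sketched as follows; the amino-acid alphabet and the `[MASK]` string are stand-ins for the actual vocabulary entries:
```python
import random

def corrupt(residues, mask_token="[MASK]", alphabet="ACDEFGHIKLMNPQRSTVWYX"):
    """Sketch of the masking scheme: each residue is selected with probability 15%;
    of the selected ones, 90% become the mask token and 10% a different random amino acid."""
    corrupted = list(residues)
    for i, aa in enumerate(corrupted):
        if random.random() < 0.15:
            if random.random() < 0.90:
                corrupted[i] = mask_token
            else:
                corrupted[i] = random.choice([x for x in alphabet if x != aa])
    return corrupted

print(corrupt("MKTAYIAKQR"))
```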
### Pretraining
The model was trained on a single TPU Pod V3-1024 for 1.2 million steps in total, using sequence length 512 (batch size 4k).
It has a total of approximately 3B parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
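For reference, the inverse square root schedule from the T5 setup can be written as below; the 10,000-step warm-up is the T5 default and is an assumption here, not a value stated in the card:
```python
def inverse_sqrt_lr(step, warmup_steps=10_000):
    # constant 1/sqrt(warmup_steps) during warm-up, then decay as 1/sqrt(step)
    return max(step, warmup_steps) ** -0.5

for step in (1, 10_000, 100_000, 1_200_000):
    print(step, inverse_sqrt_lr(step))
```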
## Evaluation results
When the model is used for feature extraction, it achieves the following results:
Test results:
| Task/Dataset | secondary structure (3-states) | secondary structure (8-states) | Localization | Membrane |
|:-----:|:-----:|:-----:|:-----:|:-----:|
| CASP12 | 77 | 66 | | |
| TS115 | 85 | 74 | | |
| CB513 | 84 | 71 | | |
| DeepLoc | | | 77 | 91 |
### BibTeX entry and citation info
```bibtex
@article {Elnaggar2020.07.12.199554,
author = {Elnaggar, Ahmed and Heinzinger, Michael and Dallago, Christian and Rehawi, Ghalia and Wang, Yu and Jones, Llion and Gibbs, Tom and Feher, Tamas and Angerer, Christoph and Steinegger, Martin and BHOWMIK, DEBSINDHU and Rost, Burkhard},
title = {ProtTrans: Towards Cracking the Language of Life{\textquoteright}s Code Through Self-Supervised Deep Learning and High Performance Computing},
elocation-id = {2020.07.12.199554},
year = {2020},
doi = {10.1101/2020.07.12.199554},
publisher = {Cold Spring Harbor Laboratory},
abstract = {Computational biology and bioinformatics provide vast data gold-mines from protein sequences, ideal for Language Models (LMs) taken from Natural Language Processing (NLP). These LMs reach for new prediction frontiers at low inference costs. Here, we trained two auto-regressive language models (Transformer-XL, XLNet) and two auto-encoder models (Bert, Albert) on data from UniRef and BFD containing up to 393 billion amino acids (words) from 2.1 billion protein sequences (22- and 112 times the entire English Wikipedia). The LMs were trained on the Summit supercomputer at Oak Ridge National Laboratory (ORNL), using 936 nodes (total 5616 GPUs) and one TPU Pod (V3-512 or V3-1024). We validated the advantage of up-scaling LMs to larger models supported by bigger data by predicting secondary structure (3-states: Q3=76-84, 8 states: Q8=65-73), sub-cellular localization for 10 cellular compartments (Q10=74) and whether a protein is membrane-bound or water-soluble (Q2=89). Dimensionality reduction revealed that the LM-embeddings from unlabeled data (only protein sequences) captured important biophysical properties governing protein shape. This implied learning some of the grammar of the language of life realized in protein sequences. The successful up-scaling of protein LMs through HPC to larger data sets slightly reduced the gap between models trained on evolutionary information and LMs. Availability ProtTrans: \<a href="https://github.com/agemagician/ProtTrans"\>https://github.com/agemagician/ProtTrans\</a\>Competing Interest StatementThe authors have declared no competing interest.},
URL = {https://www.biorxiv.org/content/early/2020/07/21/2020.07.12.199554},
eprint = {https://www.biorxiv.org/content/early/2020/07/21/2020.07.12.199554.full.pdf},
journal = {bioRxiv}
}
```
> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
| {"language": "protein", "tags": ["protein language model"], "datasets": ["BFD"]} | text2text-generation | Rostlab/prot_t5_xl_bfd | [
"transformers",
"pytorch",
"tf",
"t5",
"text2text-generation",
"protein language model",
"dataset:BFD",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"protein"
] | TAGS
#transformers #pytorch #tf #t5 #text2text-generation #protein language model #dataset-BFD #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
| ProtT5-XL-BFD model
===================
Pretrained model on protein sequences using a masked language modeling (MLM) objective. It was introduced in
this paper and first released in
this repository. This model is trained on uppercase amino acids: it only works with capital letter amino acids.
Model description
-----------------
ProtT5-XL-BFD is based on the 't5-3b' model and was pretrained on a large corpus of protein sequences in a self-supervised fashion.
This means it was pretrained on the raw protein sequences only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those protein sequences.
One important difference between this T5 model and the original T5 version is the denoising objective.
The original T5-3B model was pretrained using a span denoising objective, while this model was pre-trained with a BART-like MLM denoising objective.
The masking probability is consistent with the original T5 training: 15% of the amino acids in the input are randomly masked.
It has been shown that the features extracted from this self-supervised model (LM-embeddings) captured important biophysical properties governing protein shape.
This implied learning some of the grammar of the language of life realized in protein sequences.
Intended uses & limitations
---------------------------
The model could be used for protein feature extraction or to be fine-tuned on downstream tasks.
We have noticed that on some tasks one can gain more accuracy by fine-tuning the model rather than using it as a feature extractor.
We have also noticed that for feature extraction it is better to use the features extracted from the encoder rather than from the decoder.
### How to use
Here is how to use this model to extract the features of a given protein sequence in PyTorch:
Training data
-------------
The ProtT5-XL-BFD model was pretrained on BFD, a dataset consisting of 2.1 billion protein sequences.
Training procedure
------------------
### Preprocessing
The protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids "U,Z,O,B" were mapped to "X".
The inputs of the model are then of the form:
The preprocessing step was performed on the fly, by cutting and padding the protein sequences up to 512 tokens.
The details of the masking procedure for each sequence are as follows:
* 15% of the amino acids are masked.
* In 90% of the cases, the masked amino acids are replaced by the '[MASK]' token.
* In 10% of the cases, the masked amino acids are replaced by a random amino acid (different from the one they replace).
### Pretraining
The model was trained on a single TPU Pod V3-1024 for 1.2 million steps in total, using sequence length 512 (batch size 4k).
It has a total of approximately 3B parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
When the model is used for feature extraction, it achieves the following results:
Test results:
### BibTeX entry and citation info
>
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to extract the features of a given protein sequence in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe ProtT5-XL-BFD model was pretrained on BFD, a dataset consisting of 2.1 billion protein sequences.\n\n\nTraining procedure\n------------------",
"### Preprocessing\n\n\nThe protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids \"U,Z,O,B\" were mapped to \"X\".\nThe inputs of the model are then of the form:\n\n\nThe preprocessing step was performed on the fly, by cutting and padding the protein sequences up to 512 tokens.\n\n\nThe details of the masking procedure for each sequence are as follows:\n\n\n* 15% of the amino acids are masked.\n* In 90% of the cases, the masked amino acids are replaced by '[MASK]' token.\n* In 10% of the cases, the masked amino acids are replaced by a random amino acid (different) from the one they replace.",
"### Pretraining\n\n\nThe model was trained on a single TPU Pod V3-1024 for 1.2 million steps in total, using sequence length 512 (batch size 4k).\nIt has a total of approximately 3B parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for feature etraction, this model achieves the following results:\n\n\nTest results :",
"### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #tf #t5 #text2text-generation #protein language model #dataset-BFD #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to extract the features of a given protein sequence in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe ProtT5-XL-BFD model was pretrained on BFD, a dataset consisting of 2.1 billion protein sequences.\n\n\nTraining procedure\n------------------",
"### Preprocessing\n\n\nThe protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids \"U,Z,O,B\" were mapped to \"X\".\nThe inputs of the model are then of the form:\n\n\nThe preprocessing step was performed on the fly, by cutting and padding the protein sequences up to 512 tokens.\n\n\nThe details of the masking procedure for each sequence are as follows:\n\n\n* 15% of the amino acids are masked.\n* In 90% of the cases, the masked amino acids are replaced by '[MASK]' token.\n* In 10% of the cases, the masked amino acids are replaced by a random amino acid (different) from the one they replace.",
"### Pretraining\n\n\nThe model was trained on a single TPU Pod V3-1024 for 1.2 million steps in total, using sequence length 512 (batch size 4k).\nIt has a total of approximately 3B parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for feature etraction, this model achieves the following results:\n\n\nTest results :",
"### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>"
] | [
62,
67,
176,
112,
34
] | [
"passage: TAGS\n#transformers #pytorch #tf #t5 #text2text-generation #protein language model #dataset-BFD #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to extract the features of a given protein sequence in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe ProtT5-XL-BFD model was pretrained on BFD, a dataset consisting of 2.1 billion protein sequences.\n\n\nTraining procedure\n------------------### Preprocessing\n\n\nThe protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids \"U,Z,O,B\" were mapped to \"X\".\nThe inputs of the model are then of the form:\n\n\nThe preprocessing step was performed on the fly, by cutting and padding the protein sequences up to 512 tokens.\n\n\nThe details of the masking procedure for each sequence are as follows:\n\n\n* 15% of the amino acids are masked.\n* In 90% of the cases, the masked amino acids are replaced by '[MASK]' token.\n* In 10% of the cases, the masked amino acids are replaced by a random amino acid (different) from the one they replace.### Pretraining\n\n\nThe model was trained on a single TPU Pod V3-1024 for 1.2 million steps in total, using sequence length 512 (batch size 4k).\nIt has a total of approximately 3B parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for feature etraction, this model achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>"
] | [
… (768-dimensional embedding vector omitted) ] |
null | null | transformers |
# ProtT5-XL-UniRef50 model
Pretrained model on protein sequences using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://doi.org/10.1101/2020.07.12.199554) and first released in
[this repository](https://github.com/agemagician/ProtTrans). This model is trained on uppercase amino acids: it only works with capital letter amino acids.
## Model description
ProtT5-XL-UniRef50 is based on the `t5-3b` model and was pretrained on a large corpus of protein sequences in a self-supervised fashion.
This means it was pretrained on the raw protein sequences only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those protein sequences.
One important difference between this T5 model and the original T5 version is the denoising objective.
The original T5-3B model was pretrained using a span denoising objective, while this model was pre-trained with a BART-like MLM denoising objective.
The masking probability is consistent with the original T5 training: 15% of the amino acids in the input are randomly masked.
It has been shown that the features extracted from this self-supervised model (LM-embeddings) captured important biophysical properties governing protein shape.
This implied learning some of the grammar of the language of life realized in protein sequences.
## Intended uses & limitations
The model could be used for protein feature extraction or to be fine-tuned on downstream tasks.
We have noticed that on some tasks one can gain more accuracy by fine-tuning the model rather than using it as a feature extractor.
We have also noticed that for feature extraction it is better to use the features extracted from the encoder rather than from the decoder.
### How to use
Here is how to use this model to extract the features of a given protein sequence in PyTorch:
```python
from transformers import T5Tokenizer, T5EncoderModel
import re
import torch

device = torch.device('cuda:0' if torch.cuda.is_available() else 'cpu')
tokenizer = T5Tokenizer.from_pretrained('Rostlab/prot_t5_xl_uniref50', do_lower_case=False)
# T5EncoderModel loads only the encoder, matching the recommendation above
model = T5EncoderModel.from_pretrained("Rostlab/prot_t5_xl_uniref50").to(device)

sequence_examples = ["PRTEINO", "SEQWENCE"]
# this will replace all rare/ambiguous amino acids by X and introduce white-space between all amino acids
sequence_examples = [" ".join(list(re.sub(r"[UZOB]", "X", sequence))) for sequence in sequence_examples]
# tokenize sequences and pad up to the longest sequence in the batch
ids = tokenizer.batch_encode_plus(sequence_examples, add_special_tokens=True, padding="longest")
input_ids = torch.tensor(ids['input_ids']).to(device)
attention_mask = torch.tensor(ids['attention_mask']).to(device)
# generate embeddings
with torch.no_grad():
    embedding_repr = model(input_ids=input_ids, attention_mask=attention_mask)
# extract embeddings for the first ([0,:]) sequence in the batch while removing padded & special tokens ([0,:7])
emb_0 = embedding_repr.last_hidden_state[0,:7] # shape (7 x 1024)
print(f"Shape of per-residue embedding of first sequence: {emb_0.shape}")
# do the same for the second ([1,:]) sequence in the batch while taking into account different sequence lengths ([1,:8])
emb_1 = embedding_repr.last_hidden_state[1,:8] # shape (8 x 1024)
# if you want to derive a single representation (per-protein embedding) for the whole protein
emb_0_per_protein = emb_0.mean(dim=0) # shape (1024)
print(f"Shape of per-protein embedding of first sequence: {emb_0_per_protein.shape}")
```
## Training data
The ProtT5-XL-UniRef50 model was pretrained on [UniRef50](https://www.uniprot.org/help/uniref), a dataset consisting of 45 million protein sequences.
## Training procedure
### Preprocessing
The protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids "U,Z,O,B" were mapped to "X".
The inputs of the model are then of the form:
```
Protein Sequence [EOS]
```
The preprocessing step was performed on the fly, by cutting and padding the protein sequences up to 512 tokens.
The details of the masking procedure for each sequence are as follows:
- 15% of the amino acids are masked.
- In 90% of the cases, the masked amino acids are replaced by the `[MASK]` token.
- In 10% of the cases, the masked amino acids are replaced by a random amino acid (different from the one they replace).
### Pretraining
The model was trained on a single TPU Pod V2-256 for 991.5 thousand steps in total, using sequence length 512 (batch size 2k).
It was trained using the ProtT5-XL-BFD model as an initial checkpoint, rather than training from scratch.
It has a total of approximately 3B parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
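As a minimal sketch of the warm-start idea (not the procedure actually used for pre-training, which ran on a TPU Pod), initializing from the BFD checkpoint with the Hugging Face API could look like this:
```python
from transformers import T5ForConditionalGeneration

# hypothetical warm-start: load the ProtT5-XL-BFD weights as the initial checkpoint,
# then continue the MLM denoising pre-training on UniRef50 batches from here
model = T5ForConditionalGeneration.from_pretrained("Rostlab/prot_t5_xl_bfd")
model.save_pretrained("./prot_t5_xl_uniref50_init")
```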
## Evaluation results
When the model is used for feature extraction, this model achieves the following results:
Test results:
| Task/Dataset | secondary structure (3-states) | secondary structure (8-states) | Localization | Membrane |
|:-----:|:-----:|:-----:|:-----:|:-----:|
| CASP12 | 81 | 70 | | |
| TS115 | 87 | 77 | | |
| CB513 | 86 | 74 | | |
| DeepLoc | | | 81 | 91 |
### BibTeX entry and citation info
```bibtex
@article {Elnaggar2020.07.12.199554,
author = {Elnaggar, Ahmed and Heinzinger, Michael and Dallago, Christian and Rehawi, Ghalia and Wang, Yu and Jones, Llion and Gibbs, Tom and Feher, Tamas and Angerer, Christoph and Steinegger, Martin and BHOWMIK, DEBSINDHU and Rost, Burkhard},
title = {ProtTrans: Towards Cracking the Language of Life{\textquoteright}s Code Through Self-Supervised Deep Learning and High Performance Computing},
elocation-id = {2020.07.12.199554},
year = {2020},
doi = {10.1101/2020.07.12.199554},
publisher = {Cold Spring Harbor Laboratory},
abstract = {Computational biology and bioinformatics provide vast data gold-mines from protein sequences, ideal for Language Models (LMs) taken from Natural Language Processing (NLP). These LMs reach for new prediction frontiers at low inference costs. Here, we trained two auto-regressive language models (Transformer-XL, XLNet) and two auto-encoder models (Bert, Albert) on data from UniRef and BFD containing up to 393 billion amino acids (words) from 2.1 billion protein sequences (22- and 112 times the entire English Wikipedia). The LMs were trained on the Summit supercomputer at Oak Ridge National Laboratory (ORNL), using 936 nodes (total 5616 GPUs) and one TPU Pod (V3-512 or V3-1024). We validated the advantage of up-scaling LMs to larger models supported by bigger data by predicting secondary structure (3-states: Q3=76-84, 8 states: Q8=65-73), sub-cellular localization for 10 cellular compartments (Q10=74) and whether a protein is membrane-bound or water-soluble (Q2=89). Dimensionality reduction revealed that the LM-embeddings from unlabeled data (only protein sequences) captured important biophysical properties governing protein shape. This implied learning some of the grammar of the language of life realized in protein sequences. The successful up-scaling of protein LMs through HPC to larger data sets slightly reduced the gap between models trained on evolutionary information and LMs. Availability ProtTrans: \<a href="https://github.com/agemagician/ProtTrans"\>https://github.com/agemagician/ProtTrans\</a\>Competing Interest StatementThe authors have declared no competing interest.},
URL = {https://www.biorxiv.org/content/early/2020/07/21/2020.07.12.199554},
eprint = {https://www.biorxiv.org/content/early/2020/07/21/2020.07.12.199554.full.pdf},
journal = {bioRxiv}
}
```
> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
| {"tags": ["protein language model"], "datasets": ["UniRef50"]} | text2text-generation | Rostlab/prot_t5_xl_uniref50 | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"protein language model",
"dataset:UniRef50",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #t5 #text2text-generation #protein language model #dataset-UniRef50 #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
| ProtT5-XL-UniRef50 model
========================
Pretrained model on protein sequences using a masked language modeling (MLM) objective. It was introduced in
this paper and first released in
this repository. This model is trained on uppercase amino acids: it only works with capital letter amino acids.
Model description
-----------------
ProtT5-XL-UniRef50 is based on the 't5-3b' model and was pretrained on a large corpus of protein sequences in a self-supervised fashion.
This means it was pretrained on the raw protein sequences only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those protein sequences.
One important difference between this T5 model and the original T5 version is the denoising objective.
The original T5-3B model was pretrained using a span denoising objective, while this model was pre-trained with a BART-like MLM denoising objective.
The masking probability is consistent with the original T5 training: 15% of the amino acids in the input are randomly masked.
It has been shown that the features extracted from this self-supervised model (LM-embeddings) captured important biophysical properties governing protein shape.
This implied learning some of the grammar of the language of life realized in protein sequences.
Intended uses & limitations
---------------------------
The model could be used for protein feature extraction or to be fine-tuned on downstream tasks.
We have noticed that on some tasks one can gain more accuracy by fine-tuning the model rather than using it as a feature extractor.
We have also noticed that for feature extraction it is better to use the features extracted from the encoder rather than from the decoder.
### How to use
Here is how to use this model to extract the features of a given protein sequence in PyTorch:
Training data
-------------
The ProtT5-XL-UniRef50 model was pretrained on UniRef50, a dataset consisting of 45 million protein sequences.
Training procedure
------------------
### Preprocessing
The protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids "U,Z,O,B" were mapped to "X".
The inputs of the model are then of the form:
The preprocessing step was performed on the fly, by cutting and padding the protein sequences up to 512 tokens.
The details of the masking procedure for each sequence are as follows:
* 15% of the amino acids are masked.
* In 90% of the cases, the masked amino acids are replaced by the '[MASK]' token.
* In 10% of the cases, the masked amino acids are replaced by a random amino acid (different from the one they replace).
### Pretraining
The model was trained on a single TPU Pod V2-256 for 991.5 thousand steps in total, using sequence length 512 (batch size 2k).
It was trained using the ProtT5-XL-BFD model as an initial checkpoint, rather than training from scratch.
It has a total of approximately 3B parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
When the model is used for feature extraction, this model achieves the following results:
Test results:
### BibTeX entry and citation info
>
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to extract the features of a given protein sequence in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe ProtT5-XL-UniRef50 model was pretrained on UniRef50, a dataset consisting of 45 million protein sequences.\n\n\nTraining procedure\n------------------",
"### Preprocessing\n\n\nThe protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids \"U,Z,O,B\" were mapped to \"X\".\nThe inputs of the model are then of the form:\n\n\nThe preprocessing step was performed on the fly, by cutting and padding the protein sequences up to 512 tokens.\n\n\nThe details of the masking procedure for each sequence are as follows:\n\n\n* 15% of the amino acids are masked.\n* In 90% of the cases, the masked amino acids are replaced by '[MASK]' token.\n* In 10% of the cases, the masked amino acids are replaced by a random amino acid (different) from the one they replace.",
"### Pretraining\n\n\nThe model was trained on a single TPU Pod V2-256 for 991.5 thousand steps in total, using sequence length 512 (batch size 2k).\nIt was trained using ProtT5-XL-BFD model as an initial checkpoint, rather than training from scratch. \n\nIt has a total of approximately 3B parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for feature extraction, this model achieves the following results:\n\n\nTest results :",
"### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #t5 #text2text-generation #protein language model #dataset-UniRef50 #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to extract the features of a given protein sequence in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe ProtT5-XL-UniRef50 model was pretrained on UniRef50, a dataset consisting of 45 million protein sequences.\n\n\nTraining procedure\n------------------",
"### Preprocessing\n\n\nThe protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids \"U,Z,O,B\" were mapped to \"X\".\nThe inputs of the model are then of the form:\n\n\nThe preprocessing step was performed on the fly, by cutting and padding the protein sequences up to 512 tokens.\n\n\nThe details of the masking procedure for each sequence are as follows:\n\n\n* 15% of the amino acids are masked.\n* In 90% of the cases, the masked amino acids are replaced by '[MASK]' token.\n* In 10% of the cases, the masked amino acids are replaced by a random amino acid (different) from the one they replace.",
"### Pretraining\n\n\nThe model was trained on a single TPU Pod V2-256 for 991.5 thousand steps in total, using sequence length 512 (batch size 2k).\nIt was trained using ProtT5-XL-BFD model as an initial checkpoint, rather than training from scratch. \n\nIt has a total of approximately 3B parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for feature extraction, this model achieves the following results:\n\n\nTest results :",
"### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>"
] | [
64,
69,
176,
138,
34
] | [
"passage: TAGS\n#transformers #pytorch #t5 #text2text-generation #protein language model #dataset-UniRef50 #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to extract the features of a given protein sequence in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe ProtT5-XL-UniRef50 model was pretrained on UniRef50, a dataset consisting of 45 million protein sequences.\n\n\nTraining procedure\n------------------### Preprocessing\n\n\nThe protein sequences are uppercased and tokenized using a single space and a vocabulary size of 21. The rare amino acids \"U,Z,O,B\" were mapped to \"X\".\nThe inputs of the model are then of the form:\n\n\nThe preprocessing step was performed on the fly, by cutting and padding the protein sequences up to 512 tokens.\n\n\nThe details of the masking procedure for each sequence are as follows:\n\n\n* 15% of the amino acids are masked.\n* In 90% of the cases, the masked amino acids are replaced by '[MASK]' token.\n* In 10% of the cases, the masked amino acids are replaced by a random amino acid (different) from the one they replace.### Pretraining\n\n\nThe model was trained on a single TPU Pod V2-256 for 991.5 thousand steps in total, using sequence length 512 (batch size 2k).\nIt was trained using ProtT5-XL-BFD model as an initial checkpoint, rather than training from scratch. \n\nIt has a total of approximately 3B parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for feature extraction, this model achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>"
] | [
… (768-dimensional embedding vector omitted) …
0.027386244386434555,
0.039490990340709686,
-0.0623769648373127,
0.01089417189359665,
0.0179708544164896,
0.11586574465036392,
0.04196261987090111,
-0.1433779001235962,
-0.04715268686413765,
-0.11913297325372696,
0.049834687262773514,
0.08481425046920776,
0.18961574137210846,
-0.04152446612715721,
0.1398567408323288,
0.13561803102493286,
-0.13528524339199066,
-0.1802421361207962,
-0.04730541259050369,
0.022399064153432846,
0.053266093134880066,
0.06159890815615654,
-0.3008040487766266,
-0.015762923285365105,
0.04789956286549568,
-0.042556438595056534,
0.010202305391430855,
-0.1552947461605072,
-0.13301780819892883,
0.05067120119929314,
0.03179343044757843,
0.06487097591161728,
-0.09040378779172897,
-0.017089437693357468,
-0.0016681902343407273,
0.03133614361286163,
0.06179776042699814,
0.11320929229259491,
0.14754994213581085,
-0.04211285710334778,
-0.04631093516945839,
0.036073096096515656,
-0.03167354315519333,
0.024894729256629944,
-0.025074096396565437,
0.04171920567750931,
-0.00399295287206769,
0.04056143760681152,
-0.022135907784104347,
-0.03996363282203674,
0.1265886425971985,
0.04407153278589249,
0.053752534091472626,
-0.08907204121351242,
-0.060334768146276474,
-0.049214620143175125,
0.05628223344683647,
-0.014609342440962791,
-0.05245446786284447,
-0.09514889866113663,
0.0986955389380455,
0.1377984881401062,
-0.03424975275993347,
0.06924823671579361,
-0.032116178423166275,
0.04194936156272888,
0.21794486045837402,
-0.017544781789183617,
0.04018661752343178,
-0.06720906496047974,
0.10093645006418228,
-0.026142841205000877,
0.023106997832655907,
-0.035370633006095886,
0.07081878185272217,
0.07443156838417053,
-0.012344622053205967,
0.14145904779434204,
0.018225228413939476,
-0.159966379404068,
-0.00956330168992281,
0.04001418873667717,
-0.1353205144405365,
-0.11446981132030487,
-0.01597476750612259,
-0.18551334738731384,
-0.1350187212228775,
-0.02441900409758091,
0.08198460191488266,
-0.06253736466169357,
-0.041951026767492294,
0.00975262001156807,
0.074464812874794,
0.01906776800751686,
0.11952131986618042,
0.051117926836013794,
-0.020086362957954407,
-0.03444754704833031,
0.19360579550266266,
0.0827452763915062,
-0.11369738727807999,
0.0878603607416153,
0.12745238840579987,
-0.06657145172357559,
0.01809050142765045,
0.006634427234530449,
0.043631136417388916,
0.024476123973727226,
-0.04613318294286728,
-0.12781089544296265,
0.0036327799316495657,
0.1018877923488617,
0.05079605057835579,
0.01784900762140751,
0.051283594220876694,
-0.025703061372041702,
0.01250346377491951,
-0.05147361010313034,
0.10831615328788757,
0.013447542674839497,
0.06799089908599854,
-0.01641320250928402,
0.0875631645321846,
-0.021770069375634193,
0.07214675843715668,
-0.03612902760505676,
0.04324685037136078,
-0.07923438400030136,
-0.04789341986179352,
0.02271859161555767,
-0.017192987725138664,
-0.10350196808576584,
-0.06663916260004044,
-0.018718209117650986,
0.009720584377646446,
0.01784740388393402,
0.0677478089928627,
-0.027095051482319832,
-0.08266828954219818,
-0.06061870977282524,
0.03927174583077431,
-0.1460612714290619,
0.02534063160419464,
0.03162359818816185,
0.005422429647296667,
0.062236249446868896,
-0.03848915547132492,
-0.03139008581638336,
0.031534768640995026,
0.01431294996291399,
0.02567139081656933,
-0.002961866557598114,
-0.0020772081334143877,
-0.03774918243288994,
-0.15795078873634338,
-0.003094753483310342,
-0.022756781429052353,
-0.0349242277443409,
-0.06404431909322739,
0.08629867434501648,
-0.11641573160886765,
-0.030423365533351898,
-0.04991639778017998,
0.07666970044374466,
-0.023417944088578224,
0.04954972490668297,
0.017599528655409813,
0.007938402704894543,
0.08098313957452774,
-0.04777166619896889,
0.04764838516712189,
-0.13829191029071808,
0.008308540098369122,
0.02533023990690708,
-0.03516409173607826,
0.030987940728664398,
-0.06280723959207535,
0.03977592661976814,
0.005325853358954191,
0.09891250729560852,
0.011731332167983055,
-0.019524313509464264,
0.04381328448653221,
-0.10928087681531906,
-0.08150185644626617,
0.005089950747787952,
0.11394742876291275,
-0.05695963650941849,
-0.09844233095645905,
0.09925319999456406,
0.0426788404583931,
0.021775782108306885,
0.13419540226459503,
0.23435306549072266,
0.18171213567256927,
0.12877966463565826,
0.04875822737812996,
-0.05972748249769211,
-0.038804203271865845,
-0.1703074872493744,
0.018232468515634537,
-0.0534411296248436,
0.015626952052116394,
0.025046207010746002,
0.040010735392570496,
0.10064426809549332,
-0.15266691148281097,
0.18247346580028534,
0.04568926990032196,
-0.07481778413057327,
-0.10719442367553711,
-0.1650555580854416,
-0.0807899683713913,
-0.018803006038069725,
-0.04267987236380577,
-0.12860924005508423,
0.046910449862480164,
0.16074799001216888,
-0.024398300796747208,
-0.016689853742718697,
0.14105604588985443,
-0.06910760700702667,
-0.11677005887031555,
0.07830281555652618,
0.02540556900203228,
0.002848264994099736,
0.06851313263177872,
-0.041602373123168945,
0.052866630256175995,
0.024555793032050133,
0.10136744379997253,
0.0061854771338403225,
0.10787802934646606,
0.07004367560148239,
-0.016659444198012352,
-0.04978441819548607,
-0.020780891180038452,
-0.009774340316653252,
0.07336296141147614,
0.1418394148349762,
0.0032463171519339085,
-0.05816251039505005,
-0.009463062509894371,
0.16792479157447815,
-0.045070894062519073,
0.056803930550813675,
-0.1258484572172165,
0.21432295441627502,
0.12999671697616577,
-0.030002988874912262,
0.0207501407712698,
-0.11246722936630249,
0.006968408357352018,
0.23003466427326202,
0.058469533920288086,
-0.0010414605494588614,
0.008940801955759525,
-0.03227822855114937,
0.02309020794928074,
0.08751356601715088,
0.11094643175601959,
0.04124874621629715,
-0.020651916041970253,
0.0008504082798026502,
0.04760247468948364,
-0.004431856796145439,
-0.04790312424302101,
-0.10207472741603851,
0.15808285772800446,
0.017942020669579506,
0.01925653964281082,
-0.09233993291854858,
0.0008939193212427199,
-0.032643262296915054,
-0.2666781544685364,
-0.05384962633252144,
-0.07007424533367157,
-0.10108444094657898,
-0.041177183389663696,
-0.016785433515906334,
0.003446441376581788,
0.030712325125932693,
0.06179274618625641,
0.10307818651199341,
0.14595414698123932,
0.06806765496730804,
-0.06309103220701218,
-0.04795914888381958,
0.03166257590055466,
-0.11272991448640823,
0.22510962188243866,
0.04993361234664917,
-0.07115025818347931,
0.041215501725673676,
-0.03441131114959717,
-0.12279623001813889,
0.031357262283563614,
-0.010224918834865093,
0.016604730859398842,
0.03194818273186684,
0.1693350076675415,
0.014974129386246204,
0.0635882019996643,
0.007078571245074272,
-0.024684833362698555,
0.0315474309027195,
-0.1048947274684906,
-0.07692208886146545,
-0.08277168869972229,
0.039277609437704086,
-0.08891142904758453,
0.15048107504844666,
0.19060704112052917,
0.009183920919895172,
0.04283634200692177,
-0.05577020347118378,
-0.02020995318889618,
0.018190644681453705,
0.07658883184194565,
-0.051202863454818726,
-0.11636935919523239,
-0.014506913721561432,
-0.0355236791074276,
0.05547357723116875,
-0.20404230058193207,
-0.045078348368406296,
0.015865439549088478,
-0.06467531621456146,
-0.04243185371160507,
0.12612606585025787,
-0.006646856665611267,
-0.026410004124045372,
-0.020946957170963287,
-0.12880417704582214,
0.03390850871801376,
0.06802739948034286,
-0.1582152247428894,
-0.06256205588579178
] |
null | null | transformers |
# distilroberta-base-finetuned-wikitext2
This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on an unspecified dataset (the Trainer recorded it as `None`).
It achieves the following results on the evaluation set:
- Loss: 2.2005 (as a masked-LM cross-entropy loss, this corresponds to a perplexity of exp(2.2005) ≈ 9.0)
## Model description
More information needed
## Intended uses & limitations
More information needed
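No specific intended uses are documented. As a minimal inference sketch (assuming the checkpoint is published under the repository id listed in this card's metadata, `Roy029/distilroberta-base-finetuned-wikitext2`), the model can be exercised for masked-token prediction with the `fill-mask` pipeline:

    from transformers import pipeline

    # Load the fine-tuned checkpoint as a masked-language-modeling pipeline.
    fill_mask = pipeline("fill-mask", model="Roy029/distilroberta-base-finetuned-wikitext2")

    # RoBERTa-style checkpoints use <mask> as the mask token.
    for prediction in fill_mask("The capital of France is <mask>."):
        print(prediction["token_str"], prediction["score"])

The example sentence is arbitrary; any input containing exactly one mask token works.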
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
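The hyperparameter list above maps directly onto the `TrainingArguments` API from the transformers library. The following is a hedged reconstruction, not the original training script; the dataset and `Trainer` setup are omitted, and `output_dir` is a placeholder:

    from transformers import TrainingArguments

    # Reconstruction of the configuration listed above (Transformers 4.12-era API).
    training_args = TrainingArguments(
        output_dir="distilroberta-base-finetuned-wikitext2",  # placeholder
        learning_rate=2e-05,
        per_device_train_batch_size=8,
        per_device_eval_batch_size=8,
        seed=42,
        adam_beta1=0.9,
        adam_beta2=0.999,
        adam_epsilon=1e-08,
        lr_scheduler_type="linear",
        num_train_epochs=3.0,
    )

Note that `lr_scheduler_type="linear"` and the Adam betas shown here are also the library defaults, so the run appears to have used a mostly default configuration.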
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 58 | 2.2650 |
| No log | 2.0 | 116 | 2.2408 |
| No log | 3.0 | 174 | 2.1696 |
### Framework versions
- Transformers 4.12.3
- Pytorch 1.9.0+cu111
- Datasets 1.15.1
- Tokenizers 0.10.3
| {"license": "apache-2.0", "tags": ["generated_from_trainer"], "model-index": [{"name": "distilroberta-base-finetuned-wikitext2", "results": []}]} | fill-mask | Roy029/distilroberta-base-finetuned-wikitext2 | [
"transformers",
"pytorch",
"tensorboard",
"roberta",
"fill-mask",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #roberta #fill-mask #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
| distilroberta-base-finetuned-wikitext2
======================================
This model is a fine-tuned version of distilroberta-base on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 2.2005
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.12.3
* Pytorch 1.9.0+cu111
* Datasets 1.15.1
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #roberta #fill-mask #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] | [
56,
98,
4,
34
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #roberta #fill-mask #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] | [
-0.11075592041015625,
0.059391822665929794,
-0.0023591185454279184,
0.12596988677978516,
0.1646549552679062,
0.025098970159888268,
0.12100282311439514,
0.11386652290821075,
-0.10298281908035278,
0.023443469777703285,
0.1337265968322754,
0.17026154696941376,
0.01145606953650713,
0.13295583426952362,
-0.027879735454916954,
-0.23931725323200226,
-0.009239284321665764,
0.03975997492671013,
-0.10814931988716125,
0.1396605372428894,
0.08913110941648483,
-0.13431686162948608,
0.07287351042032242,
0.0140312435105443,
-0.21841733157634735,
0.010047892108559608,
0.02294311299920082,
-0.06620075553655624,
0.1509304791688919,
0.005636443383991718,
0.14089447259902954,
-0.006321447901427746,
0.08235185593366623,
-0.15547668933868408,
0.016028890386223793,
0.05537404865026474,
0.005440525244921446,
0.08382658660411835,
0.04251602664589882,
0.0032209022901952267,
0.09950659424066544,
-0.0886029526591301,
0.05514038726687431,
0.020223550498485565,
-0.12614959478378296,
-0.2414097785949707,
-0.0838465541601181,
0.017768261954188347,
0.05836067348718643,
0.10932207852602005,
0.006389291025698185,
0.1504017561674118,
-0.0927651897072792,
0.08913489431142807,
0.24690987169742584,
-0.2799496054649353,
-0.07027678936719894,
0.021685605868697166,
0.019397305324673653,
0.04366970434784889,
-0.10053696483373642,
-0.017358504235744476,
0.043131500482559204,
0.050617773085832596,
0.14591068029403687,
-0.036349743604660034,
-0.10705140978097916,
0.011011968366801739,
-0.13883577287197113,
-0.03691354766488075,
0.0997094139456749,
0.02718651108443737,
-0.03258824720978737,
-0.02874274179339409,
-0.07034582644701004,
-0.16299280524253845,
-0.04531489312648773,
-0.007222933229058981,
0.04556354507803917,
-0.043365105986595154,
-0.08534856885671616,
0.0010310541838407516,
-0.1022174209356308,
-0.07258714735507965,
-0.07765644043684006,
0.16741447150707245,
0.040304649621248245,
0.027166426181793213,
-0.0365925095975399,
0.10121127963066101,
-0.011341206729412079,
-0.14071336388587952,
0.03128001466393471,
0.035757824778556824,
-0.009790911339223385,
-0.027247142046689987,
-0.0694223940372467,
-0.08240554481744766,
0.020135361701250076,
0.12683230638504028,
-0.04802205413579941,
0.039983730763196945,
0.05080324411392212,
0.054074808955192566,
-0.11506281048059464,
0.18974041938781738,
-0.05168084055185318,
-0.01564129814505577,
0.0182297732681036,
0.043443359434604645,
0.028742404654622078,
-0.005524312146008015,
-0.10789046436548233,
-0.0005218333099037409,
0.0902329832315445,
0.017715413123369217,
-0.05117755010724068,
0.0580768845975399,
-0.05804285407066345,
-0.011970805004239082,
0.009533744305372238,
-0.09738359600305557,
0.024190746247768402,
-0.015035557560622692,
-0.07562074810266495,
-0.024512698873877525,
0.042633067816495895,
0.007459450047463179,
-0.009764814749360085,
0.12055640667676926,
-0.09006845951080322,
0.03607708215713501,
-0.11035343259572983,
-0.1088331863284111,
0.008436225354671478,
-0.08635284006595612,
0.022871147841215134,
-0.10336873680353165,
-0.16141554713249207,
0.00019315618555992842,
0.07017739117145538,
-0.023366188630461693,
-0.050960998982191086,
-0.02122453972697258,
-0.07522395998239517,
0.005025370512157679,
-0.012260876595973969,
0.16828905045986176,
-0.05686800926923752,
0.11314475536346436,
0.051042817533016205,
0.08586300164461136,
-0.04902328923344612,
0.05333510786294937,
-0.09488353878259659,
0.008283298462629318,
-0.2015516310930252,
0.0176264438778162,
-0.045330505818128586,
0.06694331020116806,
-0.08562944084405899,
-0.11323821544647217,
0.002730285981670022,
-0.008093513548374176,
0.08681315183639526,
0.0937856137752533,
-0.16395555436611176,
-0.07543649524450302,
0.16781412065029144,
-0.06533161550760269,
-0.10524439066648483,
0.11905679106712341,
-0.051187288016080856,
0.03538006171584129,
0.05315038189291954,
0.12918266654014587,
0.06397974491119385,
-0.1030612513422966,
0.03678072243928909,
-0.004690846428275108,
0.036707766354084015,
-0.08401409536600113,
0.06835054606199265,
-0.00673590786755085,
0.001339729642495513,
0.03214285522699356,
-0.030001040548086166,
0.07266354560852051,
-0.09649331867694855,
-0.10287924855947495,
-0.043665073812007904,
-0.10962827503681183,
0.06657309085130692,
0.07202696800231934,
0.0785435289144516,
-0.09808439016342163,
-0.08575323224067688,
0.033429671078920364,
0.07517992705106735,
-0.04652374982833862,
0.031475797295570374,
-0.06059790775179863,
0.07022742927074432,
-0.06249500438570976,
-0.02765773795545101,
-0.1904604434967041,
-0.01894688792526722,
0.0001715736580081284,
-0.023412121459841728,
0.01732122339308262,
0.007474854588508606,
0.0838008001446724,
0.06616417318582535,
-0.053789060562849045,
-0.014642207883298397,
-0.04588969051837921,
-0.009653499349951744,
-0.1265760362148285,
-0.19490507245063782,
-0.03958648070693016,
-0.019829701632261276,
0.12091834098100662,
-0.15676090121269226,
0.028011301532387733,
-0.05768739432096481,
0.06617379188537598,
0.00419363658875227,
-0.010764489881694317,
-0.04522038996219635,
0.09006479382514954,
-0.018161319196224213,
-0.054569587111473083,
0.07013953477144241,
-0.0014351987047120929,
-0.08170834928750992,
-0.04537196457386017,
-0.08887611329555511,
0.1876198947429657,
0.13766004145145416,
-0.11264567822217941,
-0.08542319387197495,
0.04125916585326195,
-0.06718042492866516,
-0.03429751843214035,
-0.043119072914123535,
0.047773346304893494,
0.16810277104377747,
-0.00426541967317462,
0.13952133059501648,
-0.06351305544376373,
-0.03863328695297241,
0.03510706126689911,
-0.038845494389534,
0.03324387967586517,
0.0953073799610138,
0.13249927759170532,
-0.038922447711229324,
0.13624632358551025,
0.16042852401733398,
-0.118741475045681,
0.11970224231481552,
-0.031262923032045364,
-0.07604118436574936,
-0.01957300305366516,
-0.025575770065188408,
0.011274325661361217,
0.12192888557910919,
-0.1299656182527542,
-0.00016946929099503905,
0.024669667705893517,
0.0006646652473136783,
0.02210739627480507,
-0.23475836217403412,
-0.049491144716739655,
0.027363568544387817,
-0.03781507536768913,
-0.019425777718424797,
-0.0027595392893999815,
0.005542979575693607,
0.100583016872406,
0.005712851416319609,
-0.08837394416332245,
0.04363921284675598,
0.008774318732321262,
-0.06357698142528534,
0.21542875468730927,
-0.0823756605386734,
-0.1626495122909546,
-0.1238359734416008,
-0.07977564632892609,
-0.03804631158709526,
0.00902383029460907,
0.05848599970340729,
-0.09561562538146973,
-0.036153845489025116,
-0.041388653218746185,
0.014500632882118225,
0.011821066960692406,
0.05182819440960884,
0.006562741938978434,
-0.005451678764075041,
0.08827024698257446,
-0.10838513821363449,
-0.00789128802716732,
-0.05013751983642578,
-0.06265682727098465,
0.060618165880441666,
0.06155878305435181,
0.1261897087097168,
0.15288271009922028,
-0.01987658441066742,
0.004083753097802401,
-0.016997385770082474,
0.22271421551704407,
-0.07109357416629791,
-0.0322640985250473,
0.14485350251197815,
0.002396008465439081,
0.061690304428339005,
0.10230652987957001,
0.07434104382991791,
-0.08452378958463669,
0.007803053595125675,
0.02505202405154705,
-0.04977891966700554,
-0.20808298885822296,
-0.039710089564323425,
-0.06493967771530151,
-0.04735051095485687,
0.09117449074983597,
0.028662873432040215,
0.045534368604421616,
0.0745643898844719,
0.052118055522441864,
0.07770340889692307,
-0.06497398763895035,
0.04563520476222038,
0.07844989746809006,
0.051085226237773895,
0.1265549510717392,
-0.0426567904651165,
-0.0741422101855278,
0.024192197248339653,
-0.014074087142944336,
0.22669945657253265,
-0.0018306693527847528,
0.11543618887662888,
0.07043614983558655,
0.21006490290164948,
-0.0013567593414336443,
0.0993744283914566,
0.0037719672545790672,
-0.053884170949459076,
-0.007453163154423237,
-0.046301115304231644,
-0.036008547991514206,
0.008736061863601208,
-0.044160906225442886,
0.06873928755521774,
-0.10493221133947372,
-0.009985534474253654,
0.04244927689433098,
0.27547451853752136,
0.03270662575960159,
-0.33216145634651184,
-0.0853123590350151,
-0.014006357640028,
-0.01850457303225994,
-0.014139114879071712,
0.004117010626941919,
0.08313293755054474,
-0.09486939013004303,
0.029705584049224854,
-0.07971572875976562,
0.08514373749494553,
0.00088412722107023,
0.04087657853960991,
0.07296821475028992,
0.11401639878749847,
0.016166353598237038,
0.0689251497387886,
-0.3108062744140625,
0.2882195711135864,
-0.0013733728555962443,
0.0828886553645134,
-0.08396275341510773,
0.005943994969129562,
0.0435829721391201,
0.031499262899160385,
0.07137405127286911,
-0.015637576580047607,
-0.008945038542151451,
-0.19098052382469177,
-0.05817505717277527,
0.03137192875146866,
0.08268004655838013,
-0.014556755311787128,
0.09031751751899719,
-0.018450481817126274,
-0.007544775027781725,
0.07348170876502991,
0.010941498912870884,
-0.056566596031188965,
-0.08414043486118317,
-0.004227782133966684,
0.017257245257496834,
-0.07126462459564209,
-0.0716019868850708,
-0.11960645020008087,
-0.12574833631515503,
0.15451744198799133,
0.009373340755701065,
-0.03155655413866043,
-0.11436852067708969,
0.07905969768762589,
0.09647095203399658,
-0.08955633640289307,
0.06509952992200851,
-0.0008002538233995438,
0.06048256531357765,
0.021009191870689392,
-0.07546248286962509,
0.1080741360783577,
-0.07264616340398788,
-0.14432327449321747,
-0.06316845864057541,
0.09448690712451935,
0.03050614893436432,
0.06924314051866531,
-0.019919170066714287,
0.021991508081555367,
-0.04147689789533615,
-0.08277089148759842,
0.03283587098121643,
-0.035340745002031326,
0.0700947567820549,
0.02804173156619072,
-0.04020413011312485,
0.003696818370372057,
-0.05578117445111275,
-0.024140283465385437,
0.17642463743686676,
0.2329464554786682,
-0.10217124223709106,
0.022454923018813133,
0.03615768626332283,
-0.04966489598155022,
-0.20980621874332428,
0.03673475980758667,
0.05781490355730057,
0.016092002391815186,
0.05365399271249771,
-0.16517257690429688,
0.12891872227191925,
0.09682271629571915,
-0.01594826579093933,
0.130156010389328,
-0.34072455763816833,
-0.12664808332920074,
0.12845291197299957,
0.1613301783800125,
0.14633074402809143,
-0.14473311603069305,
-0.01486221794039011,
-0.02370155043900013,
-0.12350953370332718,
0.06955637037754059,
-0.08742959797382355,
0.12788577377796173,
-0.04074037820100784,
0.08769288659095764,
-0.001155101926997304,
-0.07687485963106155,
0.1228608787059784,
-0.0023921774700284004,
0.09310659766197205,
-0.058615561574697495,
-0.015539196319878101,
0.06334599107503891,
-0.0306397657841444,
0.006950616370886564,
-0.06929152458906174,
0.025875374674797058,
-0.04533936828374863,
-0.013013836927711964,
-0.09012842923402786,
0.05647038295865059,
-0.028095075860619545,
-0.05625060200691223,
-0.02993122860789299,
0.024158015847206116,
0.03950781002640724,
-0.01918121427297592,
0.1152147725224495,
0.03614034503698349,
0.15715456008911133,
0.09467550367116928,
0.03289256617426872,
-0.05731268599629402,
-0.10239973664283752,
-0.012024938128888607,
-0.019793642684817314,
0.06208658590912819,
-0.12722109258174896,
0.01776220090687275,
0.12738177180290222,
0.02968674525618553,
0.11597561836242676,
0.08367657661437988,
-0.03054916486144066,
0.01883469708263874,
0.07604628801345825,
-0.1607884019613266,
-0.0658203735947609,
0.01142396591603756,
-0.07340528070926666,
-0.11204340308904648,
0.0407259576022625,
0.07571486383676529,
-0.06934764236211777,
-0.009652560576796532,
-0.011262094601988792,
0.004887794144451618,
-0.07960629463195801,
0.21702025830745697,
0.06405586004257202,
0.05134085565805435,
-0.1042817234992981,
0.060024816542863846,
0.04296291992068291,
-0.07491222023963928,
-0.010070589371025562,
0.06971793621778488,
-0.07403002679347992,
-0.037838563323020935,
0.1179991140961647,
0.16169936954975128,
-0.04830535873770714,
-0.03939446061849594,
-0.15225985646247864,
-0.11422701925039291,
0.06988441199064255,
0.15974414348602295,
0.10913633555173874,
0.0015637284377589822,
-0.04942360520362854,
0.013915613293647766,
-0.11837703734636307,
0.07017442584037781,
0.047598544508218765,
0.0688275620341301,
-0.12318094074726105,
0.1665468066930771,
0.014396606013178825,
0.06334973126649857,
-0.02400023862719536,
0.033952780067920685,
-0.0932302474975586,
0.022004447877407074,
-0.1161150187253952,
-0.03357381373643875,
-0.020723991096019745,
-0.009476404637098312,
-0.010798823088407516,
-0.05508962273597717,
-0.06401972472667694,
0.024987753480672836,
-0.12223649024963379,
-0.03460026532411575,
0.0367257259786129,
0.02734186500310898,
-0.11462375521659851,
-0.0437471829354763,
0.032675210386514664,
-0.05871686711907387,
0.05346061289310455,
0.05848998948931694,
0.014478865079581738,
0.06253319978713989,
-0.14606378972530365,
-0.019269566982984543,
0.06937860697507858,
0.007409386802464724,
0.0730949118733406,
-0.08490625023841858,
-0.012165131978690624,
-0.006459797732532024,
0.06943301856517792,
0.009883926250040531,
0.07933254539966583,
-0.1495269387960434,
-0.00002059156213363167,
-0.0279270987957716,
-0.08230258524417877,
-0.06225552409887314,
0.014278537593781948,
0.08788257092237473,
0.012325983494520187,
0.19399426877498627,
-0.08700311928987503,
0.05507182702422142,
-0.20446082949638367,
0.00003098898014286533,
-0.02875133603811264,
-0.09799274802207947,
-0.11353031545877457,
-0.0450933575630188,
0.0682227835059166,
-0.05216020345687866,
0.1312253475189209,
0.01768726482987404,
0.05297762528061867,
0.02270461805164814,
-0.016995688900351524,
0.016414890065789223,
0.01628309115767479,
0.20687249302864075,
0.029169941321015358,
-0.03377198055386543,
0.07617640495300293,
0.0690094456076622,
0.09627053886651993,
0.11222558468580246,
0.2066889852285385,
0.15453758835792542,
0.03188483044505119,
0.10196051001548767,
0.020138155668973923,
-0.04790003225207329,
-0.14721857011318207,
0.02502698451280594,
-0.05029283091425896,
0.09413141757249832,
-0.01349108386784792,
0.18965695798397064,
0.07212711125612259,
-0.16132251918315887,
0.05800415202975273,
-0.045118290930986404,
-0.08315196633338928,
-0.10277950763702393,
-0.03669671714305878,
-0.0765988901257515,
-0.1250169277191162,
0.007250132039189339,
-0.08661189675331116,
0.012915375642478466,
0.1263120472431183,
-0.0011438203509896994,
-0.021475408226251602,
0.1900094747543335,
0.03196992725133896,
0.03283599019050598,
0.04097606986761093,
0.010673281736671925,
-0.0287065040320158,
-0.0768517330288887,
-0.059940993785858154,
-0.02803712710738182,
-0.013222664594650269,
0.0405457504093647,
-0.06535544991493225,
-0.08229133486747742,
0.055591925978660583,
-0.019180884584784508,
-0.1029294952750206,
0.014840863645076752,
0.016831617802381516,
0.06911002844572067,
0.048435088247060776,
0.012844223529100418,
0.025634126737713814,
-0.022671915590763092,
0.19283398985862732,
-0.08143778145313263,
-0.09840595722198486,
-0.09522035717964172,
0.2559504210948944,
0.03729220852255821,
-0.0227817315608263,
0.024476584047079086,
-0.057785216718912125,
-0.006612717639654875,
0.2615496516227722,
0.21790823340415955,
-0.08741395175457001,
-0.0014374848688021302,
0.009883097372949123,
-0.015202329494059086,
-0.04002789780497551,
0.12101148068904877,
0.14248810708522797,
0.06304525583982468,
-0.1049455776810646,
-0.047481950372457504,
-0.0657849982380867,
-0.013042607344686985,
-0.06518486887216568,
0.03687239810824394,
0.036247722804546356,
0.0037105719093233347,
-0.03848050907254219,
0.05427040159702301,
-0.05856829509139061,
-0.1056407168507576,
0.09327035397291183,
-0.20130464434623718,
-0.1678907424211502,
-0.010556294582784176,
0.1084728017449379,
0.007351717911660671,
0.07069779932498932,
-0.030122902244329453,
0.007904783822596073,
0.06556666642427444,
-0.01488908939063549,
-0.08410490304231644,
-0.09705791622400284,
0.09795858711004257,
-0.10168327391147614,
0.21128766238689423,
-0.039686743170022964,
0.06564068049192429,
0.12555210292339325,
0.07547150552272797,
-0.07179085910320282,
0.06634742021560669,
0.0388154499232769,
-0.09006781131029129,
0.028009751811623573,
0.09835560619831085,
-0.03131381422281265,
0.019577765837311745,
0.028141742572188377,
-0.10369648039340973,
0.024684282019734383,
-0.08191868662834167,
-0.028073100373148918,
-0.03561278060078621,
-0.0373862087726593,
-0.06180679053068161,
0.11776822805404663,
0.21385900676250458,
-0.021205171942710876,
0.009702547453343868,
-0.08033525943756104,
0.015850117430090904,
0.06205058470368385,
0.02003706805408001,
-0.10049272328615189,
-0.21797999739646912,
0.017590733245015144,
0.031034380197525024,
-0.031852759420871735,
-0.2368430495262146,
-0.09946068376302719,
-0.0004984226543456316,
-0.08749312162399292,
-0.08996158093214035,
0.06853914260864258,
0.07147354632616043,
0.059942472726106644,
-0.045299772173166275,
-0.07290127128362656,
-0.07978186011314392,
0.14928977191448212,
-0.1696186512708664,
-0.09430137276649475
] |
null | null | transformers |
# japanese-roberta-base-finetuned-wikitext2
This model is a fine-tuned version of [rinna/japanese-roberta-base](https://huggingface.co/rinna/japanese-roberta-base) on an unspecified dataset (the Trainer recorded it as `None`).
It achieves the following results on the evaluation set:
- Loss: 3.2302 (as a masked-LM cross-entropy loss, this corresponds to a perplexity of exp(3.2302) ≈ 25.3)
## Model description
More information needed
## Intended uses & limitations
More information needed
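No specific intended uses are documented. A minimal inference sketch follows (assuming the checkpoint is published as `Roy029/japanese-roberta-base-finetuned-wikitext2`, the repository id in this card's metadata). The rinna base model ships its own tokenizer conventions, so the sketch queries the mask token from the tokenizer rather than hardcoding it:

    from transformers import pipeline

    # Load the fine-tuned Japanese masked-LM checkpoint.
    fill_mask = pipeline(
        "fill-mask",
        model="Roy029/japanese-roberta-base-finetuned-wikitext2",
    )

    # The base tokenizer defines its own special tokens, so look the
    # mask token up instead of assuming a fixed string.
    mask = fill_mask.tokenizer.mask_token
    for prediction in fill_mask(f"日本の首都は{mask}です。"):
        print(prediction["token_str"], prediction["score"])

Any additional preprocessing documented for the upstream rinna tokenizer (for example lowercasing) would apply here as well.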
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 18 | 3.4128 |
| No log | 2.0 | 36 | 3.1374 |
| No log | 3.0 | 54 | 3.2285 |
### Framework versions
- Transformers 4.12.3
- Pytorch 1.9.0+cu111
- Datasets 1.15.1
- Tokenizers 0.10.3
| {"license": "mit", "tags": ["generated_from_trainer"], "model-index": [{"name": "japanese-roberta-base-finetuned-wikitext2", "results": []}]} | fill-mask | Roy029/japanese-roberta-base-finetuned-wikitext2 | [
"transformers",
"pytorch",
"tensorboard",
"roberta",
"fill-mask",
"generated_from_trainer",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #tensorboard #roberta #fill-mask #generated_from_trainer #license-mit #autotrain_compatible #endpoints_compatible #region-us
| japanese-roberta-base-finetuned-wikitext2
=========================================
This model is a fine-tuned version of rinna/japanese-roberta-base on the None dataset.
It achieves the following results on the evaluation set:
* Loss: 3.2302
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 8
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 3.0
### Training results
### Framework versions
* Transformers 4.12.3
* Pytorch 1.9.0+cu111
* Datasets 1.15.1
* Tokenizers 0.10.3
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] | [
"TAGS\n#transformers #pytorch #tensorboard #roberta #fill-mask #generated_from_trainer #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] | [
53,
98,
4,
34
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #roberta #fill-mask #generated_from_trainer #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] | [
-0.10848595947027206,
0.03653303533792496,
-0.0022280144039541483,
0.1270245611667633,
0.1745331585407257,
0.03148312494158745,
0.1255502998828888,
0.10141364485025406,
-0.09461109340190887,
0.026579657569527626,
0.13819600641727448,
0.17357158660888672,
0.0053909700363874435,
0.12288543581962585,
-0.03756091371178627,
-0.26495811343193054,
-0.02974976785480976,
0.0319535918533802,
-0.09693918377161026,
0.13387124240398407,
0.09081888198852539,
-0.14081019163131714,
0.07192747294902802,
0.012088057585060596,
-0.22738079726696014,
0.01526800636202097,
0.03368959203362465,
-0.06364186108112335,
0.1556365042924881,
-0.003569614142179489,
0.15815353393554688,
-0.006467198021709919,
0.09337324649095535,
-0.14410915970802307,
0.020045818760991096,
0.047114528715610504,
0.0035478253848850727,
0.08028840273618698,
0.05133618786931038,
-0.006813661195337772,
0.1103767603635788,
-0.09057920426130295,
0.06104389950633049,
0.01352347619831562,
-0.13818159699440002,
-0.2204093486070633,
-0.08041343837976456,
-0.00020883462275378406,
0.050787415355443954,
0.10345732420682907,
-0.0005554163362830877,
0.1655970811843872,
-0.09454856067895889,
0.09724755585193634,
0.24111127853393555,
-0.28430384397506714,
-0.07969412952661514,
0.028462590649724007,
0.018653348088264465,
0.04535534605383873,
-0.11184467375278473,
-0.008309380151331425,
0.04777359217405319,
0.05640067905187607,
0.13730008900165558,
-0.04298929497599602,
-0.08813182264566422,
0.01258434820920229,
-0.13779479265213013,
-0.024305373430252075,
0.07447612285614014,
0.0213174968957901,
-0.03701420873403549,
-0.022765768691897392,
-0.06146468594670296,
-0.15318410098552704,
-0.04457375779747963,
-0.005696311593055725,
0.03561471775174141,
-0.05639041215181351,
-0.10980018973350525,
-0.004031593911349773,
-0.10987650603055954,
-0.07006935030221939,
-0.07482150942087173,
0.1924152672290802,
0.0355529822409153,
0.026002129539847374,
-0.037120379507541656,
0.10074646770954132,
-0.021249476820230484,
-0.14630277454853058,
0.03969525545835495,
0.03301240876317024,
-0.01736498437821865,
-0.042354099452495575,
-0.07477091997861862,
-0.10067238658666611,
0.009792974218726158,
0.11810874938964844,
-0.04999696463346481,
0.03923410549759865,
0.06338860839605331,
0.05193982273340225,
-0.10532017797231674,
0.17980849742889404,
-0.040835171937942505,
-0.012513590045273304,
0.009872300550341606,
0.03818654641509056,
0.016111593693494797,
-0.011504591442644596,
-0.10847047716379166,
0.0038952957838773727,
0.08580350875854492,
0.009373941458761692,
-0.059959203004837036,
0.05353459343314171,
-0.057395435869693756,
-0.013212934136390686,
-0.007293723523616791,
-0.0957498773932457,
0.034980934113264084,
-0.016693638637661934,
-0.07899308204650879,
-0.03652500361204147,
0.025309870019555092,
0.011756276711821556,
-0.0022094983141869307,
0.15210461616516113,
-0.09329576790332794,
0.04628003388643265,
-0.11229488253593445,
-0.11947529017925262,
-0.0017726779915392399,
-0.09889397770166397,
0.023453373461961746,
-0.10134255886077881,
-0.15340107679367065,
-0.01058627013117075,
0.06960764527320862,
-0.029019789770245552,
-0.03499358147382736,
-0.02105790562927723,
-0.07480000704526901,
0.0018653806764632463,
-0.006201588083058596,
0.1677137017250061,
-0.05243014171719551,
0.11571432650089264,
0.04934981092810631,
0.08520546555519104,
-0.060878679156303406,
0.04898213967680931,
-0.09191994369029999,
0.0028748863842338324,
-0.20122535526752472,
0.017921147868037224,
-0.04435691982507706,
0.06911192089319229,
-0.07107210159301758,
-0.12234853208065033,
-0.0016427556984126568,
-0.006136932875961065,
0.09210490435361862,
0.09067288786172867,
-0.17720983922481537,
-0.08081096410751343,
0.16170337796211243,
-0.06042257323861122,
-0.0916178971529007,
0.1269933432340622,
-0.06672177463769913,
0.040783267468214035,
0.05568729341030121,
0.13729701936244965,
0.062322862446308136,
-0.11776222288608551,
0.04677926376461983,
-0.01974121481180191,
0.03699484094977379,
-0.06923885643482208,
0.053259894251823425,
0.00006018095155013725,
0.0053544132970273495,
0.026701664552092552,
-0.013578144833445549,
0.06356553733348846,
-0.10864101350307465,
-0.10055460780858994,
-0.03140031173825264,
-0.10818830877542496,
0.07250654697418213,
0.0785493478178978,
0.08769740164279938,
-0.11300508677959442,
-0.0742865577340126,
0.02816622331738472,
0.05737122520804405,
-0.03753983601927757,
0.023834355175495148,
-0.0556584894657135,
0.07029696553945541,
-0.05373460426926613,
-0.04047469049692154,
-0.1925889104604721,
-0.025792304426431656,
-0.0017491972539573908,
0.007579992990940809,
0.03455697000026703,
0.017918776720762253,
0.090871162712574,
0.07715654373168945,
-0.0558636337518692,
-0.006947509944438934,
-0.04434732347726822,
-0.008697915822267532,
-0.13266326487064362,
-0.1988300234079361,
-0.038778308779001236,
-0.02582191489636898,
0.11033903807401657,
-0.16954559087753296,
0.023488124832510948,
-0.06509378552436829,
0.07307387888431549,
0.009685785509645939,
-0.019382189959287643,
-0.05178065225481987,
0.09941145032644272,
-0.01238337717950344,
-0.048414040356874466,
0.0673585832118988,
-0.006280227564275265,
-0.08257895708084106,
-0.06036531180143356,
-0.09987880289554596,
0.195930615067482,
0.13028821349143982,
-0.1383679211139679,
-0.10533550381660461,
0.03286326676607132,
-0.06620754301548004,
-0.022833341732621193,
-0.05592472106218338,
0.05611960589885712,
0.1824580729007721,
-0.004983499646186829,
0.1378696709871292,
-0.057656094431877136,
-0.03719617426395416,
0.03568882867693901,
-0.04782059043645859,
0.03078637458384037,
0.09431199729442596,
0.13049371540546417,
-0.07153793424367905,
0.12487473338842392,
0.15162207186222076,
-0.12159208953380585,
0.13994574546813965,
-0.020922496914863586,
-0.06898827850818634,
-0.02648100256919861,
-0.02595541439950466,
0.012687887996435165,
0.12873463332653046,
-0.12073169648647308,
-0.012482400052249432,
0.015453891828656197,
-0.0033935564570128918,
0.018292708322405815,
-0.2392539381980896,
-0.06183888018131256,
0.030564799904823303,
-0.01908005028963089,
-0.026748526841402054,
0.00008429399895248935,
0.010319809429347515,
0.10967085510492325,
0.00023312406847253442,
-0.09326734393835068,
0.033314116299152374,
0.005169041454792023,
-0.05972573533654213,
0.21767368912696838,
-0.07050799578428268,
-0.13553676009178162,
-0.11830145120620728,
-0.07903628051280975,
-0.04188460111618042,
0.015505947172641754,
0.04904933273792267,
-0.09568002820014954,
-0.029766296967864037,
-0.03531123697757721,
0.007779020816087723,
0.006605223752558231,
0.05416109785437584,
-0.01097552478313446,
-0.009623725898563862,
0.07278046756982803,
-0.10634037107229233,
-0.012799747288227081,
-0.06294012814760208,
-0.06752818077802658,
0.07034019380807877,
0.06493307650089264,
0.1289888322353363,
0.1478859782218933,
-0.03461248427629471,
0.0025877479929476976,
-0.021026208996772766,
0.24006037414073944,
-0.07669124007225037,
-0.035461489111185074,
0.1253465861082077,
-0.008303540758788586,
0.06371048837900162,
0.10604145377874374,
0.08619572967290878,
-0.08858096599578857,
0.013226008974015713,
0.028303969651460648,
-0.05336664617061615,
-0.19976459443569183,
-0.03739567846059799,
-0.0672888308763504,
-0.05653320997953415,
0.09743703156709671,
0.025575939565896988,
0.04246498644351959,
0.07736729085445404,
0.055023692548274994,
0.07628544420003891,
-0.06798356026411057,
0.04844149574637413,
0.06650042533874512,
0.04881955683231354,
0.12923212349414825,
-0.035791900008916855,
-0.09548638761043549,
0.019213488325476646,
-0.024610230699181557,
0.2351941615343094,
0.004029191564768553,
0.09942054003477097,
0.07173176854848862,
0.20004522800445557,
0.0025179421063512564,
0.10428119450807571,
0.008915064856410027,
-0.06387558579444885,
-0.004014335572719574,
-0.04308363422751427,
-0.026619840413331985,
0.011915015988051891,
-0.024104006588459015,
0.05630261451005936,
-0.11060117185115814,
-0.01534368097782135,
0.03780824691057205,
0.25124120712280273,
0.03893903270363808,
-0.3265010118484497,
-0.0774192288517952,
-0.013210652396082878,
-0.016422422602772713,
-0.010467899963259697,
-0.0008247477817349136,
0.10209774225950241,
-0.09290636330842972,
0.03494646027684212,
-0.08078233897686005,
0.0822097659111023,
0.0054243882186710835,
0.043083567172288895,
0.0652727335691452,
0.11223335564136505,
-0.0010519494535401464,
0.06288063526153564,
-0.3090798258781433,
0.30511242151260376,
0.0021186748053878546,
0.09194228053092957,
-0.0876651406288147,
-0.006035724189132452,
0.04658980667591095,
0.015614217147231102,
0.06672089546918869,
-0.016632216051220894,
-0.013216604478657246,
-0.2070448249578476,
-0.031641844660043716,
0.027353277429938316,
0.11281021684408188,
-0.012166653759777546,
0.09571681916713715,
-0.010635008104145527,
-0.004043614026159048,
0.07316862791776657,
0.004989865701645613,
-0.06715519726276398,
-0.07790548354387283,
-0.008339998312294483,
0.011746607720851898,
-0.09315168112516403,
-0.06343899667263031,
-0.12151306867599487,
-0.1315310299396515,
0.14859987795352936,
0.027106914669275284,
-0.021890921518206596,
-0.12230973690748215,
0.09270957857370377,
0.09137923270463943,
-0.08802326023578644,
0.058826565742492676,
0.0062048304826021194,
0.051928114145994186,
0.012511751614511013,
-0.0611727349460125,
0.10961586236953735,
-0.06678549945354462,
-0.14268812537193298,
-0.070889912545681,
0.08838621526956558,
0.03876117616891861,
0.07198331505060196,
-0.012502501718699932,
0.029757549986243248,
-0.04059851914644241,
-0.0881422758102417,
0.05198109894990921,
-0.07201217114925385,
0.0686180591583252,
0.028531890362501144,
-0.03328091278672218,
0.005606821738183498,
-0.05714200437068939,
-0.01004311628639698,
0.17884047329425812,
0.24646277725696564,
-0.10018536448478699,
0.010785781778395176,
0.02567571960389614,
-0.046528320759534836,
-0.209248349070549,
0.05263780802488327,
0.06676742434501648,
0.01916472800076008,
0.0696771964430809,
-0.16019943356513977,
0.12430636584758759,
0.08713237941265106,
-0.013428501784801483,
0.13277466595172882,
-0.31465646624565125,
-0.13290095329284668,
0.1244523823261261,
0.1771090179681778,
0.15322710573673248,
-0.14154796302318573,
-0.011718358844518661,
-0.018337590619921684,
-0.11073866486549377,
0.06948027014732361,
-0.07643521577119827,
0.13319560885429382,
-0.03355929255485535,
0.1082710549235344,
0.004858269821852446,
-0.07820122689008713,
0.11624754965305328,
-0.0009760063840076327,
0.10505619645118713,
-0.061499424278736115,
-0.040998414158821106,
0.058656323701143265,
-0.020933758467435837,
-0.004392366856336594,
-0.05248207598924637,
0.01758596859872341,
-0.042349688708782196,
-0.01343619916588068,
-0.08986572176218033,
0.06567138433456421,
-0.029514731839299202,
-0.06473702192306519,
-0.023128163069486618,
0.030713316053152084,
0.02752572111785412,
-0.020415134727954865,
0.11275804787874222,
0.025250554084777832,
0.18126444518566132,
0.08215595036745071,
0.047031376510858536,
-0.051232628524303436,
-0.07645954191684723,
-0.006032558158040047,
-0.017036674544215202,
0.05915980413556099,
-0.10501313209533691,
0.009067535400390625,
0.13349616527557373,
0.04173079505562782,
0.11801177263259888,
0.08912919461727142,
-0.030764596536755562,
0.02945167012512684,
0.08433424681425095,
-0.16083262860774994,
-0.06406597793102264,
0.006439040414988995,
-0.08193743973970413,
-0.1030343621969223,
0.04157598316669464,
0.07789646834135056,
-0.06778211891651154,
-0.00989378709346056,
-0.018875177949666977,
-0.008112948387861252,
-0.08051056414842606,
0.21983417868614197,
0.07506337761878967,
0.049412474036216736,
-0.10044758766889572,
0.04374025762081146,
0.044325437396764755,
-0.07074055820703506,
-0.010682156309485435,
0.07282175868749619,
-0.06309442222118378,
-0.03338530287146568,
0.1106751412153244,
0.18050122261047363,
-0.05464302748441696,
-0.02143211103975773,
-0.15117111802101135,
-0.11195098608732224,
0.06041921675205231,
0.1698969304561615,
0.10464301705360413,
-0.010189441032707691,
-0.05850750580430031,
0.021970806643366814,
-0.12612059712409973,
0.06961243599653244,
0.05913683399558067,
0.07376334816217422,
-0.12010249495506287,
0.18648375570774078,
0.0035583137068897486,
0.0595058910548687,
-0.028674719855189323,
0.031209249049425125,
-0.10387156158685684,
0.02276250533759594,
-0.10625499486923218,
-0.04399348050355911,
-0.019010493531823158,
-0.011584447696805,
-0.009343801066279411,
-0.06188872456550598,
-0.06466087698936462,
0.016487974673509598,
-0.13047285377979279,
-0.033435918390750885,
0.047110386192798615,
0.020561739802360535,
-0.11408444494009018,
-0.04882451146841049,
0.030268631875514984,
-0.05227728188037872,
0.045387446880340576,
0.05817278474569321,
0.013688912615180016,
0.06469617038965225,
-0.1633141040802002,
-0.019655009731650352,
0.06480196863412857,
-0.000496409076731652,
0.08021476119756699,
-0.06323074549436569,
-0.010536130517721176,
-0.013324987143278122,
0.08661568909883499,
0.016882145777344704,
0.07338467985391617,
-0.1469738781452179,
0.008392942138016224,
-0.023095857352018356,
-0.10136678069829941,
-0.06531529128551483,
0.014513522386550903,
0.07205506414175034,
0.008890769444406033,
0.18937085568904877,
-0.09354183077812195,
0.06625472009181976,
-0.2073175311088562,
-0.004771817941218615,
-0.021133828908205032,
-0.09781559556722641,
-0.10870376229286194,
-0.0527048259973526,
0.07683795690536499,
-0.05138256400823593,
0.1239510178565979,
0.03850598260760307,
0.050515905022621155,
0.02161403000354767,
-0.008641665801405907,
0.01959078013896942,
0.014377337880432606,
0.19832521677017212,
0.030356310307979584,
-0.045957569032907486,
0.06163039803504944,
0.07671274989843369,
0.09565094113349915,
0.11559204012155533,
0.22100314497947693,
0.14631065726280212,
0.02225339598953724,
0.08768804371356964,
0.026805216446518898,
-0.0451338104903698,
-0.1461278349161148,
0.008559443056583405,
-0.03763392195105553,
0.0851994976401329,
-0.018792692571878433,
0.1692407876253128,
0.073019839823246,
-0.1686137169599533,
0.054469551891088486,
-0.048402391374111176,
-0.09034444391727448,
-0.10735232383012772,
-0.04133851081132889,
-0.07564952969551086,
-0.10872366279363632,
0.018794633448123932,
-0.08688243478536606,
0.017354434356093407,
0.11630287766456604,
0.002158575691282749,
-0.019998060539364815,
0.20016075670719147,
0.02225058525800705,
0.04350235313177109,
0.053783562034368515,
0.013211440294981003,
-0.017478231340646744,
-0.07334496825933456,
-0.0563616044819355,
-0.0441322922706604,
-0.02428632415831089,
0.032915808260440826,
-0.07741723209619522,
-0.0937238559126854,
0.04312853887677193,
-0.013950365595519543,
-0.11093160510063171,
0.016969887539744377,
0.02028387412428856,
0.07429742068052292,
0.05060182511806488,
0.008183274418115616,
0.026815414428710938,
-0.02760021574795246,
0.1931433081626892,
-0.07934685796499252,
-0.09795356541872025,
-0.08867348730564117,
0.2706475853919983,
0.03586370497941971,
-0.01650356315076351,
0.023378591984510422,
-0.0639137551188469,
0.01179987471550703,
0.24585267901420593,
0.2298637479543686,
-0.09936363995075226,
0.009912357665598392,
0.009778299368917942,
-0.014408372342586517,
-0.03438412398099899,
0.1283799111843109,
0.11982981860637665,
0.07595685124397278,
-0.10785698890686035,
-0.042217183858156204,
-0.06842963397502899,
-0.011960547417402267,
-0.06378781795501709,
0.03678322210907936,
0.05616088584065437,
0.011210068129003048,
-0.04800837114453316,
0.06754013150930405,
-0.0675729513168335,
-0.10640853643417358,
0.10591094195842743,
-0.20741130411624908,
-0.16462501883506775,
-0.007532238494604826,
0.1038239374756813,
0.00830613449215889,
0.0836406797170639,
-0.035779524594545364,
0.004486409015953541,
0.04150412231683731,
-0.015698615461587906,
-0.07701553404331207,
-0.08992185443639755,
0.1026521548628807,
-0.11199440807104111,
0.20577701926231384,
-0.04680856689810753,
0.07775622606277466,
0.12568460404872894,
0.075319804251194,
-0.05003233253955841,
0.0694064050912857,
0.04624935984611511,
-0.09992179274559021,
0.026482515037059784,
0.12028124928474426,
-0.03125183284282684,
0.014157754369080067,
0.03261183947324753,
-0.12002044171094894,
0.03886641189455986,
-0.08260639756917953,
-0.028754478320479393,
-0.03757714107632637,
-0.026655856519937515,
-0.0596124529838562,
0.11841718852519989,
0.2185039222240448,
-0.012221761047840118,
0.017117295414209366,
-0.0771317407488823,
0.01097036525607109,
0.07948673516511917,
0.02592538669705391,
-0.11352944374084473,
-0.2299102395772934,
0.012065952643752098,
0.05036677047610283,
-0.0409754142165184,
-0.25116145610809326,
-0.09660971164703369,
-0.0029909268487244844,
-0.0817129909992218,
-0.08585748821496964,
0.06715701520442963,
0.06986071169376373,
0.05866455286741257,
-0.04204276576638222,
-0.0903710201382637,
-0.07688285410404205,
0.14929617941379547,
-0.16604721546173096,
-0.08847561478614807
] |
null | null | transformers |
# Almas DialoGPT Model | {"tags": ["conversational"]} | text-generation | Royce23/DialoGPT-small-almas | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Almas DialoGPT Model | [
"# Almas DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Almas DialoGPT Model"
] | [
51,
8
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Almas DialoGPT Model"
] | [
-0.02406315878033638,
0.04289577901363373,
-0.006148059386759996,
0.0036817623768001795,
0.13714678585529327,
-0.013328074477612972,
0.1773146092891693,
0.11843129247426987,
0.008035261183977127,
-0.02999185398221016,
0.10796524584293365,
0.19151096045970917,
0.01714915782213211,
0.08958324044942856,
-0.09555978327989578,
-0.27546456456184387,
0.04071589186787605,
0.0461120568215847,
0.003093109233304858,
0.11168287694454193,
0.095622219145298,
-0.046295393258333206,
0.08301906287670135,
-0.013838401064276695,
-0.12201908230781555,
0.041418641805648804,
0.004611850716173649,
-0.1525758057832718,
0.12380873411893845,
0.09635548293590546,
0.022955099120736122,
0.031444650143384933,
-0.05836215615272522,
-0.14389342069625854,
0.029858270660042763,
-0.03718644753098488,
-0.03617558628320694,
0.04749004542827606,
0.028431206941604614,
-0.05841558426618576,
0.12963852286338806,
0.07925446331501007,
-0.02365320548415184,
0.022128762677311897,
-0.16129055619239807,
0.022074250504374504,
0.0237625353038311,
0.030660003423690796,
0.08311799168586731,
0.08935960382223129,
-0.04337184131145477,
0.10501905530691147,
-0.045027513056993484,
0.09067234396934509,
0.05541690066456795,
-0.36578840017318726,
-0.02099505066871643,
0.06554791331291199,
0.003469104878604412,
0.08376151323318481,
-0.025862183421850204,
0.09845199435949326,
0.016980616375803947,
0.007391471415758133,
-0.014263481833040714,
-0.07220829278230667,
-0.0952349454164505,
0.024304378777742386,
-0.06916002184152603,
-0.04387976601719856,
0.25392934679985046,
-0.04879789426922798,
0.061418306082487106,
-0.09356465935707092,
-0.09294863790273666,
0.00301210000179708,
-0.04045095294713974,
-0.05951133370399475,
-0.06987147778272629,
0.08475381135940552,
0.01589624397456646,
-0.08017364144325256,
-0.1381445974111557,
0.00653846887871623,
-0.1906445473432541,
0.16106665134429932,
0.034079570323228836,
0.038622934371232986,
-0.1966375708580017,
0.08835262805223465,
-0.0021827779710292816,
-0.11228229850530624,
0.040317900478839874,
-0.07988539338111877,
0.010200479999184608,
0.02435402013361454,
-0.01611865498125553,
-0.09370093792676926,
0.07601486891508102,
0.059000011533498764,
-0.02962399646639824,
0.011410687118768692,
-0.038614675402641296,
0.05177953839302063,
0.0628795176744461,
0.10883484780788422,
0.028301304206252098,
-0.0898941159248352,
0.040303751826286316,
-0.08618991822004318,
0.00356153747998178,
-0.04778668284416199,
-0.18169906735420227,
-0.0433133989572525,
0.045435283333063126,
0.05783067271113396,
0.02958892099559307,
0.1059703379869461,
0.0019217000808566809,
-0.03971892595291138,
0.08290029317140579,
-0.02877209335565567,
-0.024594800546765327,
-0.010028408840298653,
-0.009661098010838032,
0.13686805963516235,
-0.007535507902503014,
0.04840861260890961,
-0.13283467292785645,
0.005830816458910704,
-0.033258479088544846,
-0.034283168613910675,
-0.023430779576301575,
-0.02785423770546913,
-0.0038201455026865005,
-0.0019577329512685537,
0.016547108069062233,
-0.1621839851140976,
-0.16986000537872314,
-0.01791539043188095,
-0.020260296761989594,
-0.07114621996879578,
-0.09219013154506683,
-0.10308463871479034,
-0.005629761144518852,
0.05367329716682434,
-0.07054769992828369,
-0.020342765375971794,
-0.050654418766498566,
0.08114983141422272,
-0.006653445307165384,
0.08905776590108871,
-0.06941909343004227,
0.08214530348777771,
-0.12152718752622604,
-0.023905951529741287,
-0.10826847702264786,
0.1539115011692047,
0.019243929535150528,
0.07740610837936401,
-0.03469929099082947,
0.00015905711916275322,
-0.12784214317798615,
0.07417934387922287,
-0.028166279196739197,
0.22977612912654877,
-0.05618024989962578,
-0.09880269318819046,
0.2688249945640564,
-0.04803992807865143,
-0.11944155395030975,
0.14760464429855347,
0.003322782926261425,
0.0772951990365982,
0.14167536795139313,
0.22313378751277924,
-0.05209415778517723,
0.057466067373752594,
0.07945409417152405,
0.09365338832139969,
-0.048912327736616135,
-0.018434351310133934,
0.02237769402563572,
-0.005509837064892054,
-0.09337326884269714,
0.03298090398311615,
0.10910116136074066,
0.06004423648118973,
-0.03642898052930832,
-0.02906535193324089,
0.006038676016032696,
0.010331794619560242,
0.01628425158560276,
-0.02611706033349037,
0.1496531218290329,
-0.045265473425388336,
-0.06544036418199539,
-0.05796589329838753,
-0.006805398967117071,
-0.04678342863917351,
0.019542237743735313,
-0.0717054158449173,
0.06720910221338272,
-0.04728873446583748,
0.0715155079960823,
-0.12008866667747498,
-0.019558899104595184,
-0.00967391487210989,
0.16544502973556519,
0.0565648153424263,
0.0739232525229454,
0.04311926290392876,
-0.03814678266644478,
-0.023640552535653114,
0.02932518534362316,
0.19203658401966095,
-0.028297295793890953,
-0.09548693150281906,
-0.11386638879776001,
0.10690039396286011,
-0.05577373132109642,
0.040454499423503876,
-0.07477861642837524,
0.0022711639758199453,
-0.04015479236841202,
0.0851081907749176,
-0.031568948179483414,
0.04480227828025818,
0.010846107266843319,
-0.0033287955448031425,
-0.05552130192518234,
0.0243680477142334,
0.08951380848884583,
-0.001581535441800952,
-0.08233609050512314,
0.27291426062583923,
-0.2055598646402359,
0.1841278374195099,
0.16941508650779724,
-0.2389400154352188,
0.008008097298443317,
-0.14466117322444916,
-0.02968536876142025,
-0.004029536619782448,
0.08545677363872528,
-0.03243904188275337,
0.27059921622276306,
-0.03466550260782242,
0.18629954755306244,
-0.05554619058966637,
-0.025080064311623573,
-0.032876864075660706,
-0.06106138601899147,
0.012868442572653294,
0.09635577350854874,
0.06950180232524872,
-0.22999489307403564,
0.17687514424324036,
0.10515287518501282,
0.06287208944559097,
0.1733144372701645,
0.03402223438024521,
-0.021766148507595062,
0.05843786522746086,
-0.002652907744050026,
-0.05186647176742554,
-0.0786428153514862,
-0.2704620361328125,
-0.03440213203430176,
0.07352118194103241,
0.01843257062137127,
0.12019884586334229,
-0.09681505709886551,
-0.0211404450237751,
0.018199153244495392,
0.0011002994142472744,
0.0733611211180687,
0.11372940242290497,
0.021630343049764633,
0.13378791511058807,
-0.01024522352963686,
-0.04727064445614815,
0.05028604716062546,
0.010846429504454136,
-0.09093733876943588,
0.18895912170410156,
-0.09947222471237183,
-0.35639292001724243,
-0.11874635517597198,
-0.12239526212215424,
-0.015565121546387672,
0.07816541939973831,
0.10944206267595291,
-0.10457408428192139,
-0.03016531653702259,
-0.012073740363121033,
0.11237771064043045,
-0.07035105675458908,
0.006232911720871925,
-0.014334089122712612,
-0.009012730792164803,
-0.1235024556517601,
-0.06386541575193405,
-0.06935060769319534,
-0.02782864309847355,
-0.06200975552201271,
0.11087056994438171,
-0.14244556427001953,
0.025578336790204048,
0.21113045513629913,
0.04350212588906288,
0.02860686182975769,
-0.04974288493394852,
0.17649103701114655,
-0.10505110025405884,
0.015435845591127872,
0.1568150669336319,
-0.038319509476423264,
0.05582580715417862,
0.1296100616455078,
-0.003949938807636499,
-0.07651273906230927,
0.029396118596196175,
-0.052831269800662994,
-0.0749514177441597,
-0.24158823490142822,
-0.12224309146404266,
-0.10430017113685608,
0.06492110341787338,
0.0203081201761961,
0.04733588546514511,
0.2040797919034958,
0.08496380597352982,
-0.04151691123843193,
0.03364047408103943,
0.02563447691500187,
0.07602914422750473,
0.3236255347728729,
-0.04699498042464256,
0.1399323046207428,
-0.03303471952676773,
-0.161574125289917,
0.07308349758386612,
0.08129151910543442,
0.04616733267903328,
0.06167057529091835,
0.0713505819439888,
0.0017217104323208332,
-0.005019308999180794,
0.12090907245874405,
0.03608296066522598,
0.016935979947447777,
-0.034789253026247025,
-0.05721824988722801,
-0.04650795832276344,
-0.05709591507911682,
0.06545635312795639,
0.04876265674829483,
-0.14585204422473907,
-0.03238937631249428,
-0.016863159835338593,
0.06218286231160164,
0.05200612172484398,
0.05522700026631355,
-0.198933407664299,
-0.03874417394399643,
0.08081718534231186,
-0.05336279049515724,
-0.1145368367433548,
0.09013117849826813,
0.023053310811519623,
-0.11447815597057343,
0.010064365342259407,
0.00024217650934588164,
0.11301054060459137,
-0.09572634100914001,
0.09166249632835388,
-0.10740745067596436,
-0.05862906202673912,
0.007237082347273827,
0.10710009932518005,
-0.2890174388885498,
0.21753545105457306,
-0.00191772123798728,
-0.054756663739681244,
-0.1037604957818985,
0.010599306784570217,
0.010808589868247509,
0.06254532933235168,
0.10821491479873657,
-0.02537686377763748,
-0.04065818712115288,
-0.004060165490955114,
-0.06715291738510132,
0.028197068721055984,
0.07438814640045166,
-0.02803676575422287,
-0.007241195999085903,
-0.03888927772641182,
0.001879034098237753,
-0.011890962719917297,
-0.05848998576402664,
0.007216821424663067,
-0.2042229026556015,
0.07004418224096298,
0.08561712503433228,
0.06230904161930084,
0.033417895436286926,
-0.030829885974526405,
-0.1148567795753479,
0.22522340714931488,
0.035716477781534195,
-0.10252035409212112,
-0.0766461119055748,
-0.005052841734141111,
0.049502402544021606,
-0.05621865764260292,
0.012953901663422585,
-0.0631772130727768,
-0.00753535982221365,
-0.052637889981269836,
-0.18153581023216248,
0.13613596558570862,
-0.10117439925670624,
-0.03763674944639206,
-0.03650053218007088,
0.21018321812152863,
-0.023071514442563057,
0.02994929812848568,
0.04981766641139984,
0.0007956468034535646,
-0.07747776806354523,
-0.06746106594800949,
0.014544802717864513,
0.024511223658919334,
0.007595180533826351,
0.020962517708539963,
-0.0288418959826231,
-0.06830704212188721,
-0.06711365282535553,
-0.04080707207322121,
0.30786412954330444,
0.1289764791727066,
-0.03742099180817604,
0.17823755741119385,
0.12207992374897003,
-0.09289427101612091,
-0.2537742555141449,
-0.12093042582273483,
-0.06835206598043442,
-0.006420646328479052,
-0.07960642874240875,
-0.18284209072589874,
0.07274198532104492,
-0.039763934910297394,
-0.02629012241959572,
0.0747438445687294,
-0.3136546313762665,
-0.09802735596895218,
0.19531889259815216,
-0.04362097382545471,
0.4150349497795105,
-0.12923146784305573,
-0.0957537293434143,
-0.04492069408297539,
-0.07940969616174698,
0.11633359640836716,
0.10334999114274979,
0.10816297680139542,
-0.01628221571445465,
0.16436822712421417,
0.0613398477435112,
0.006216045003384352,
0.09770664572715759,
0.006437384989112616,
-0.05457521229982376,
-0.0935191735625267,
-0.028492538258433342,
0.01832554303109646,
0.027732690796256065,
0.023385899141430855,
-0.054484494030475616,
0.025659354403614998,
-0.15465465188026428,
-0.06732608377933502,
-0.05803613364696503,
0.027148030698299408,
0.03659048303961754,
-0.059689074754714966,
0.007731256540864706,
-0.07375869154930115,
0.021385688334703445,
0.010365757159888744,
0.16356484591960907,
-0.10128961503505707,
0.15082071721553802,
0.11430522799491882,
0.14022161066532135,
-0.06605377793312073,
0.029692664742469788,
-0.056942347437143326,
-0.05246566981077194,
0.07860696315765381,
-0.12446743249893188,
0.026704473420977592,
0.12553617358207703,
-0.06459929794073105,
0.0664733499288559,
0.07062860578298569,
0.006154991686344147,
0.01829196885228157,
0.10043305903673172,
-0.23321834206581116,
-0.05763997137546539,
-0.08165377378463745,
0.06404976546764374,
0.10681742429733276,
0.08035106956958771,
0.2004864662885666,
-0.031197339296340942,
-0.03996918722987175,
0.010651769116520882,
0.02076268568634987,
-0.023264290764927864,
0.10396808385848999,
-0.03834722936153412,
0.008110550232231617,
-0.12956194579601288,
0.07977790385484695,
0.011768289841711521,
-0.1306663602590561,
0.05856369435787201,
0.15109138190746307,
-0.08156508207321167,
-0.11479081213474274,
-0.0674045979976654,
0.07289683073759079,
-0.14733323454856873,
-0.028798360377550125,
-0.04276265576481819,
-0.1374264508485794,
0.07402073591947556,
0.10653585940599442,
0.04041857272386551,
0.06278696656227112,
-0.09934859722852707,
-0.010769585147500038,
-0.005496779456734657,
0.014648846350610256,
0.04519728198647499,
-0.00724037317559123,
-0.04000261798501015,
0.0019899304024875164,
-0.05035131797194481,
0.09393367171287537,
-0.09736360609531403,
-0.10551310330629349,
-0.14259180426597595,
0.04220625013113022,
-0.15462136268615723,
-0.05865689739584923,
-0.09759029746055603,
-0.06658202409744263,
-0.006663451436907053,
-0.03791351616382599,
-0.03564925491809845,
-0.04805324971675873,
-0.10632168501615524,
0.048603642731904984,
-0.04138413071632385,
0.049237437546253204,
-0.052763670682907104,
0.009556575678288937,
0.04096083343029022,
-0.02309739775955677,
0.15165871381759644,
0.1655968427658081,
-0.11633695662021637,
0.06898833811283112,
-0.1543261557817459,
-0.03780866041779518,
0.11640568822622299,
0.02908855304121971,
0.03730720281600952,
0.06798844784498215,
0.02175641432404518,
0.08069691061973572,
0.03101840801537037,
0.038074709475040436,
0.06084645912051201,
-0.0881505236029625,
0.043003372848033905,
-0.0370529405772686,
-0.1285434365272522,
-0.02857220359146595,
-0.03395029902458191,
0.022963792085647583,
0.012743845582008362,
0.06869840621948242,
-0.095229372382164,
0.07250774651765823,
-0.06465040892362595,
0.039976805448532104,
0.01878521963953972,
-0.15465885400772095,
-0.027203969657421112,
-0.10308664292097092,
0.05480991303920746,
0.011431652121245861,
0.2507857084274292,
0.0462314635515213,
0.021249551326036453,
0.031936291605234146,
0.030765578150749207,
0.0829433724284172,
0.013220011256635189,
0.19956213235855103,
0.10968057066202164,
-0.06178688257932663,
-0.09522085636854172,
0.07726209610700607,
0.02618110366165638,
0.02663310617208481,
0.06418301165103912,
0.009706521406769753,
-0.005104429554194212,
0.10774930566549301,
-0.0014447267167270184,
0.021178297698497772,
-0.13997872173786163,
-0.12776567041873932,
-0.06343358010053635,
0.058442622423172,
-0.05560486391186714,
0.1374889612197876,
0.1537243127822876,
-0.017388835549354553,
0.029151001945137978,
-0.02952352724969387,
-0.047094933688640594,
-0.1770518273115158,
-0.21652689576148987,
-0.08011798560619354,
-0.13168714940547943,
0.0013798992149531841,
-0.13030663132667542,
0.03993772715330124,
0.04799511283636093,
0.08172784000635147,
-0.07847245782613754,
0.08965694904327393,
0.07498246431350708,
-0.11285099387168884,
0.08610400557518005,
-0.01814565435051918,
0.1074196919798851,
-0.06389521807432175,
0.010583958588540554,
-0.07676120102405548,
0.06962801516056061,
-0.001554175978526473,
0.038102712482213974,
-0.019323455169796944,
0.0029006467666476965,
-0.10045094788074493,
-0.07145649939775467,
-0.056903500109910965,
0.06400218605995178,
-0.011541496962308884,
0.1055571436882019,
0.014788411557674408,
-0.04924744740128517,
0.028154365718364716,
0.2524397075176239,
-0.08714660257101059,
-0.07836373895406723,
-0.04347062483429909,
0.23190560936927795,
0.006584668066352606,
0.10697271674871445,
-0.02587938867509365,
0.015415540896356106,
-0.08621867746114731,
0.2967302203178406,
0.33939921855926514,
-0.08970057964324951,
0.0017261069733649492,
0.004233607556670904,
0.04800992086529732,
0.09388762712478638,
0.104921355843544,
0.10693689435720444,
0.3160548806190491,
-0.06857595592737198,
-0.03324272483587265,
-0.03116011992096901,
-0.013260051608085632,
-0.05326545611023903,
0.08649805188179016,
0.03976370021700859,
-0.0608859620988369,
-0.014160064049065113,
0.11103091388940811,
-0.25005194544792175,
0.03393883258104324,
-0.15855833888053894,
-0.19642333686351776,
-0.07620083540678024,
0.008872924372553825,
0.08429825305938721,
0.0486912801861763,
0.07453519105911255,
-0.0013288827612996101,
-0.06568525731563568,
0.05290186032652855,
0.029754066839814186,
-0.18802498281002045,
0.059752292931079865,
0.053459532558918,
-0.12456513941287994,
-0.04677734524011612,
-0.02715912275016308,
0.0807371661067009,
0.06341966241598129,
0.06379810720682144,
-0.004763831850141287,
0.027087751775979996,
0.01739213988184929,
-0.055951036512851715,
0.054333340376615524,
0.034262146800756454,
0.04381626099348068,
-0.02169838361442089,
0.10253474116325378,
-0.13523012399673462,
0.04727870225906372,
-0.026502065360546112,
-0.044732715934515,
-0.004048808012157679,
0.01353798434138298,
-0.06650759279727936,
0.06139430031180382,
0.07302320003509521,
-0.002564256079494953,
-0.016509370878338814,
-0.018210776150226593,
-0.02350260317325592,
-0.00428761774674058,
-0.053425952792167664,
-0.08347278088331223,
-0.1738363802433014,
-0.10795006155967712,
0.07144587486982346,
0.027682868763804436,
-0.19978885352611542,
0.009441777132451534,
-0.12677527964115143,
0.06398291885852814,
-0.14424802362918854,
0.10629870742559433,
0.09056377410888672,
0.02702915482223034,
-0.010291313752532005,
-0.07017415761947632,
0.03864141181111336,
0.09835661947727203,
-0.12078193575143814,
-0.07126311212778091
] |
null | null | transformers |
# Wav2Vec2-Large-XLSR-53-Portuguese
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Portuguese using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "pt", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("Rubens/Wav2Vec2-Large-XLSR-53-Portuguese")
model = Wav2Vec2ForCTC.from_pretrained("Rubens/Wav2Vec2-Large-XLSR-53-Portuguese")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
\tspeech_array, sampling_rate = torchaudio.load(batch["path"])
\tbatch["speech"] = resampler(speech_array).squeeze().numpy()
\treturn batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
\tlogits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
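The `torch.argmax` call above is greedy CTC decoding; `processor.batch_decode` then collapses repeated ids and drops the CTC blank token to recover text. A toy sketch of that collapse step (hypothetical token ids, assuming the blank has id 0):

```python
import itertools

# Toy CTC collapse: merge consecutive repeats, then drop the blank (id 0 by assumption).
ids = [5, 5, 0, 7, 7, 7, 0, 5]
merged = [key for key, _ in itertools.groupby(ids)]  # [5, 0, 7, 0, 5]
tokens = [i for i in merged if i != 0]               # [5, 7, 5]
print(tokens)
```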
## Evaluation
The model can be evaluated as follows on the Portuguese test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "pt", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("Rubens/Wav2Vec2-Large-XLSR-53-Portuguese")
model = Wav2Vec2ForCTC.from_pretrained("Rubens/Wav2Vec2-Large-XLSR-53-Portuguese")
model.to("cuda")
chars_to_ignore_regex = '[\\,\\?\\.\\!\\-\\;\\:\\"\\“]' # TODO: adapt this list to include all special characters you removed from the data
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
\tbatch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
\tspeech_array, sampling_rate = torchaudio.load(batch["path"])
\tbatch["speech"] = resampler(speech_array).squeeze().numpy()
\treturn batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Run batched inference on the preprocessed audio and decode the predictions
def evaluate(batch):
\tinputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
\twith torch.no_grad():
\t\tlogits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
\tpred_ids = torch.argmax(logits, dim=-1)
\tbatch["pred_strings"] = processor.batch_decode(pred_ids)
\treturn batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result (wer)**: 20.41 %
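WER here is word-level edit distance: (substitutions + deletions + insertions) divided by the number of reference words. A minimal sketch using the same `wer` metric loaded in the evaluation script (toy strings, not Common Voice data):

```python
from datasets import load_metric

wer = load_metric("wer")
# One deleted word against a 3-word reference gives WER = 1/3.
print(wer.compute(predictions=["olá mundo"], references=["olá meu mundo"]))  # ~0.333
```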
## Training
The Common Voice `train` and `validation` datasets were used for training.
The script used for training can be found at: https://github.com/RubensZimbres/wav2vec2/blob/main/fine-tuning.py
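The script follows the standard XLSR fine-tuning recipe: the pretrained encoder is loaded with a fresh CTC head and the convolutional feature extractor is kept frozen. A minimal sketch of that setup, with hypothetical hyperparameters (the linked script holds the actual values):

```python
from transformers import Wav2Vec2ForCTC

# Hypothetical hyperparameters, for illustration only; see the linked script.
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-large-xlsr-53",
    attention_dropout=0.1,
    hidden_dropout=0.1,
    mask_time_prob=0.05,
    ctc_loss_reduction="mean",
)
model.freeze_feature_extractor()  # the CNN feature encoder stays frozen during fine-tuning
```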
| {"language": "pt", "license": "apache-2.0", "tags": ["audio", "speech", "wav2vec2", "pt", "apache-2.0", "portuguese-speech-corpus", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week", "PyTorch"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Rubens XLSR Wav2Vec2 Large 53 Portuguese", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice pt", "type": "common_voice", "args": "pt"}, "metrics": [{"type": "wer", "value": "20.41%", "name": "Test WER"}]}]}]} | automatic-speech-recognition | Rubens/Wav2Vec2-Large-XLSR-53-Portuguese | [
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"pt",
"apache-2.0",
"portuguese-speech-corpus",
"xlsr-fine-tuning-week",
"PyTorch",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"pt"
] | TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #pt #apache-2.0 #portuguese-speech-corpus #xlsr-fine-tuning-week #PyTorch #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Portuguese
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Portuguese using the Common Voice dataset.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Portuguese test data of Common Voice.
Test Result (wer): 20.41 %
## Training
The Common Voice 'train', 'validation' datasets were used for training.
The script used for training can be found at: URL
| [
"# Wav2Vec2-Large-XLSR-53-Portuguese\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Portuguese using the Common Voice dataset.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result (wer): 20.41 %",
"## Training\n\nThe Common Voice 'train', 'validation' datasets were used for training.\n\nThe script used for training can be found at: URL"
] | [
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #pt #apache-2.0 #portuguese-speech-corpus #xlsr-fine-tuning-week #PyTorch #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Portuguese\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Portuguese using the Common Voice dataset.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result (wer): 20.41 %",
"## Training\n\nThe Common Voice 'train', 'validation' datasets were used for training.\n\nThe script used for training can be found at: URL"
] | [
100,
48,
20,
31,
34
] | [
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #pt #apache-2.0 #portuguese-speech-corpus #xlsr-fine-tuning-week #PyTorch #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Portuguese\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Portuguese using the Common Voice dataset.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result (wer): 20.41 %## Training\n\nThe Common Voice 'train', 'validation' datasets were used for training.\n\nThe script used for training can be found at: URL"
] | [
-0.15109948813915253,
0.11478527635335922,
-0.004170035012066364,
0.026111597195267677,
0.12412568926811218,
-0.02225556969642639,
0.16909241676330566,
0.12400270253419876,
-0.07993032783269882,
-0.03446562960743904,
-0.0065877665765583515,
-0.0056011914275586605,
0.0550227165222168,
0.06337452679872513,
0.03482913225889206,
-0.17925940454006195,
0.010845080018043518,
0.009854021482169628,
-0.03043871559202671,
0.12004813551902771,
0.10136793553829193,
-0.04290140047669411,
0.0018764894921332598,
0.07935623824596405,
-0.1096857488155365,
0.06588787585496902,
0.037659142166376114,
-0.1498284637928009,
0.14557376503944397,
0.05511396750807762,
0.11605542153120041,
0.04243874177336693,
0.08776848763227463,
-0.19372659921646118,
0.013954775407910347,
0.05825638025999069,
0.012705642729997635,
0.021116551011800766,
0.05666416883468628,
-0.07605734467506409,
0.07597872614860535,
0.11389760673046112,
0.0260833278298378,
0.07235970348119736,
-0.09072402119636536,
-0.2546672821044922,
-0.03222053498029709,
-0.056952547281980515,
0.037466492503881454,
0.17000505328178406,
-0.039850927889347076,
0.07368552684783936,
-0.1441982537508011,
0.06695013493299484,
0.12625958025455475,
-0.15388496220111847,
-0.01620582304894924,
0.04738694056868553,
0.05828327685594559,
0.05073980242013931,
-0.005023987498134375,
0.014459184370934963,
0.029076898470520973,
0.0015719423536211252,
0.023638395592570305,
-0.03794363886117935,
-0.22055183351039886,
-0.032765943557024,
-0.10428107529878616,
-0.07128439843654633,
0.2670786678791046,
-0.03618893399834633,
-0.07119683176279068,
-0.0929366797208786,
0.009925233200192451,
0.04291870445013046,
0.014878945425152779,
-0.028803100809454918,
0.013582365587353706,
0.038975492119789124,
-0.008152803406119347,
-0.03896479681134224,
-0.08411205559968948,
-0.13226325809955597,
0.04251973330974579,
0.005747226998209953,
0.028983036056160927,
0.004230923019349575,
-0.12903811037540436,
0.09492497891187668,
-0.03972531110048294,
-0.09766624122858047,
0.0004863610374741256,
0.02268681861460209,
-0.024546867236495018,
0.025523055344820023,
-0.025746919214725494,
-0.14690297842025757,
0.07186191529035568,
0.03181683272123337,
0.09577115625143051,
-0.004478728864341974,
-0.08575490117073059,
0.07009529322385788,
-0.004391090013086796,
0.13044962286949158,
-0.07279915362596512,
-0.01733304373919964,
0.04531949386000633,
0.020445024594664574,
-0.04658498615026474,
-0.016593391075730324,
-0.09163706749677658,
-0.017419137060642242,
0.005326822400093079,
0.11054155230522156,
0.0034773913212120533,
-0.01591145060956478,
-0.07790784537792206,
-0.051619015634059906,
-0.01041374634951353,
-0.1325465887784958,
-0.02841358631849289,
0.06516127288341522,
-0.030531438067555428,
0.08358435332775116,
0.10384002327919006,
0.033160675317049026,
-0.09290596097707748,
-0.0402773953974247,
0.030627869069576263,
0.043619055300951004,
-0.0659731775522232,
-0.12003899365663528,
0.021985741332173347,
-0.01905285380780697,
-0.02627519518136978,
-0.09203550219535828,
-0.13430693745613098,
-0.09658518433570862,
0.007204051129519939,
0.01719576120376587,
0.008339527994394302,
-0.11381074786186218,
-0.008688139729201794,
-0.04964965581893921,
-0.03018035925924778,
0.041500262916088104,
-0.03380054607987404,
0.06842846423387527,
0.019450541585683823,
0.03460853174328804,
0.04687502235174179,
0.07323391735553741,
-0.10069194436073303,
-0.06173253059387207,
-0.0284007228910923,
0.15833470225334167,
-0.051800016313791275,
-0.07183010131120682,
-0.10290118306875229,
-0.0644170343875885,
-0.03422015905380249,
0.06000745669007301,
0.057680677622556686,
0.13155537843704224,
-0.2666851282119751,
-0.10827908664941788,
0.2964671850204468,
-0.14069560170173645,
-0.08095786720514297,
0.14546827971935272,
-0.032700955867767334,
0.1737334132194519,
0.11478478461503983,
0.17129570245742798,
0.14956706762313843,
-0.1717318892478943,
0.08367688953876495,
-0.03752269595861435,
0.009671223349869251,
-0.02055811509490013,
0.12418423593044281,
-0.11105415225028992,
0.008210841566324234,
0.023387789726257324,
-0.1632710099220276,
0.09202326834201813,
-0.025195596739649773,
-0.07809464633464813,
0.004044870380312204,
-0.04857020080089569,
0.03670662268996239,
0.03542526438832283,
0.010262904688715935,
-0.006709384731948376,
-0.07499038428068161,
0.06189300864934921,
0.0929105132818222,
-0.15887212753295898,
0.04978356137871742,
-0.10431235283613205,
0.06725359708070755,
-0.05640114098787308,
-0.012324818409979343,
-0.16708607971668243,
0.09466573596000671,
-0.017597030848264694,
0.041519276797771454,
0.01708354614675045,
0.0656348243355751,
0.008222447708249092,
0.023478202521800995,
-0.04296175017952919,
-0.028045419603586197,
-0.08773353695869446,
-0.03379387781023979,
0.03588508442044258,
-0.11204050481319427,
-0.021829701960086823,
-0.05321268364787102,
0.11947234719991684,
-0.12272021919488907,
0.04945562779903412,
0.008167942985892296,
0.009698795154690742,
0.007366246078163385,
-0.014192076399922371,
0.05189703404903412,
0.11491058766841888,
-0.006994211580604315,
-0.03850831091403961,
0.05600579455494881,
0.029702361673116684,
-0.06263066828250885,
0.0754215195775032,
-0.1128813698887825,
0.08595996350049973,
0.0797300785779953,
-0.04744238406419754,
0.0031299693509936333,
0.04876159504055977,
-0.008315575309097767,
-0.0026574560906738043,
-0.14365538954734802,
-0.017370354384183884,
0.27569934725761414,
0.018063047900795937,
0.14610716700553894,
-0.1377464234828949,
0.025809098035097122,
0.014336069114506245,
-0.0645550936460495,
0.059656333178281784,
0.047490816563367844,
0.03958728164434433,
0.09041512757539749,
0.0549309179186821,
-0.04691296070814133,
-0.07788168638944626,
0.2869906425476074,
-0.057029832154512405,
-0.08356321603059769,
0.009210413321852684,
0.009201076813042164,
0.010802454315125942,
0.05081280320882797,
-0.20613054931163788,
-0.035251419991254807,
0.03736742213368416,
0.04792298749089241,
0.06176236644387245,
-0.14524118602275848,
-0.01010151393711567,
0.03387238085269928,
-0.13148321211338043,
-0.17963476479053497,
0.04123751446604729,
-0.038421180099248886,
0.04887323081493378,
-0.09425180405378342,
-0.017518186941742897,
-0.005003557540476322,
-0.039824552834033966,
-0.17145034670829773,
0.1524057537317276,
-0.09724461287260056,
-0.22344864904880524,
-0.15072599053382874,
0.12346531450748444,
0.021495282649993896,
0.02799406833946705,
0.09895990788936615,
-0.12127557396888733,
0.03375422582030296,
-0.006050200201570988,
0.08775646239519119,
0.03210561349987984,
-0.07247386872768402,
-0.03433733060956001,
0.07803264260292053,
0.06410233676433563,
-0.1769670695066452,
0.010413561016321182,
-0.016086362302303314,
-0.10736477375030518,
-0.027858376502990723,
-0.04659614339470863,
0.059560466557741165,
0.15216302871704102,
0.035841476172208786,
0.028831709176301956,
-0.0281166173517704,
0.10303735733032227,
-0.10509150475263596,
-0.049980755895376205,
0.26486605405807495,
-0.002914197277277708,
-0.020138302817940712,
0.06799820065498352,
0.03444414958357811,
-0.04720053821802139,
-0.021484967321157455,
-0.019405687227845192,
-0.08894456923007965,
-0.31149518489837646,
-0.09221947193145752,
-0.05998701974749565,
-0.035606395453214645,
-0.016661511734128,
-0.010125386528670788,
0.033147070556879044,
0.036134909838438034,
0.0009959464659914374,
-0.08979873359203339,
0.12350085377693176,
0.006857152562588453,
0.06231239065527916,
0.0042318133637309074,
0.09914043545722961,
-0.04639355465769768,
-0.03485254570841789,
0.03063414804637432,
0.017394090071320534,
0.17991988360881805,
0.019530193880200386,
0.09004725515842438,
0.13318340480327606,
0.10446592420339584,
0.06691311299800873,
0.09966608136892319,
-0.011968499049544334,
0.03625624626874924,
0.0026756420265883207,
-0.07027283310890198,
-0.07141529023647308,
-0.029002917930483818,
0.057388003915548325,
-0.01892797090113163,
-0.07521752268075943,
0.011584709398448467,
0.044677626341581345,
0.1511448323726654,
0.017417920753359795,
-0.19602778553962708,
-0.06339982151985168,
-0.032496921718120575,
0.01585068367421627,
-0.02304307371377945,
0.029287056997418404,
0.17320962250232697,
-0.17868593335151672,
0.03383152186870575,
-0.00931699387729168,
0.10559961199760437,
-0.06009808927774429,
0.013211390934884548,
-0.023167740553617477,
-0.018208974972367287,
0.022679079324007034,
0.10666064918041229,
-0.28283363580703735,
0.21668854355812073,
0.0016183230327442288,
0.12939922511577606,
-0.04371808469295502,
0.046427495777606964,
0.01575883850455284,
0.029452724382281303,
0.13775581121444702,
0.00222355080768466,
0.06580070406198502,
-0.051582541316747665,
-0.08078434318304062,
0.07584477961063385,
-0.03070824220776558,
-0.04980865865945816,
0.029506981372833252,
0.012468907982110977,
0.010495209135115147,
-0.003004817059263587,
-0.032350219786167145,
-0.1978074163198471,
-0.10344937443733215,
0.014348890632390976,
0.06518842279911041,
0.11002519726753235,
-0.05308661609888077,
-0.1012389287352562,
-0.02723505347967148,
0.11144419014453888,
-0.10149776935577393,
-0.06868770718574524,
-0.09133372455835342,
0.013554992154240608,
0.09091789275407791,
-0.06255228072404861,
0.014088038355112076,
0.08346624672412872,
0.10576248914003372,
-0.027281006798148155,
-0.014470960944890976,
0.07103727012872696,
-0.1141926497220993,
-0.09956610202789307,
-0.05060917139053345,
0.17908169329166412,
0.08219029009342194,
0.057687051594257355,
0.05395736172795296,
-0.02145778387784958,
0.02447730302810669,
-0.056832022964954376,
-0.00840538740158081,
0.15946054458618164,
-0.07674843817949295,
0.04009319469332695,
-0.0802956148982048,
-0.1835261732339859,
-0.08243732154369354,
-0.06351761519908905,
0.1909019649028778,
0.0221719853579998,
-0.05147112160921097,
0.12957604229450226,
0.15376447141170502,
-0.14242435991764069,
-0.17710085213184357,
-0.0006081591127440333,
0.13007959723472595,
0.09974238276481628,
-0.01782512478530407,
-0.2666388154029846,
0.022400852292776108,
0.04904459789395332,
-0.01047614123672247,
-0.11176182329654694,
-0.3544633686542511,
-0.12727588415145874,
0.1053338348865509,
-0.011682617478072643,
0.07098745554685593,
-0.008351558819413185,
-0.019262492656707764,
-0.03619026020169258,
-0.09297404438257217,
0.019366079941391945,
-0.2027880996465683,
0.06697297096252441,
0.040621932595968246,
0.02849564515054226,
0.02393082156777382,
-0.029737012460827827,
0.07504986971616745,
0.08761425316333771,
-0.000969278858974576,
0.023791370913386345,
0.04200194403529167,
0.1321875900030136,
0.01177695207297802,
0.12191358953714371,
-0.018447058275341988,
0.02802875265479088,
-0.09852776676416397,
-0.06376726180315018,
-0.07455131411552429,
0.07096798717975616,
-0.01002831943333149,
-0.015651516616344452,
0.041667018085718155,
-0.03735204413533211,
0.007844784297049046,
0.016840731725096703,
-0.023150987923145294,
-0.10336093604564667,
0.06968946754932404,
0.08134003728628159,
0.19378113746643066,
0.023546703159809113,
-0.11803577095270157,
-0.003413101891055703,
0.0028608411084860563,
0.11214892566204071,
-0.10788456350564957,
0.0547298938035965,
0.07177279144525528,
0.032731667160987854,
0.12947078049182892,
0.040229275822639465,
-0.10254720598459244,
0.08571334183216095,
0.05170268192887306,
-0.018759436905384064,
-0.07345901429653168,
-0.04368490353226662,
-0.04523925110697746,
-0.05585923418402672,
0.015120758675038815,
0.11771366745233536,
-0.052066393196582794,
-0.05786656215786934,
-0.029757976531982422,
-0.005972676444798708,
-0.10995201021432877,
0.2141524702310562,
0.017201751470565796,
0.060977257788181305,
-0.11121317744255066,
0.07164090126752853,
0.009254788048565388,
-0.0659308135509491,
0.03324482962489128,
-0.030151845887303352,
-0.10191620141267776,
-0.06693903356790543,
0.01205591019243002,
0.217552050948143,
0.01609906181693077,
-0.13403448462486267,
-0.11584174633026123,
-0.08770938217639923,
-0.009220587089657784,
0.049339380115270615,
0.04167783260345459,
0.027705421671271324,
-0.08393193781375885,
-0.07264625281095505,
-0.07294969260692596,
0.06416817754507065,
0.11644374579191208,
-0.04942267760634422,
-0.06312065571546555,
0.15992948412895203,
0.11932823807001114,
-0.018490735441446304,
-0.01824183575809002,
-0.0766262337565422,
-0.04452042654156685,
0.09525944292545319,
-0.039872173219919205,
0.013942211866378784,
-0.028387878090143204,
0.01272660493850708,
-0.008142668753862381,
-0.052220575511455536,
0.004662895575165749,
0.08302871137857437,
-0.09134003520011902,
0.019624611362814903,
-0.01912582665681839,
0.09651930630207062,
-0.08286704123020172,
0.016196096315979958,
0.009978172369301319,
-0.06977927684783936,
0.07653449475765228,
0.08828307688236237,
-0.07047369331121445,
0.10644466429948807,
-0.17831425368785858,
-0.037478551268577576,
0.05703151226043701,
0.046634379774332047,
-0.02434644289314747,
-0.13574014604091644,
0.07627447694540024,
0.08208176493644714,
0.010740680620074272,
-0.004761644173413515,
0.09170197695493698,
-0.07348443567752838,
0.027940014377236366,
0.010773682966828346,
-0.02030233107507229,
0.00017793681763578206,
0.09074617177248001,
0.0966184064745903,
0.11489290744066238,
0.13833877444267273,
-0.09784498810768127,
0.12603522837162018,
-0.16236178576946259,
-0.005475452169775963,
-0.035480186343193054,
-0.002386948326602578,
-0.13382834196090698,
-0.058583226054906845,
0.05214659497141838,
-0.05415329709649086,
0.0958186611533165,
0.08405829966068268,
0.1493091583251953,
-0.03619180992245674,
-0.05936660245060921,
0.035979222506284714,
0.0024376993533223867,
0.17171140015125275,
0.04139562323689461,
0.03646573796868324,
0.01904338411986828,
0.004238954279571772,
0.001912437379360199,
0.10402946919202805,
-0.004863874986767769,
0.12013116478919983,
0.07142212986946106,
0.06547708064317703,
0.08828241378068924,
-0.033004820346832275,
-0.06313295662403107,
-0.061994101852178574,
0.00021307241695467383,
0.0508359894156456,
-0.04949482902884483,
0.09023626148700714,
0.13267162442207336,
-0.08368878811597824,
0.10528820008039474,
0.06220491603016853,
-0.08831343054771423,
-0.1713714748620987,
-0.10427601635456085,
-0.04605499655008316,
-0.1265564113855362,
0.006946300622075796,
-0.11201740056276321,
-0.012179865501821041,
0.038808465003967285,
0.03152482584118843,
-0.031450796872377396,
0.12444785237312317,
0.013621903024613857,
-0.1356215924024582,
0.06824234127998352,
-0.059921350330114365,
0.052996207028627396,
-0.1046232208609581,
0.06617356091737747,
0.118258997797966,
0.0746140107512474,
0.06530936807394028,
0.02973327226936817,
-0.024026276543736458,
0.016229992732405663,
-0.09049656987190247,
-0.05586590617895126,
-0.010272984392940998,
-0.014693128876388073,
0.061450157314538956,
0.15084943175315857,
0.09452538937330246,
-0.06767588108778,
0.044796086847782135,
0.1579279899597168,
-0.02527753822505474,
-0.14065898954868317,
-0.19755521416664124,
0.10004688054323196,
0.04579448327422142,
0.012070811353623867,
-0.023906031623482704,
-0.041630204766988754,
-0.012102690525352955,
0.23734456300735474,
0.23761394619941711,
0.09353865683078766,
0.009369686245918274,
0.01029027346521616,
-0.01727554015815258,
-0.009874379262328148,
0.04329439252614975,
0.05870487913489342,
0.15517279505729675,
-0.024079713970422745,
0.01777566224336624,
-0.06439435482025146,
-0.07021207362413406,
-0.0047102998942136765,
0.07134958356618881,
-0.07788076996803284,
-0.07599148154258728,
0.00807210337370634,
0.14618152379989624,
-0.027308277785778046,
-0.08016582578420639,
-0.06312006711959839,
-0.10468116402626038,
-0.11880490928888321,
-0.044240888208150864,
0.03708606958389282,
0.11192096024751663,
0.024448033422231674,
-0.025501860305666924,
-0.02267514541745186,
0.203365296125412,
0.015494028106331825,
-0.07529763132333755,
-0.13047966361045837,
0.028232183307409286,
-0.12155607342720032,
0.09092541038990021,
-0.019931644201278687,
0.12351678311824799,
0.0391104556620121,
0.0936296358704567,
-0.033715128898620605,
0.09537666290998459,
-0.010503780096769333,
0.04433872178196907,
0.028783272951841354,
0.031239891424775124,
-0.05864524841308594,
0.10133536905050278,
0.01515661645680666,
-0.1318671554327011,
0.06743154674768448,
-0.11082769930362701,
-0.038865406066179276,
-0.10147816687822342,
0.06439882516860962,
-0.05651912838220596,
0.05808136984705925,
0.11132984608411789,
-0.06927677243947983,
-0.032751958817243576,
-0.07440800964832306,
0.07685897499322891,
0.042929187417030334,
0.01448887400329113,
-0.03561718016862869,
-0.20950452983379364,
-0.017754489555954933,
-0.07307648658752441,
-0.028855014592409134,
-0.1723979264497757,
-0.0059437998570501804,
0.03788207471370697,
-0.1026022732257843,
-0.01881237141788006,
0.038559023290872574,
0.03743395954370499,
0.06117862090468407,
0.006837756372988224,
-0.0436718612909317,
0.05571223050355911,
0.1303526908159256,
-0.129231795668602,
-0.07165128737688065
] |
null | null | transformers |
# Wav2Vec2-Large-XLSR-53-Portuguese
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Portuguese using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "pt", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("Rubens/Wav2Vec2-Large-XLSR-53-a-Portuguese")
model = Wav2Vec2ForCTC.from_pretrained("Rubens/Wav2Vec2-Large-XLSR-53-a-Portuguese")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
\tspeech_array, sampling_rate = torchaudio.load(batch["path"])
\tbatch["speech"] = resampler(speech_array).squeeze().numpy()
\treturn batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
\tlogits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
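Note the `Resample(48_000, 16_000)` step above: Common Voice clips ship as 48 kHz MP3s, while XLSR-53 expects 16 kHz input, so audio must be downsampled before inference. A quick sanity check (hypothetical local clip path):

```python
import torchaudio

# Hypothetical local Common Voice clip, for illustration only.
speech_array, sampling_rate = torchaudio.load("common_voice_pt_clip.mp3")
assert sampling_rate == 48_000  # Common Voice audio is distributed at 48 kHz
speech_16k = torchaudio.transforms.Resample(sampling_rate, 16_000)(speech_array)
print(speech_16k.shape)
```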
## Evaluation
The model can be evaluated as follows on the Portuguese test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
test_dataset = load_dataset("common_voice", "pt", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("Rubens/Wav2Vec2-Large-XLSR-53-a-Portuguese")
model = Wav2Vec2ForCTC.from_pretrained("Rubens/Wav2Vec2-Large-XLSR-53-a-Portuguese")
model.to("cuda")
chars_to_ignore_regex = '[\\,\\?\\.\\!\\-\\;\\:\\"\\“]' # TODO: adapt this list to include all special characters you removed from the data
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
\tbatch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
\tspeech_array, sampling_rate = torchaudio.load(batch["path"])
\tbatch["speech"] = resampler(speech_array).squeeze().numpy()
\treturn batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Run batched inference on the preprocessed audio and decode the predictions
def evaluate(batch):
\tinputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
\twith torch.no_grad():
\t\tlogits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
\tpred_ids = torch.argmax(logits, dim=-1)
\tbatch["pred_strings"] = processor.batch_decode(pred_ids)
\treturn batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result (wer)**: 19.30 %
## Training
The Common Voice `train` and `validation` datasets were used for training.
The script used for training can be found at: https://github.com/RubensZimbres/wav2vec2/blob/main/fine-tuning.py
| {"language": "pt", "license": "apache-2.0", "tags": ["audio", "speech", "wav2vec2", "pt", "apache-2.0", "portuguese-speech-corpus", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week", "PyTorch"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Rubens XLSR Wav2Vec2 Large 53 Portuguese", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice pt", "type": "common_voice", "args": "pt"}, "metrics": [{"type": "wer", "value": "19.30%", "name": "Test WER"}]}]}]} | automatic-speech-recognition | Rubens/Wav2Vec2-Large-XLSR-53-a-Portuguese | [
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"pt",
"apache-2.0",
"portuguese-speech-corpus",
"xlsr-fine-tuning-week",
"PyTorch",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"pt"
] | TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #pt #apache-2.0 #portuguese-speech-corpus #xlsr-fine-tuning-week #PyTorch #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Portuguese
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Portuguese using the Common Voice dataset.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Portuguese test data of Common Voice.
Test Result (wer): 19.30 %
## Training
The Common Voice 'train', 'validation' datasets were used for training.
The script used for training can be found at: URL
| [
"# Wav2Vec2-Large-XLSR-53-Portuguese\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Portuguese using the Common Voice dataset.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result(wer): 19.30 %",
"## Training\n\nThe Common Voice 'train', 'validation' datasets were used for training.\n\nThe script used for training can be found at: URL"
] | [
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #pt #apache-2.0 #portuguese-speech-corpus #xlsr-fine-tuning-week #PyTorch #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Portuguese\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Portuguese using the Common Voice dataset.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result(wer): 19.30 %",
"## Training\n\nThe Common Voice 'train', 'validation' datasets were used for training.\n\nThe script used for training can be found at: URL"
] | [
100,
48,
20,
30,
34
] | [
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #pt #apache-2.0 #portuguese-speech-corpus #xlsr-fine-tuning-week #PyTorch #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Portuguese\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Portuguese using the Common Voice dataset.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Portuguese test data of Common Voice.\n\n\n\n\nTest Result(wer): 19.30 %## Training\n\nThe Common Voice 'train', 'validation' datasets were used for training.\n\nThe script used for training can be found at: URL"
] | [
-0.15031500160694122,
0.10238109529018402,
-0.003943650983273983,
0.026146797463297844,
0.12981371581554413,
-0.02462264709174633,
0.16818104684352875,
0.12186754494905472,
-0.07139995694160461,
-0.03665498271584511,
-0.004478965885937214,
-0.000711666711140424,
0.05465729534626007,
0.06456699222326279,
0.04038412868976593,
-0.18743132054805756,
0.007420478854328394,
0.012051153928041458,
-0.03467176482081413,
0.1209721639752388,
0.09797631204128265,
-0.04474278911948204,
-0.00027066044276580215,
0.08458515256643295,
-0.11118869483470917,
0.06256639957427979,
0.03475222736597061,
-0.1511377990245819,
0.1454690396785736,
0.054773960262537,
0.12168577313423157,
0.04194209724664688,
0.09140696376562119,
-0.18944059312343597,
0.015526303090155125,
0.06004124507308006,
0.013866918161511421,
0.0263530220836401,
0.04918863624334335,
-0.08424066752195358,
0.07562019675970078,
0.11390325427055359,
0.027327988296747208,
0.07590335607528687,
-0.0910840854048729,
-0.24215158820152283,
-0.02167796902358532,
-0.05860690772533417,
0.03801364451646805,
0.1744130551815033,
-0.03728986531496048,
0.06245361268520355,
-0.14383378624916077,
0.06927088648080826,
0.12743620574474335,
-0.1431865245103836,
-0.016600001603364944,
0.04488592967391014,
0.05535576492547989,
0.049944277852773666,
-0.01170643325895071,
0.012168574146926403,
0.02406936325132847,
0.0032706735655665398,
0.029962342232465744,
-0.033018533140420914,
-0.21854285895824432,
-0.03747192397713661,
-0.1034872829914093,
-0.06631293147802353,
0.26366811990737915,
-0.035932205617427826,
-0.07002439349889755,
-0.08803874254226685,
0.006051766686141491,
0.03974585607647896,
0.011389665305614471,
-0.029460474848747253,
0.013589105568826199,
0.04214897379279137,
-0.012432513758540154,
-0.045612603425979614,
-0.08454817533493042,
-0.13378384709358215,
0.04426378384232521,
0.01897202990949154,
0.024124640971422195,
0.004448935855180025,
-0.1303875595331192,
0.09564593434333801,
-0.026393938809633255,
-0.09406585991382599,
0.0051447986625134945,
0.0219451654702425,
-0.02281353436410427,
0.030852988362312317,
-0.024447476491332054,
-0.15661241114139557,
0.06138165667653084,
0.018641913309693336,
0.10499105602502823,
-0.0028351254295557737,
-0.07964223623275757,
0.0707731768488884,
-0.009220777079463005,
0.12906324863433838,
-0.07353140413761139,
-0.01704465039074421,
0.04689176753163338,
0.0207999087870121,
-0.04895560443401337,
-0.022122716531157494,
-0.09555363655090332,
-0.01917468011379242,
-0.003942759241908789,
0.1051059439778328,
0.007758766412734985,
-0.008867197670042515,
-0.07836250960826874,
-0.05642901360988617,
-0.0027933758683502674,
-0.12964214384555817,
-0.03259056806564331,
0.06648973375558853,
-0.03387761116027832,
0.08667440712451935,
0.1138717383146286,
0.03607267886400223,
-0.09323546290397644,
-0.035466518253088,
0.0300862155854702,
0.044507645070552826,
-0.06995047628879547,
-0.12057208269834518,
0.018557095900177956,
-0.00815435592085123,
-0.02670958638191223,
-0.09162668883800507,
-0.13679863512516022,
-0.09427738934755325,
0.011878378689289093,
0.024383744224905968,
0.009986802004277706,
-0.11524925380945206,
-0.00897387694567442,
-0.05136510357260704,
-0.030648833140730858,
0.039740901440382004,
-0.03320866450667381,
0.06928585469722748,
0.010516300797462463,
0.037595462054014206,
0.04812360182404518,
0.07731595635414124,
-0.09911277145147324,
-0.06490965932607651,
-0.03370596095919609,
0.1610279530286789,
-0.05024445056915283,
-0.06665898114442825,
-0.10053583234548569,
-0.07093536853790283,
-0.04840562865138054,
0.060657940804958344,
0.057613179087638855,
0.12883804738521576,
-0.2613840401172638,
-0.11044663935899734,
0.296854704618454,
-0.13622169196605682,
-0.07498573511838913,
0.14191333949565887,
-0.033528439700603485,
0.17921407520771027,
0.11720813810825348,
0.1700085699558258,
0.15162523090839386,
-0.172444149851799,
0.08526924252510071,
-0.03476980701088905,
-0.0010879505425691605,
-0.026923932135105133,
0.12102944403886795,
-0.10775097459554672,
0.006997685879468918,
0.024525392800569534,
-0.15633191168308258,
0.09782928228378296,
-0.02460421994328499,
-0.07602322101593018,
0.00819907896220684,
-0.04861854016780853,
0.04495210945606232,
0.03730121627449989,
0.01109863817691803,
-0.00522121787071228,
-0.08139590173959732,
0.07053980231285095,
0.09104078263044357,
-0.15798771381378174,
0.050272196531295776,
-0.10366698354482651,
0.05203147232532501,
-0.05041981860995293,
-0.012856081128120422,
-0.16085723042488098,
0.09465532749891281,
-0.020664174109697342,
0.049686308950185776,
0.01789839193224907,
0.07001039385795593,
0.009345845319330692,
0.025908388197422028,
-0.045393895357847214,
-0.028950195759534836,
-0.08624080568552017,
-0.03212719038128853,
0.03195103257894516,
-0.112040676176548,
-0.02359350584447384,
-0.05034002289175987,
0.12030564248561859,
-0.120868019759655,
0.05189752206206322,
0.0007409179816022515,
0.005467145703732967,
0.008046169765293598,
-0.014743687584996223,
0.056525811553001404,
0.11411090940237045,
-0.0052389465272426605,
-0.037866368889808655,
0.060786861926317215,
0.0310839656740427,
-0.059972405433654785,
0.07280326634645462,
-0.11645068228244781,
0.08469897508621216,
0.08012837171554565,
-0.06014418229460716,
0.0018252867739647627,
0.052819278091192245,
-0.010070167481899261,
-0.005979100242257118,
-0.13864341378211975,
-0.018202001228928566,
0.287243515253067,
0.0153965437784791,
0.146810844540596,
-0.13134807348251343,
0.02639012038707733,
0.012676326557993889,
-0.06446205079555511,
0.06577039510011673,
0.039751648902893066,
0.040425583720207214,
0.08709108084440231,
0.053826313465833664,
-0.054150331765413284,
-0.0787462443113327,
0.29291731119155884,
-0.04879715293645859,
-0.07837913185358047,
0.011869901791214943,
0.007573436014354229,
0.007575212046504021,
0.048224836587905884,
-0.21281921863555908,
-0.03634599223732948,
0.03416374698281288,
0.0451691672205925,
0.0630905032157898,
-0.14796331524848938,
-0.011076992377638817,
0.03575686365365982,
-0.12957829236984253,
-0.1863984912633896,
0.03561954200267792,
-0.04097510129213333,
0.04633545130491257,
-0.09123776853084564,
-0.01617324724793434,
-0.0034721011761575937,
-0.039757050573825836,
-0.17198866605758667,
0.1504727154970169,
-0.09678894281387329,
-0.2224801778793335,
-0.15073516964912415,
0.12102489918470383,
0.02912711910903454,
0.02896782197058201,
0.09714175015687943,
-0.1222304254770279,
0.03614572063088417,
-0.0035705575719475746,
0.09218909591436386,
0.029848018661141396,
-0.074111707508564,
-0.031006738543510437,
0.077845998108387,
0.06314606964588165,
-0.17812788486480713,
0.012057669460773468,
-0.01805512048304081,
-0.10227108746767044,
-0.03129550814628601,
-0.048654019832611084,
0.05692918226122856,
0.15658243000507355,
0.035319335758686066,
0.027362383902072906,
-0.03179970383644104,
0.09705295413732529,
-0.10457377135753632,
-0.0455726757645607,
0.26101577281951904,
-0.0035509734880179167,
-0.019462889060378075,
0.06581040471792221,
0.03554963320493698,
-0.047006137669086456,
-0.01925516687333584,
-0.019977416843175888,
-0.09457126259803772,
-0.3105718791484833,
-0.09378991276025772,
-0.06292547285556793,
-0.03203042224049568,
-0.018800748512148857,
-0.010988041758537292,
0.037263017147779465,
0.034955188632011414,
0.0010444270446896553,
-0.08714386820793152,
0.12325786799192429,
0.005552235990762711,
0.05775219574570656,
0.002095696981996298,
0.10229003429412842,
-0.04541737586259842,
-0.03417564556002617,
0.025571484118700027,
0.015656977891921997,
0.1804104596376419,
0.014138374477624893,
0.0975666269659996,
0.12931926548480988,
0.10092251002788544,
0.06985726952552795,
0.10116758197546005,
-0.01026557944715023,
0.03411717712879181,
0.0035697908606380224,
-0.06439501792192459,
-0.07824479788541794,
-0.028327960520982742,
0.061555638909339905,
-0.016536250710487366,
-0.07160639762878418,
0.01926615461707115,
0.04270382970571518,
0.15322262048721313,
0.007818319834768772,
-0.19705051183700562,
-0.07276270538568497,
-0.035160861909389496,
0.016511572524905205,
-0.02006407454609871,
0.03170020133256912,
0.17449542880058289,
-0.17979033291339874,
0.029529694467782974,
-0.011121612042188644,
0.10514707863330841,
-0.06211008131504059,
0.014390815980732441,
-0.022570183500647545,
-0.008034422062337399,
0.01955319568514824,
0.1041780635714531,
-0.27530035376548767,
0.21716243028640747,
-0.00024786204448901117,
0.1328354775905609,
-0.04382524639368057,
0.042592745274305344,
0.018976865336298943,
0.027402518317103386,
0.13620834052562714,
0.004122194368392229,
0.06918127834796906,
-0.05633864924311638,
-0.0843443050980568,
0.07259674370288849,
-0.02799832634627819,
-0.047716859728097916,
0.026707125827670097,
0.01116102747619152,
0.008063238114118576,
0.005223365500569344,
-0.033528607338666916,
-0.19127309322357178,
-0.10474561899900436,
0.014816648326814175,
0.06340296566486359,
0.11134928464889526,
-0.048643916845321655,
-0.10214821994304657,
-0.01598167233169079,
0.11396557837724686,
-0.09982622414827347,
-0.06851422041654587,
-0.0919223353266716,
0.01351388730108738,
0.08549989759922028,
-0.06499786674976349,
0.018768317997455597,
0.08524387329816818,
0.10289204120635986,
-0.025523459538817406,
-0.015165003947913647,
0.07475239038467407,
-0.11759477108716965,
-0.09445556253194809,
-0.051286112517118454,
0.1796276867389679,
0.08406984806060791,
0.060856834053993225,
0.048537470400333405,
-0.023877786472439766,
0.025302764028310776,
-0.05449220538139343,
-0.0012352934572845697,
0.15892210602760315,
-0.08405967801809311,
0.037806976586580276,
-0.07922305911779404,
-0.17708611488342285,
-0.08254194259643555,
-0.059275224804878235,
0.19826199114322662,
0.017674526199698448,
-0.05226348340511322,
0.13298295438289642,
0.15287920832633972,
-0.14044307172298431,
-0.18213783204555511,
0.003044111654162407,
0.13275595009326935,
0.09873082488775253,
-0.020505283027887344,
-0.2757366895675659,
0.022606708109378815,
0.04296199604868889,
-0.009235712699592113,
-0.10905451327562332,
-0.35411107540130615,
-0.1278645098209381,
0.10698281973600388,
-0.009827910922467709,
0.07655242085456848,
-0.002624982036650181,
-0.016695812344551086,
-0.0351998433470726,
-0.09353719651699066,
0.016690384596586227,
-0.19834594428539276,
0.07163821160793304,
0.043513376265764236,
0.0293772853910923,
0.023995859548449516,
-0.02955193631350994,
0.06818759441375732,
0.0846741795539856,
-0.005798090249300003,
0.02173236571252346,
0.03678440302610397,
0.13123857975006104,
0.013509267941117287,
0.12168136984109879,
-0.016705172136425972,
0.026445835828781128,
-0.08449086546897888,
-0.06859239190816879,
-0.07763191312551498,
0.07141172140836716,
-0.005487168673425913,
-0.02303917519748211,
0.04163112863898277,
-0.03882412984967232,
0.015048613771796227,
0.019232140854001045,
-0.026325736194849014,
-0.10127165168523788,
0.06882137805223465,
0.08282159268856049,
0.19825364649295807,
0.024344874545931816,
-0.1171071007847786,
-0.006752626970410347,
0.0024151660036295652,
0.11199235916137695,
-0.11236194521188736,
0.05088226869702339,
0.06947888433933258,
0.03686494380235672,
0.12718883156776428,
0.04198489338159561,
-0.10037156939506531,
0.08643943071365356,
0.051491186022758484,
-0.01812942884862423,
-0.08063825219869614,
-0.04526139423251152,
-0.04266888275742531,
-0.05781283974647522,
0.015294274315237999,
0.11335786432027817,
-0.05810386314988136,
-0.05464858189225197,
-0.028155440464615822,
-0.00667377095669508,
-0.11177146434783936,
0.2063259333372116,
0.020503228530287743,
0.059884753078222275,
-0.10791835188865662,
0.06787868589162827,
0.008838110603392124,
-0.058775193989276886,
0.035408101975917816,
-0.027726847678422928,
-0.10313450545072556,
-0.06588224321603775,
0.0043797981925308704,
0.2114090472459793,
0.019745921716094017,
-0.12823185324668884,
-0.11532161384820938,
-0.08625483512878418,
-0.011183167807757854,
0.046412184834480286,
0.03662562742829323,
0.02118533104658127,
-0.09041805565357208,
-0.06062871217727661,
-0.07677654922008514,
0.06322816014289856,
0.10842636972665787,
-0.05212199315428734,
-0.0634813979268074,
0.1657676249742508,
0.12483865767717361,
-0.017392324283719063,
-0.018946627154946327,
-0.08023156225681305,
-0.04040581360459328,
0.09656200557947159,
-0.03205693140625954,
0.006474181544035673,
-0.03354412689805031,
0.01102976780384779,
-0.01210924331098795,
-0.05390095338225365,
0.0037596614565700293,
0.08612009137868881,
-0.09174045920372009,
0.018468040972948074,
-0.01745789311826229,
0.09309887140989304,
-0.08211323618888855,
0.0176143329590559,
0.012198621407151222,
-0.06619171053171158,
0.07253595441579819,
0.09420185536146164,
-0.07273989170789719,
0.10635241866111755,
-0.18631769716739655,
-0.035466134548187256,
0.06188683211803436,
0.044171351939439774,
-0.02398763783276081,
-0.12872985005378723,
0.07284580171108246,
0.07616478949785233,
0.012632711790502071,
-0.009696024470031261,
0.0951443761587143,
-0.07374706864356995,
0.03245016559958458,
0.012097781524062157,
-0.018204117193818092,
-0.002730826148763299,
0.09092351794242859,
0.0930563360452652,
0.11066833883523941,
0.1367414891719818,
-0.09740036725997925,
0.13214977085590363,
-0.16270337998867035,
-0.005695466883480549,
-0.03318098559975624,
-0.0037713455967605114,
-0.12750619649887085,
-0.06145089864730835,
0.0514405220746994,
-0.05394177511334419,
0.09161204844713211,
0.08502324670553207,
0.14081116020679474,
-0.037665072828531265,
-0.061432041227817535,
0.037833232432603836,
0.0014921826077625155,
0.17111974954605103,
0.03715946525335312,
0.03283950313925743,
0.02446581795811653,
0.007422994822263718,
0.005315370857715607,
0.11011020094156265,
0.0008961650892160833,
0.12138700485229492,
0.07233376801013947,
0.06534299999475479,
0.08631519973278046,
-0.03289360553026199,
-0.0769302174448967,
-0.07029199600219727,
-0.001977651147171855,
0.048292260617017746,
-0.0492488332092762,
0.08958503603935242,
0.1301572322845459,
-0.0788518562912941,
0.10840878635644913,
0.06464185565710068,
-0.0914573147892952,
-0.16975468397140503,
-0.10588576644659042,
-0.03923783078789711,
-0.12788091599941254,
0.004026522394269705,
-0.10979539901018143,
-0.015915388241410255,
0.04071037471294403,
0.029253456741571426,
-0.029702920466661453,
0.13844908773899078,
0.016945939511060715,
-0.1348615139722824,
0.06692604720592499,
-0.06171039119362831,
0.04860644415020943,
-0.11592400819063187,
0.06464610248804092,
0.11767517030239105,
0.07472962886095047,
0.06639042496681213,
0.02541482448577881,
-0.0353533998131752,
0.018498994410037994,
-0.08803687989711761,
-0.0601651556789875,
-0.011248142458498478,
-0.011862395331263542,
0.05974192917346954,
0.15306013822555542,
0.09464084357023239,
-0.06847440451383591,
0.04389772564172745,
0.1575673669576645,
-0.024168677628040314,
-0.13974259793758392,
-0.19384177029132843,
0.09703164547681808,
0.04949367046356201,
0.00808571558445692,
-0.025821074843406677,
-0.040338970720767975,
-0.013856444507837296,
0.24119658768177032,
0.24478591978549957,
0.08874983340501785,
0.011014807969331741,
0.009261786006391048,
-0.016871022060513496,
-0.005212424788624048,
0.04837283492088318,
0.060529764741659164,
0.1558658927679062,
-0.02462630346417427,
0.0168721005320549,
-0.06683006137609482,
-0.07426730543375015,
-0.00401152390986681,
0.07298672944307327,
-0.07454061508178711,
-0.07446707040071487,
0.004486197605729103,
0.14546220004558563,
-0.03570995479822159,
-0.08859121799468994,
-0.07028065621852875,
-0.10978271812200546,
-0.11420903354883194,
-0.04708832874894142,
0.03278413414955139,
0.11729899048805237,
0.027549983933568,
-0.024857433512806892,
-0.022260991856455803,
0.20721393823623657,
0.017208119854331017,
-0.07752146571874619,
-0.12033917754888535,
0.02105894684791565,
-0.12943527102470398,
0.08803422749042511,
-0.024508822709321976,
0.11840897053480148,
0.03723416104912758,
0.09885339438915253,
-0.03148353099822998,
0.09568081796169281,
-0.01247431617230177,
0.04372826963663101,
0.027476927265524864,
0.03319941833615303,
-0.06313673406839371,
0.10483649373054504,
0.011432650499045849,
-0.1372203528881073,
0.07117243856191635,
-0.10998931527137756,
-0.04174382984638214,
-0.09508155286312103,
0.059198047965765,
-0.05307066813111305,
0.054446373134851456,
0.11427196860313416,
-0.07015230506658554,
-0.039175014942884445,
-0.07196217775344849,
0.06994670629501343,
0.04348723217844963,
0.011175377294421196,
-0.043432239443063736,
-0.21109063923358917,
-0.022516736760735512,
-0.07361762225627899,
-0.027091484516859055,
-0.17063435912132263,
-0.009298224002122879,
0.035328637808561325,
-0.10572878271341324,
-0.01915079727768898,
0.03421669825911522,
0.03527859225869179,
0.06040404364466667,
0.008699414320290089,
-0.027466654777526855,
0.055444616824388504,
0.13307836651802063,
-0.13454869389533997,
-0.07480543851852417
] |
null | null | transformers |
# Harry Potter DialoGPT Model | {"tags": ["conversational"]} | text-generation | Rush11/DialoGPT-small-HarryPotter | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Harry Potter DialoGPT Model | [
"# Harry Potter DialoGPT Model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Harry Potter DialoGPT Model"
] | [
51,
8
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Harry Potter DialoGPT Model"
] | [
-0.0009023238671943545,
0.07815738022327423,
-0.006546166725456715,
0.07792752981185913,
0.10655936598777771,
0.048972971737384796,
0.17639793455600739,
0.12185695022344589,
0.016568755730986595,
-0.04774167761206627,
0.11647630482912064,
0.2130284160375595,
-0.002118367003276944,
0.024608047679066658,
-0.05022026598453522,
-0.3065771162509918,
0.0474756620824337,
0.014356585219502449,
-0.07174845039844513,
0.11724270135164261,
0.09064973145723343,
-0.046179238706827164,
0.08330509811639786,
-0.009135239757597446,
-0.13198648393154144,
-0.039482954889535904,
0.019292812794446945,
-0.11745545268058777,
0.1662212759256363,
0.05298272892832756,
0.02469746209681034,
-0.008447164669632912,
-0.06598151475191116,
-0.15036040544509888,
0.037190426141023636,
-0.027472136542201042,
-0.01080626156181097,
0.05462246760725975,
0.023526115342974663,
-0.07521048933267593,
0.170567125082016,
0.17678891122341156,
0.0833497866988182,
0.0349111407995224,
-0.14917024970054626,
-0.045548245310783386,
0.008950977586209774,
0.05421316996216774,
-0.017893504351377487,
0.09349167346954346,
-0.019903047010302544,
0.11801653355360031,
-0.04491448402404785,
0.09210366010665894,
0.15255063772201538,
-0.4016275703907013,
-0.027563704177737236,
0.08920855820178986,
0.05989706888794899,
0.12076901644468307,
-0.10560955852270126,
0.03972794860601425,
-0.0039703017100691795,
0.01236654631793499,
-0.014540530741214752,
-0.08304883539676666,
-0.07308239489793777,
0.032504837960004807,
-0.1272556483745575,
0.008525865152478218,
0.23756256699562073,
-0.10643257945775986,
0.037069112062454224,
-0.09791990369558334,
-0.07414398342370987,
0.048336777836084366,
-0.053761593997478485,
-0.081727035343647,
-0.054839808493852615,
0.06347949057817459,
0.004366500303149223,
-0.06301609426736832,
-0.08326146006584167,
-0.0006536149303428829,
-0.12781435251235962,
0.17595994472503662,
0.061243366450071335,
0.041611745953559875,
-0.21322020888328552,
0.08940251916646957,
0.04477722570300102,
-0.04711297154426575,
0.007116159424185753,
-0.11796226352453232,
0.04023287072777748,
0.005483259446918964,
-0.03256071358919144,
-0.021854614838957787,
0.0393419973552227,
0.13909944891929626,
-0.01777748204767704,
0.03252175822854042,
0.006831915583461523,
0.05811219662427902,
0.08162496984004974,
0.02222144603729248,
0.019291909411549568,
-0.0818009302020073,
0.019385190680623055,
-0.08128736168146133,
-0.0030400939285755157,
-0.048940129578113556,
-0.17071883380413055,
-0.07477642595767975,
0.052610911428928375,
0.020047198981046677,
0.03746970370411873,
0.08054786175489426,
-0.0017944995779544115,
-0.05560554191470146,
0.03284840285778046,
0.01671096310019493,
-0.020622212439775467,
-0.010361049324274063,
-0.02412462793290615,
0.19123271107673645,
0.019619356840848923,
0.014111656695604324,
-0.12379156798124313,
0.10023640841245651,
-0.08179095387458801,
0.0037731381598860025,
0.02743307314813137,
-0.04204464703798294,
-0.004716555587947369,
0.02917117439210415,
0.023101668804883957,
-0.1252521574497223,
-0.1099385917186737,
-0.0030569476075470448,
-0.012054097838699818,
-0.036421261727809906,
-0.10490952432155609,
-0.08483029156923294,
-0.012153145857155323,
0.0449371263384819,
-0.013397793285548687,
0.007936403155326843,
-0.05143149942159653,
0.0985720232129097,
-0.0514979362487793,
0.09873400628566742,
-0.08342572301626205,
0.06359215080738068,
-0.09124887734651566,
-0.061886150389909744,
-0.11452563107013702,
0.05216052383184433,
0.012905281968414783,
0.066250741481781,
0.016998225823044777,
-0.044836658984422684,
-0.014836243353784084,
0.05253177136182785,
-0.07656687498092651,
0.1940697431564331,
-0.041674621403217316,
-0.12459053844213486,
0.24146439135074615,
-0.09138800948858261,
-0.1802034229040146,
0.12973085045814514,
-0.022254703566432,
0.08523941785097122,
0.12802475690841675,
0.20380465686321259,
-0.00019822151807602495,
-0.01302915159612894,
0.07281201332807541,
0.07031642645597458,
-0.09803894907236099,
0.06239739805459976,
0.029653839766979218,
-0.008071083575487137,
-0.08906278014183044,
0.05762826278805733,
0.046033453196287155,
-0.010650773532688618,
-0.035073768347501755,
-0.001896020956337452,
-0.012895751744508743,
-0.022185025736689568,
0.14126582443714142,
-0.02006692811846733,
0.1300428807735443,
-0.06926563382148743,
-0.03515486419200897,
-0.009500149637460709,
0.03533667325973511,
-0.04091939330101013,
0.08151165395975113,
-0.0436173714697361,
0.10586477071046829,
0.09034156054258347,
0.053724925965070724,
-0.13120363652706146,
0.00466286763548851,
-0.015246815048158169,
0.17014820873737335,
0.08964069187641144,
0.05222717300057411,
0.06265474855899811,
-0.0020888058934360743,
-0.06708643585443497,
0.045407816767692566,
0.13778303563594818,
-0.037020038813352585,
-0.12218865007162094,
-0.1755627691745758,
0.051157694309949875,
-0.045444171875715256,
0.10855234414339066,
-0.10010123997926712,
0.022670533508062363,
-0.055906031280756,
0.07772238552570343,
-0.024998966604471207,
0.020512236282229424,
-0.0013405600329861045,
-0.021700702607631683,
-0.08356887847185135,
-0.002377772703766823,
0.08597290515899658,
-0.02048647589981556,
-0.06707409024238586,
0.16556480526924133,
-0.16400809586048126,
0.1631954461336136,
0.2116095870733261,
-0.28542569279670715,
-0.005696662236005068,
-0.15163889527320862,
-0.0208092350512743,
0.019645055755972862,
0.07834604382514954,
0.026225795969367027,
0.2044338881969452,
-0.012928472831845284,
0.16565458476543427,
-0.05699567869305611,
-0.07730039209127426,
-0.06881127506494522,
-0.048101142048835754,
0.013522743247449398,
0.09095205366611481,
0.04542696103453636,
-0.11962861567735672,
0.13119758665561676,
0.1054433062672615,
0.06484298408031464,
0.12711186707019806,
0.1030748188495636,
-0.008113685995340347,
0.07252490520477295,
-0.03624548763036728,
-0.03462279960513115,
-0.09254947304725647,
-0.30446043610572815,
-0.04840317741036415,
0.0939924493432045,
0.007963384501636028,
0.09285714477300644,
-0.0919896736741066,
-0.03311870992183685,
0.006042704917490482,
0.009473444893956184,
0.028337622061371803,
0.09653715789318085,
0.013490920886397362,
0.15320514142513275,
-0.008011690340936184,
-0.03430786728858948,
0.05891305208206177,
0.017982570454478264,
-0.09147711098194122,
0.17280617356300354,
-0.17050009965896606,
-0.27190929651260376,
-0.06990014761686325,
-0.21745692193508148,
-0.013139115646481514,
0.05258983001112938,
0.0786920040845871,
-0.11818131804466248,
-0.018352627754211426,
-0.006239492911845446,
0.05685517191886902,
-0.2425733357667923,
0.0004911290016025305,
-0.1354890614748001,
0.0501418262720108,
-0.1974833607673645,
-0.09718500077724457,
-0.02271542325615883,
-0.013450481928884983,
-0.0464281290769577,
0.13365240395069122,
-0.1448695808649063,
-0.011572926305234432,
0.2329535037279129,
0.032479673624038696,
0.027794739231467247,
-0.05020907148718834,
0.19788463413715363,
-0.0958966314792633,
-0.023973820731043816,
0.11024576425552368,
-0.05038975924253464,
0.04834126681089401,
0.06649978458881378,
-0.012981836684048176,
-0.08557141572237015,
0.023789849132299423,
-0.068336620926857,
-0.03150583803653717,
-0.27926525473594666,
-0.0930178239941597,
-0.09319330751895905,
0.11305391043424606,
0.04079577326774597,
0.06421639025211334,
0.16545771062374115,
0.05191578343510628,
-0.024325082078576088,
-0.03006586618721485,
0.11609793454408646,
0.12905290722846985,
0.2277202159166336,
-0.06067761778831482,
0.10221996158361435,
0.009445492178201675,
-0.08203992247581482,
0.06062209978699684,
0.056782789528369904,
0.06324724853038788,
0.02584579586982727,
0.03694582358002663,
-0.030939655378460884,
0.1121687963604927,
0.12571842968463898,
0.05258069559931755,
0.0481170229613781,
0.0002127334737451747,
-0.0561506561934948,
-0.008168719708919525,
-0.05726633965969086,
0.06774696707725525,
0.061340972781181335,
-0.12918008863925934,
-0.08061543852090836,
0.0011613310780376196,
0.06660808622837067,
-0.016230419278144836,
0.06823775917291641,
-0.13560809195041656,
-0.03582429885864258,
0.0790911465883255,
-0.07693151384592056,
-0.14156894385814667,
0.11972879618406296,
-0.026570770889520645,
-0.19904157519340515,
0.05265914276242256,
0.007704653777182102,
0.0908159390091896,
-0.06360849738121033,
0.05343840271234512,
-0.13023801147937775,
-0.12935101985931396,
-0.018437571823596954,
0.07945099472999573,
-0.3450873792171478,
0.13536721467971802,
-0.013286802917718887,
-0.02876877970993519,
-0.06474969536066055,
-0.02640824392437935,
0.013905409723520279,
0.12719078361988068,
0.08667250722646713,
0.0008821099763736129,
0.0991629809141159,
0.03823768347501755,
0.04188435152173042,
-0.002011700300499797,
0.10950417071580887,
0.0050011589191854,
0.004797275178134441,
-0.04982118681073189,
0.007274609990417957,
-0.05164213851094246,
-0.07472953200340271,
0.08393982797861099,
-0.20678792893886566,
0.09087453782558441,
-0.03378438204526901,
0.08427679538726807,
0.04304937273263931,
-0.018965769559144974,
-0.1001204177737236,
0.19745583832263947,
-0.012206900864839554,
-0.11405988782644272,
-0.07517550885677338,
-0.02810264565050602,
0.09103139489889145,
-0.013817726634442806,
0.012886416167020798,
-0.045470476150512695,
0.032183047384023666,
-0.1263762265443802,
-0.1597503274679184,
0.08734500408172607,
-0.04441224783658981,
-0.10894393920898438,
-0.025462759658694267,
0.20382575690746307,
-0.007266622502356768,
0.08242089301347733,
0.01605331338942051,
0.010653935372829437,
-0.18066231906414032,
-0.04018142446875572,
0.02645772136747837,
-0.0016437612939625978,
0.005979063920676708,
0.047698814421892166,
0.019091911613941193,
0.06207629665732384,
-0.1069745197892189,
-0.013920160941779613,
0.3158324360847473,
0.15978319942951202,
-0.00912671908736229,
0.14943915605545044,
0.1093616932630539,
-0.08669080585241318,
-0.17238758504390717,
-0.1171615794301033,
-0.1210922971367836,
-0.08425768464803696,
-0.10681738704442978,
-0.1525043100118637,
0.09535340964794159,
-0.03392014652490616,
0.03498011827468872,
0.14615866541862488,
-0.280263751745224,
-0.10949636250734329,
0.13820378482341766,
0.010744688101112843,
0.3510635495185852,
-0.12303631007671356,
-0.044944874942302704,
-0.06214528530836105,
-0.16933435201644897,
0.08021392673254013,
-0.031203703954815865,
0.11581093072891235,
-0.0744495838880539,
0.19395925104618073,
0.01719796098768711,
0.014287159778177738,
0.0916559100151062,
0.05038322135806084,
-0.05808406323194504,
-0.07368700206279755,
-0.10248131304979324,
0.010812131687998772,
0.03546109423041344,
0.010252019390463829,
-0.008802837692201138,
0.0211968794465065,
-0.11341743916273117,
-0.050869911909103394,
-0.06302189081907272,
0.0072614275850355625,
-0.01001308299601078,
-0.042155615985393524,
-0.05533592775464058,
-0.022557416930794716,
-0.020093943923711777,
0.02266426384449005,
0.14185629785060883,
-0.07527699321508408,
0.18586260080337524,
0.02357078716158867,
0.1586609035730362,
-0.11956068128347397,
-0.06724818795919418,
-0.029193658381700516,
-0.05280323326587677,
0.06468886137008667,
-0.08884575963020325,
-0.027708567678928375,
0.1332162618637085,
-0.01903904788196087,
0.04655366763472557,
0.12936700880527496,
0.02046884410083294,
0.015383756719529629,
0.034968774765729904,
-0.2578005790710449,
-0.07463036477565765,
-0.03505445644259453,
-0.012416874058544636,
0.05272092670202255,
0.05525677278637886,
0.19735674560070038,
-0.03551921248435974,
-0.08521962910890579,
0.020131373777985573,
0.02735883742570877,
-0.02776256389915943,
0.10749414563179016,
0.019579345360398293,
-0.004837906453758478,
-0.16151933372020721,
0.08257976174354553,
-0.005964108742773533,
-0.08297000825405121,
0.028665626421570778,
0.2024049311876297,
-0.12141239643096924,
-0.10309756547212601,
-0.06804922968149185,
0.07315051555633545,
-0.09220825880765915,
0.016043387353420258,
-0.005091092549264431,
-0.1521538347005844,
0.06916408240795135,
0.07598215341567993,
0.04075418785214424,
0.06513199955224991,
-0.11743064224720001,
-0.015730571001768112,
-0.04170290008187294,
-0.002195435343310237,
0.03521120920777321,
0.01863143965601921,
-0.057492829859256744,
0.15846455097198486,
-0.0676199421286583,
0.08538917452096939,
-0.0744810476899147,
-0.1058846190571785,
-0.1395980566740036,
0.04660497233271599,
-0.08038312196731567,
-0.07247276604175568,
-0.12832807004451752,
-0.052204377949237823,
-0.0067099276930093765,
-0.03388519585132599,
0.006552806124091148,
-0.06627799570560455,
-0.10922821611166,
0.01822470687329769,
-0.00743203004822135,
-0.009385870769619942,
-0.06096754968166351,
0.026706209406256676,
0.06246216222643852,
-0.039788868278265,
0.15730851888656616,
0.22509248554706573,
-0.13591648638248444,
0.11564400047063828,
-0.09797432273626328,
-0.105463907122612,
0.046008042991161346,
0.009427277371287346,
0.03594303876161575,
0.0503489226102829,
-0.03594081476330757,
0.0044484552927315235,
0.03905477747321129,
0.08074651658535004,
0.08456914126873016,
-0.06776505708694458,
0.020801106467843056,
-0.05122765153646469,
-0.14904099702835083,
-0.016655439510941505,
-0.0464773029088974,
0.06876829266548157,
-0.006725262850522995,
0.11020535975694656,
-0.0515950471162796,
0.07739507406949997,
-0.07558431476354599,
0.050614211708307266,
0.021146971732378006,
-0.14688286185264587,
-0.006612539757043123,
-0.07093682140111923,
0.042144812643527985,
-0.008834975771605968,
0.20241086184978485,
-0.03228091076016426,
0.010342049412429333,
0.033811055123806,
0.06203942745923996,
-0.01957780309021473,
0.009357001632452011,
0.2014283686876297,
0.12640917301177979,
-0.08496357500553131,
-0.02679651789367199,
0.06793134659528732,
0.07248228788375854,
0.07093550264835358,
0.10807815194129944,
-0.015352966263890266,
0.028434239327907562,
0.07829629629850388,
-0.060215238481760025,
0.07576877623796463,
-0.08603982627391815,
-0.11668483167886734,
0.05793621391057968,
0.012955795042216778,
-0.055695828050374985,
0.20305177569389343,
0.19142870604991913,
-0.026278704404830933,
0.018410727381706238,
-0.0029499190859496593,
-0.10117456316947937,
-0.15619947016239166,
-0.05423750728368759,
-0.07170962542295456,
-0.1319410353899002,
-0.004549739416688681,
-0.16646917164325714,
0.022016216069459915,
-0.01132756657898426,
0.09506805986166,
-0.06855440139770508,
-0.01345991250127554,
0.1364889293909073,
-0.1055467277765274,
0.0847758799791336,
-0.024517204612493515,
0.07877567410469055,
-0.03746940940618515,
-0.018209461122751236,
-0.10342709720134735,
0.007514837197959423,
0.01131442841142416,
0.06840907037258148,
-0.10897937417030334,
0.02432350255548954,
-0.12208317965269089,
-0.08617185056209564,
-0.026142612099647522,
0.09279687702655792,
-0.0403008833527565,
0.15116846561431885,
0.02645145356655121,
-0.06710928678512573,
-0.004313822835683823,
0.2646709978580475,
-0.08046227693557739,
-0.08319197595119476,
-0.030799202620983124,
0.2152107208967209,
0.04053696244955063,
0.06396269053220749,
0.019140036776661873,
0.038027774542570114,
-0.07184682041406631,
0.2957373559474945,
0.34401440620422363,
-0.1318037211894989,
-0.007773484103381634,
0.04225075617432594,
0.04406323283910751,
0.14687567949295044,
0.07998795062303543,
0.11360671371221542,
0.2849363386631012,
-0.09197647124528885,
0.016657205298542976,
-0.04230864346027374,
-0.01424806285649538,
-0.06908884644508362,
0.045314885675907135,
0.08216670155525208,
-0.09241747111082077,
-0.022950593382120132,
0.08125471323728561,
-0.29741767048835754,
0.10791494697332382,
-0.15600289404392242,
-0.14948409795761108,
-0.05027429759502411,
-0.008771711029112339,
0.014683255925774574,
0.019041186198592186,
0.09663030505180359,
0.025651484727859497,
-0.07275258749723434,
0.07816889137029648,
0.024486342445015907,
-0.23020237684249878,
-0.01345184724777937,
0.1456068754196167,
-0.06789913028478622,
-0.025938833132386208,
-0.021313713863492012,
0.051610056310892105,
0.05763651058077812,
0.09027529507875443,
-0.03809558227658272,
-0.0746568813920021,
-0.007141788024455309,
-0.022818787023425102,
0.01914946548640728,
0.0597183033823967,
0.06841408461332321,
-0.0920223817229271,
0.1167774423956871,
-0.07350476831197739,
0.0650370642542839,
0.037623800337314606,
-0.022277191281318665,
0.0018526542698964477,
0.013183658011257648,
-0.06512464582920074,
0.05533479526638985,
0.1295643299818039,
-0.025459708645939827,
-0.002524374984204769,
-0.028180841356515884,
-0.0767761766910553,
-0.024015206843614578,
-0.04643676429986954,
-0.09101243317127228,
-0.18130090832710266,
-0.12738600373268127,
0.041754670441150665,
-0.03240608796477318,
-0.2046082615852356,
0.0060346988029778,
-0.1128578633069992,
0.03700976446270943,
-0.14154092967510223,
0.10004086047410965,
0.07216610759496689,
0.004716616589576006,
0.006774604320526123,
0.0675399899482727,
0.045677728950977325,
0.14796748757362366,
-0.16543124616146088,
-0.04919974133372307
] |
null | null | transformers |
## Evaluation on Common Voice Maltese Test
```python
import re

import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_name = "RuudVelo/XLSR-Wav2Vec2-Maltese-1"
device = "cuda"

# Punctuation and special characters stripped from references before scoring.
chars_to_ignore_regex = '[\\,\\?\\.\\!\\-\\;\\:\\"\\“\\%\\‘\\”\\�]'

model = Wav2Vec2ForCTC.from_pretrained(model_name).to(device)
processor = Wav2Vec2Processor.from_pretrained(model_name)

ds = load_dataset("common_voice", "mt", split="test", data_dir="./cv-corpus-6.1-2020-12-11")

# Common Voice clips are 48 kHz; the model expects 16 kHz input.
resampler = torchaudio.transforms.Resample(orig_freq=48_000, new_freq=16_000)

def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    batch["speech"] = resampler.forward(speech.squeeze(0)).numpy()
    batch["sampling_rate"] = resampler.new_freq
    batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower() + " "
    return batch

ds = ds.map(map_to_array)

def map_to_pred(batch):
    features = processor(batch["speech"], sampling_rate=batch["sampling_rate"][0], padding=True, return_tensors="pt")
    input_values = features.input_values.to(device)
    attention_mask = features.attention_mask.to(device)
    with torch.no_grad():
        logits = model(input_values, attention_mask=attention_mask).logits
    pred_ids = torch.argmax(logits, dim=-1)
    batch["predicted"] = processor.batch_decode(pred_ids)
    batch["target"] = batch["sentence"]
    return batch

result = ds.map(map_to_pred, batched=True, batch_size=16, remove_columns=list(ds.features.keys()))

wer = load_metric("wer")
print(wer.compute(predictions=result["predicted"], references=result["target"]))
```
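For quick single-clip inference outside the evaluation harness above, a minimal sketch follows; the file name `sample.wav` is a placeholder and not part of the original card. The WER figure quoted below comes from the full evaluation script, not from this sketch.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_name = "RuudVelo/XLSR-Wav2Vec2-Maltese-1"
processor = Wav2Vec2Processor.from_pretrained(model_name)
model = Wav2Vec2ForCTC.from_pretrained(model_name)

# Load one clip and resample it to the 16 kHz rate the model expects.
speech, sr = torchaudio.load("sample.wav")  # placeholder path
speech = torchaudio.transforms.Resample(orig_freq=sr, new_freq=16_000)(speech).squeeze(0)

inputs = processor(speech.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
print(processor.batch_decode(torch.argmax(logits, dim=-1))[0])
```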
**Result**: 30.0 % | {"language": "mt", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "model-index": [{"name": "XLSR Wav2Vec2 Maltese by RuudVelo", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice mt", "type": "common_voice", "args": "mt"}, "metrics": [{"type": "wer", "value": 30.0, "name": "Test WER"}]}]}]} | automatic-speech-recognition | RuudVelo/XLSR-Wav2Vec2-Maltese-1 | [
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"mt",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"mt"
] | TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #mt #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
## Evaluation on Common Voice Maltese Test
Result: 30.0 % | [
"## Evaluation on Common Voice Maltese Test\n\n\n\nResult: 30.0 %"
] | [
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #mt #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"## Evaluation on Common Voice Maltese Test\n\n\n\nResult: 30.0 %"
] | [
72,
14
] | [
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #mt #license-apache-2.0 #model-index #endpoints_compatible #region-us \n## Evaluation on Common Voice Maltese Test\n\n\n\nResult: 30.0 %"
] | [
-0.2320515513420105,
0.08612678945064545,
-0.004072627518326044,
-0.06803230941295624,
0.011319282464683056,
-0.061852313578128815,
0.14265359938144684,
0.09329134970903397,
-0.021707121282815933,
0.11022014915943146,
0.03967910259962082,
0.2507733702659607,
0.04042598232626915,
-0.0029636395629495382,
-0.1525714099407196,
-0.0889599621295929,
0.06635595858097076,
0.003882785327732563,
0.04278764873743057,
0.1271056979894638,
0.14343024790287018,
-0.013222809880971909,
0.0045898123644292355,
0.05769762396812439,
-0.008464409038424492,
0.05457929149270058,
0.08822840452194214,
-0.1434004306793213,
0.11827317625284195,
0.08936475962400436,
-0.03908456489443779,
0.07216854393482208,
-0.034631166607141495,
-0.12551060318946838,
0.006527394521981478,
0.017755139619112015,
0.08729051053524017,
-0.019703064113855362,
0.06504631787538528,
0.006673348136246204,
0.043812889605760574,
0.04989242926239967,
-0.043740998953580856,
0.006542698480188847,
0.05314604565501213,
-0.09153985977172852,
-0.10772664844989777,
0.007685563527047634,
0.04037628322839737,
0.08378303050994873,
-0.05458087846636772,
0.23276177048683167,
-0.1293502002954483,
0.09846160560846329,
0.0752825066447258,
-0.26171863079071045,
-0.013722996227443218,
-0.05315787345170975,
0.07760380208492279,
0.10548365116119385,
0.03808174282312393,
0.08855702728033066,
0.027395401149988174,
0.021536486223340034,
-0.21702681481838226,
-0.02381438948214054,
0.011341550387442112,
-0.03479480743408203,
-0.1261935532093048,
-0.021216625347733498,
0.2520301043987274,
0.023471780121326447,
-0.1042114868760109,
-0.09949962794780731,
0.07617446035146713,
0.16467200219631195,
-0.005326725542545319,
-0.01595238782465458,
-0.05025353282690048,
0.0028981470968574286,
-0.005795339588075876,
0.03501821309328079,
-0.03777962923049927,
-0.10815676301717758,
-0.10930477827787399,
0.29326874017715454,
0.06347046792507172,
0.012850437313318253,
0.004378225188702345,
-0.03874015435576439,
-0.0523555651307106,
-0.047522347420454025,
-0.051861729472875595,
0.060895953327417374,
0.05546429380774498,
-0.02946625091135502,
-0.003161159111186862,
-0.11398323625326157,
0.15090642869472504,
-0.018575826659798622,
0.08848266303539276,
-0.02150025963783264,
-0.13489247858524323,
0.13693898916244507,
-0.04113610461354256,
0.15184444189071655,
-0.08586800843477249,
-0.08978525549173355,
0.02411324717104435,
0.015397564508020878,
0.03854602575302124,
0.0052140201441943645,
-0.03210100904107094,
-0.049898047000169754,
0.01714668609201908,
0.12782920897006989,
0.05826644226908684,
-0.031030435115098953,
0.04019060730934143,
-0.013895640149712563,
-0.0041376748122274876,
-0.11090680211782455,
-0.00700360257178545,
0.04637204855680466,
-0.01594962365925312,
0.1774904727935791,
-0.1307334303855896,
0.012080308981239796,
-0.13168416917324066,
0.054939720779657364,
0.0875907763838768,
-0.0038442001678049564,
0.1052221730351448,
-0.11733130365610123,
0.05862647294998169,
0.03745771944522858,
-0.0032203304581344128,
-0.06719908118247986,
-0.008073564618825912,
-0.08864708989858627,
-0.09585339576005936,
-0.016466107219457626,
-0.12161564081907272,
-0.028786903247237206,
-0.03522918000817299,
-0.009920656681060791,
-0.07497341185808182,
0.08545573800802231,
-0.07139145582914352,
0.16302144527435303,
-0.0003760743420571089,
-0.039657507091760635,
-0.03624674305319786,
0.046406712383031845,
-0.03215181455016136,
-0.07089170813560486,
-0.021711299195885658,
0.09400179237127304,
-0.1072641909122467,
-0.029100265353918076,
-0.12894272804260254,
-0.08265413343906403,
-0.038189847022295,
0.04170207679271698,
0.04007193073630333,
0.0695076510310173,
-0.21376626193523407,
-0.08442188054323196,
0.02647378109395504,
-0.12754368782043457,
-0.018305707722902298,
0.24871529638767242,
0.09487942606210709,
-0.018028028309345245,
0.15467451512813568,
0.21641002595424652,
0.001972741447389126,
-0.19174529612064362,
-0.006383721716701984,
0.032916951924562454,
-0.023130962625145912,
-0.07286230474710464,
0.13414724171161652,
-0.1279667466878891,
-0.0323747880756855,
0.008143394254148006,
-0.11540314555168152,
0.011094375513494015,
-0.030345307663083076,
-0.02976270765066147,
0.0012331303441897035,
-0.08337410539388657,
0.013772278092801571,
0.0009083065669983625,
-0.004972072318196297,
-0.15033690631389618,
-0.06391698867082596,
-0.12077206373214722,
0.10458271205425262,
-0.08449331670999527,
0.03733520209789276,
-0.1247497946023941,
0.2900125980377197,
-0.12394656985998154,
-0.05308714881539345,
-0.08037685602903366,
0.25693079829216003,
-0.02747141756117344,
0.04895595461130142,
0.12096430361270905,
-0.05478235334157944,
0.012918285094201565,
-0.03872450441122055,
-0.017350520938634872,
0.03389787673950195,
0.15175947546958923,
0.005429903510957956,
-0.07445567846298218,
-0.18796628713607788,
0.08811113238334656,
-0.008163725957274437,
-0.04230731353163719,
-0.05091503635048866,
-0.053488120436668396,
0.08719736337661743,
0.006761546246707439,
-0.08040701597929001,
0.022308556362986565,
0.019057128578424454,
0.08096768707036972,
-0.0012057387502864003,
0.014642050489783287,
0.009829080663621426,
0.03298341482877731,
-0.04793708771467209,
0.24895738065242767,
-0.1296592652797699,
0.1639207899570465,
0.135789155960083,
-0.102588951587677,
0.037288907915353775,
0.07390893250703812,
-0.016463952139019966,
0.04262444004416466,
-0.07695365697145462,
-0.024995528161525726,
0.16377204656600952,
0.0009088668739423156,
0.09243663400411606,
-0.12332617491483688,
0.012586594559252262,
0.028092047199606895,
-0.04503186419606209,
-0.04932938516139984,
0.1341264396905899,
0.033980850130319595,
-0.10833585262298584,
0.07394397258758545,
0.045887067914009094,
-0.10158559679985046,
0.14858929812908173,
-0.0754651427268982,
-0.10881096869707108,
0.07167525589466095,
0.05475490540266037,
-0.008105914108455181,
0.07056314498186111,
-0.18031245470046997,
-0.011205378919839859,
0.04307730123400688,
0.038467831909656525,
0.0947149470448494,
-0.12899883091449738,
0.005815266165882349,
0.006965833716094494,
-0.13171935081481934,
-0.2284664511680603,
0.10273274034261703,
-0.020707935094833374,
0.08665122091770172,
-0.10549909621477127,
-0.15390396118164062,
0.0004228009202051908,
-0.03830306977033615,
-0.1415403187274933,
0.034575507044792175,
-0.028221437707543373,
-0.07562746107578278,
-0.1104557067155838,
-0.02016204036772251,
-0.023007679730653763,
-0.04125717282295227,
0.08874634653329849,
-0.07924014329910278,
0.030283361673355103,
-0.02560124360024929,
0.05811895430088043,
-0.07535839825868607,
0.030556514859199524,
-0.07770400494337082,
0.021775225177407265,
-0.012121651321649551,
-0.10996672511100769,
-0.019848033785820007,
-0.08126591891050339,
0.03610040247440338,
-0.06928880512714386,
-0.09075756371021271,
0.017295876517891884,
0.2083476334810257,
0.016605356708168983,
0.04364030808210373,
-0.04697064310312271,
0.23202228546142578,
-0.134596586227417,
-0.08028148114681244,
0.1620803028345108,
-0.00807567685842514,
-0.03329026326537132,
0.15144504606723785,
0.019526012241840363,
-0.03772386908531189,
-0.09118258208036423,
-0.03366938605904579,
-0.02882413938641548,
-0.2997899651527405,
-0.13500288128852844,
-0.07437587529420853,
-0.05171184614300728,
-0.11186668276786804,
0.044579483568668365,
0.06393023580312729,
-0.035192545503377914,
-0.03256404027342796,
-0.2124667912721634,
0.05848701298236847,
-0.028146982192993164,
0.19942434132099152,
-0.0024888492189347744,
0.1169993057847023,
-0.04447255656123161,
0.006448063068091869,
0.06959566473960876,
0.03089323081076145,
0.012273009866476059,
0.08031992614269257,
-0.03598528355360031,
0.04592905938625336,
0.09470784664154053,
0.07781136780977249,
0.012553224340081215,
0.009529206901788712,
0.011732609942555428,
-0.0018465942703187466,
-0.0978597104549408,
0.007837026380002499,
-0.017699243500828743,
0.27958792448043823,
-0.04997076839208603,
-0.05823185294866562,
-0.13439424335956573,
0.05591917410492897,
0.14340002834796906,
0.11141981184482574,
-0.06511306762695312,
-0.02644961327314377,
-0.041741035878658295,
-0.11982069164514542,
0.04211999848484993,
0.03626381978392601,
-0.007129126228392124,
-0.04048619419336319,
0.046294793486595154,
0.06186198070645332,
0.10015490651130676,
-0.015075862407684326,
0.020459681749343872,
-0.12255949527025223,
-0.015012157149612904,
-0.03421315178275108,
0.06969188898801804,
-0.2111051082611084,
0.31139814853668213,
0.06083663925528526,
0.2011295109987259,
-0.04220017045736313,
0.002749874722212553,
0.09139663726091385,
0.1160593032836914,
0.14635664224624634,
-0.002638650592416525,
-0.11512552946805954,
-0.0951085165143013,
-0.12159042060375214,
0.04411054775118828,
0.030490417033433914,
0.14674915373325348,
-0.027764473110437393,
0.028657007962465286,
0.016568070277571678,
-0.00940295122563839,
-0.05602386221289635,
-0.1269722878932953,
-0.007543377578258514,
0.026774005964398384,
0.23676486313343048,
-0.06380011886358261,
-0.023229392245411873,
-0.15578575432300568,
-0.24236950278282166,
0.033461932092905045,
-0.10479628294706345,
-0.04092762619256973,
-0.03344235569238663,
-0.07409278303384781,
0.12544693052768707,
-0.05166908726096153,
-0.11481405049562454,
0.01983766071498394,
-0.01135520450770855,
-0.013174702413380146,
0.003983341157436371,
-0.0006755345384590328,
-0.08610967546701431,
-0.04506967216730118,
-0.03123944252729416,
0.30928418040275574,
-0.02999313734471798,
0.10198868811130524,
0.09133796393871307,
0.009294943884015083,
0.031116485595703125,
-0.021723542362451553,
0.06306733191013336,
-0.09287507086992264,
-0.11936258524656296,
0.12462291866540909,
0.044336020946502686,
-0.09591683000326157,
-0.21310463547706604,
-0.1109127551317215,
0.19429627060890198,
0.1112770065665245,
-0.004701714497059584,
0.01797882467508316,
0.21241608262062073,
-0.06360059976577759,
-0.18386158347129822,
-0.028609639033675194,
0.00031419264269061387,
0.0803258866071701,
-0.10593777894973755,
0.00366391334682703,
0.04687371104955673,
0.019509557634592056,
-0.06683129072189331,
-0.05024342983961105,
-0.3151533305644989,
-0.10269689559936523,
0.20954382419586182,
-0.08107639104127884,
0.1290920525789261,
-0.05052778869867325,
-0.09100434184074402,
0.017146140336990356,
0.09140670299530029,
-0.14072078466415405,
0.00975935161113739,
0.08641254901885986,
0.07387398928403854,
0.050132133066654205,
0.04146593436598778,
-0.00474196532741189,
0.12693285942077637,
0.05503841117024422,
-0.10578710585832596,
0.031220732256770134,
0.05343424528837204,
-0.07446938753128052,
0.08139496296644211,
0.05441699177026749,
-0.129564106464386,
0.008853144943714142,
-0.04794778302311897,
-0.05908249318599701,
-0.08735347539186478,
0.06792069971561432,
0.03499450534582138,
0.03951326385140419,
-0.02626471221446991,
-0.046431783586740494,
-0.020244073122739792,
0.02397758699953556,
-0.0831570252776146,
-0.17183706164360046,
0.0641959011554718,
0.03226347640156746,
0.20433409512043,
-0.18383102118968964,
-0.17950189113616943,
0.039150286465883255,
-0.07995463162660599,
0.05699339509010315,
0.05757889151573181,
0.025771673768758774,
0.09690307825803757,
0.05803941190242767,
0.05058157816529274,
0.026254283264279366,
-0.09348853677511215,
0.043362684547901154,
0.018383145332336426,
0.05215781554579735,
-0.050457753241062164,
-0.0026092652697116137,
-0.07053713500499725,
0.04566867649555206,
0.016640499234199524,
0.10025276243686676,
-0.0037154110614210367,
0.030383607372641563,
-0.05065631493926048,
-0.06798114627599716,
-0.13344770669937134,
0.38111352920532227,
0.0453454963862896,
0.0047334060072898865,
-0.1206701248884201,
0.019579298794269562,
-0.06340819597244263,
-0.04750071093440056,
-0.021603163331747055,
-0.04246818646788597,
0.020318208262324333,
-0.056550003588199615,
-0.008812104351818562,
-0.06440325081348419,
0.08006825298070908,
-0.15914040803909302,
-0.11332187801599503,
-0.15516620874404907,
0.05822773650288582,
0.14808763563632965,
0.10321581363677979,
0.037871528416872025,
-0.13130813837051392,
-0.12784524261951447,
-0.04453461617231369,
0.049651987850666046,
0.08307565748691559,
-0.024256428703665733,
-0.09756737947463989,
0.167904794216156,
0.01397603377699852,
0.019205590710043907,
-0.030919833108782768,
-0.056663475930690765,
0.020180311053991318,
0.10934162139892578,
0.011269660666584969,
0.017790259793400764,
-0.021768400445580482,
0.0035339249297976494,
0.03386319428682327,
-0.1348332315683365,
-0.08785292506217957,
0.04208463802933693,
-0.11479676514863968,
0.061746180057525635,
0.021994395181536674,
0.0982113927602768,
-0.04208314046263695,
0.036918554455041885,
0.043893001973629,
-0.07222377508878708,
0.05279630422592163,
0.11143694818019867,
-0.1331181675195694,
0.051962316036224365,
-0.1993037611246109,
-0.11508283764123917,
0.05504133924841881,
0.06317027658224106,
-0.05135856196284294,
-0.12293936312198639,
0.07608258724212646,
0.20787104964256287,
0.05028799921274185,
0.07122517377138138,
0.08498292416334152,
-0.01186920516192913,
-0.04453866928815842,
-0.07096396386623383,
-0.07501498609781265,
0.04451790824532509,
0.00034053390845656395,
0.15763980150222778,
0.09368417412042618,
0.12223634868860245,
-0.06276670098304749,
0.002883046166971326,
-0.05057574436068535,
0.0668821707367897,
-0.04289982467889786,
-0.10023455321788788,
-0.04844963923096657,
0.005227769259363413,
0.0873134508728981,
0.0016037599416449666,
0.17731238901615143,
0.04921575263142586,
0.009711840189993382,
0.06608978658914566,
0.06548163294792175,
0.013780983164906502,
0.002561128232628107,
0.2764810025691986,
0.03371647372841835,
0.051542554050683975,
-0.039459388703107834,
-0.11041703820228577,
-0.004584180656820536,
0.11592092365026474,
-0.024713141843676567,
0.22097612917423248,
0.22558316588401794,
0.10046259313821793,
0.13545870780944824,
-0.01413000002503395,
-0.05977823957800865,
-0.019437380135059357,
-0.009081227704882622,
0.08294486254453659,
0.03697887435555458,
0.239563450217247,
0.11991515755653381,
-0.03212881088256836,
0.033877648413181305,
-0.02091207541525364,
-0.055768731981515884,
-0.22224467992782593,
-0.051848359405994415,
-0.09746339917182922,
-0.09410391002893448,
0.05782046914100647,
-0.01653715968132019,
0.09350689500570297,
0.018371520563960075,
0.12000520527362823,
-0.025678742676973343,
-0.023640155792236328,
-0.13560032844543457,
-0.07172337174415588,
0.16979163885116577,
-0.14426511526107788,
0.02947363629937172,
-0.10410002619028091,
-0.020488938316702843,
0.11521179229021072,
-0.024661453440785408,
0.08386866748332977,
-0.014291984960436821,
-0.08318640291690826,
-0.01348093617707491,
-0.1902768611907959,
-0.019611654803156853,
-0.005469525698572397,
-0.03969557210803032,
0.14173562824726105,
0.22816501557826996,
0.059113211929798126,
-0.034794505685567856,
0.056925445795059204,
0.07867607474327087,
-0.0034755405504256487,
-0.2156049907207489,
-0.09726346284151077,
0.06263609975576401,
0.012188761495053768,
0.05992991104722023,
-0.030004233121871948,
-0.02975407987833023,
0.05324701592326164,
0.20776160061359406,
0.2057117223739624,
0.02649664878845215,
0.0004938271595165133,
-0.014745308086276054,
-0.015192129649221897,
-0.03570516034960747,
-0.030674150213599205,
0.08290541172027588,
0.18702076375484467,
0.0030888617038726807,
-0.016760332509875298,
-0.13146381080150604,
-0.002043383428826928,
-0.05297012999653816,
0.17146417498588562,
-0.005537266843020916,
-0.1559789776802063,
0.004904597532004118,
0.2222720831632614,
0.01579371839761734,
-0.040340520441532135,
-0.1697850376367569,
-0.11859644949436188,
-0.13360534608364105,
0.03755144774913788,
0.030470648780465126,
0.11658771336078644,
-0.0438353531062603,
-0.09539328515529633,
-0.047136835753917694,
-0.11372792720794678,
-0.04558371379971504,
-0.08533927798271179,
-0.12473954260349274,
0.03769027441740036,
-0.005212385207414627,
0.07997104525566101,
0.054114930331707,
0.2052658498287201,
0.05451113358139992,
0.1084371954202652,
0.05485318973660469,
0.1570628583431244,
0.0008994698291644454,
-0.10818424075841904,
0.14027895033359528,
0.12500211596488953,
0.006295133847743273,
0.11603879183530807,
0.06658854335546494,
-0.0524199903011322,
0.10005715489387512,
-0.17008668184280396,
-0.12993839383125305,
-0.14443500339984894,
0.05192918702960014,
-0.0881359875202179,
-0.019255589693784714,
-0.023035531863570213,
-0.001954791834577918,
0.03481793403625488,
-0.04441685229539871,
-0.023071007803082466,
0.06141304597258568,
-0.05963944271206856,
-0.009313042275607586,
-0.25812217593193054,
-0.021978728473186493,
-0.2121935784816742,
-0.03435823321342468,
-0.1108115166425705,
-0.028684351593255997,
-0.04212620109319687,
-0.039730243384838104,
-0.00557599077001214,
0.01108343992382288,
0.11515737324953079,
-0.04184317588806152,
0.009803752414882183,
-0.15658776462078094,
0.03403317555785179,
0.06882911920547485,
-0.11543615162372589,
0.00255312561057508
] |
null | null | transformers |
# wav2vec2-large-xls-r-1b-cv8-mt-lm
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the common_voice 8 dataset.
It achieves the following results on the test set:
- Loss: 0.2210
- Wer: 0.1974
Note that the test results above come from the original model without the language model (LM), which can be found at https://huggingface.co/RuudVelo/wav2vec2-large-xls-r-1b-cv8-mt. The results with the LM applied are shown on the right side of this model card.
## Model description
This is the RuudVelo/wav2vec2-large-xls-r-1b-cv8-mt acoustic model combined with a KenLM 3-gram language model for decoding.
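A minimal sketch of how such an acoustic-model-plus-KenLM pairing is typically assembled with `pyctcdecode`; this is illustrative rather than the author's exact script, and the ARPA file name `3gram_mt.arpa` is an assumption.

```python
from pyctcdecode import build_ctcdecoder
from transformers import Wav2Vec2Processor, Wav2Vec2ProcessorWithLM

base = "RuudVelo/wav2vec2-large-xls-r-1b-cv8-mt"
processor = Wav2Vec2Processor.from_pretrained(base)

# Sort the CTC vocabulary by token id so label order matches the logit columns.
vocab = processor.tokenizer.get_vocab()
labels = [tok for tok, _ in sorted(vocab.items(), key=lambda kv: kv[1])]

# Wrap the KenLM 3-gram (assumed file name) in a CTC beam-search decoder.
decoder = build_ctcdecoder(labels=labels, kenlm_model_path="3gram_mt.arpa")

processor_with_lm = Wav2Vec2ProcessorWithLM(
    feature_extractor=processor.feature_extractor,
    tokenizer=processor.tokenizer,
    decoder=decoder,
)
# processor_with_lm.batch_decode(logits.numpy()) then applies the LM during decoding.
```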
## Intended uses & limitations
More information needed
## Training and evaluation data
The Common Voice 8 Maltese (mt) dataset was used for training and evaluating the model; a typical loading sketch is shown below.
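The exact data setup is an assumption; Common Voice 8 requires accepting the dataset terms on the Hugging Face Hub, hence `use_auth_token=True`.

```python
from datasets import Audio, load_dataset

# Assumes the Common Voice 8 terms have been accepted on the Hugging Face Hub.
train_ds = load_dataset(
    "mozilla-foundation/common_voice_8_0", "mt", split="train+validation", use_auth_token=True
)
test_ds = load_dataset(
    "mozilla-foundation/common_voice_8_0", "mt", split="test", use_auth_token=True
)

# Resample the audio column to the 16 kHz rate the XLS-R model expects.
train_ds = train_ds.cast_column("audio", Audio(sampling_rate=16_000))
test_ds = test_ds.cast_column("audio", Audio(sampling_rate=16_000))
```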
## Training procedure
### Training hyperparameters
The following config and hyperparameters were used during training:
model = Wav2Vec2ForCTC.from_pretrained(
    "facebook/wav2vec2-xls-r-1b",
    attention_dropout=0.05,
    hidden_dropout=0.05,
    feat_proj_dropout=0.05,
    mask_time_prob=0.55,
    mask_feature_prob=0.10,
    layerdrop=0.05,
    ctc_zero_infinity=True,
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer),
)

from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir=repo_name,
    group_by_length=True,
    per_device_train_batch_size=32,
    gradient_accumulation_steps=2,
    evaluation_strategy="steps",
    num_train_epochs=50,
    gradient_checkpointing=True,
    fp16=True,
    save_steps=400,
    eval_steps=400,
    logging_steps=400,
    learning_rate=5.5e-05,
    warmup_steps=500,
    save_total_limit=2,
    push_to_hub=True,
    report_to="tensorboard",
)
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0 | {"language": ["mt"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer", "mt", "robust-speech-event", "model_for_talk", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_8_0"], "model-index": [{"name": "wav2vec2-large-xls-r-1b-cv8-mt-lm", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "mt"}, "metrics": [{"type": "wer", "value": 15.88, "name": "Test WER"}, {"type": "cer", "value": 3.65, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "mt"}, "metrics": [{"type": "wer", "name": "Test WER"}, {"type": "cer", "name": "Test CER"}]}]}]} | automatic-speech-recognition | RuudVelo/wav2vec2-large-xls-r-1b-cv8-mt-lm | [
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"mt",
"robust-speech-event",
"model_for_talk",
"hf-asr-leaderboard",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"mt"
] | TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #mt #robust-speech-event #model_for_talk #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# wav2vec2-large-xls-r-1b-cv8-mt-lm
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the common_voice 8 dataset.
It achieves the following results on the test set:
- Loss: 0.2210
- Wer: 0.1974
Note that the above test results come from the original model without LM (language model) which can be found at URL The results with the LM model can be found on the right side of this model card.
## Model description
Model RuudVelo/wav2vec2-large-xls-r-1b-cv8-mt which has been improved with a KenLM 3-gram.
## Intended uses & limitations
More information needed
## Training and evaluation data
Common Voice 8 mt dataset has been used for the model
## Training procedure
### Training hyperparameters
The following config and hyperparameters were used during training:
model = Wav2Vec2ForCTC.from_pretrained(
"facebook/wav2vec2-xls-r-1b",
attention_dropout=0.05,
hidden_dropout=0.05,
feat_proj_dropout=0.05,
mask_time_prob=0.55,
mask_feature_prob=0.10,
layerdrop=0.05,
ctc_zero_infinity=True,
ctc_loss_reduction="mean",
pad_token_id=processor.tokenizer.pad_token_id,
vocab_size=len(processor.tokenizer),
)
from transformers import TrainingArguments
training_args = TrainingArguments(
output_dir=repo_name,
group_by_length=True,
per_device_train_batch_size=32,
gradient_accumulation_steps=2,
evaluation_strategy="steps",
num_train_epochs=50,
gradient_checkpointing=True,
fp16=True,
save_steps=400,
eval_steps=400,
logging_steps=400,
learning_rate=5.5e-05,
warmup_steps=500,
save_total_limit=2,
push_to_hub=True,
report_to="tensorboard")
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0 | [
"# wav2vec2-large-xls-r-1b-cv8-mt-lm\n\nThis model is a fine-tuned version of wav2vec2-large-xls-r-1b-cv8-mt-lm on the common_voice 8 dataset.\nIt achieves the following results on the test set:\n- Loss: 0.2210\n- Wer: 0.1974\n\nNote that the above test results come from the original model without LM (language model) which can be found at URL The results with the LM model can be found on the right side of this model card.",
"## Model description\nModel RuudVelo/wav2vec2-large-xls-r-1b-cv8-mt which has been improved with a KenLM 3-gram.",
"## Intended uses & limitations\nMore information needed",
"## Training and evaluation data\nCommon Voice 8 mt dataset has been used for the model",
"## Training procedure",
"### Training hyperparameters\nThe following config and hyperparameters were used during training:\nmodel = Wav2Vec2ForCTC.from_pretrained(\n \"facebook/wav2vec2-xls-r-1b\", \n attention_dropout=0.05,\n hidden_dropout=0.05,\n feat_proj_dropout=0.05,\n mask_time_prob=0.55,\n mask_feature_prob=0.10,\n layerdrop=0.05,\n ctc_zero_infinity=True,\n ctc_loss_reduction=\"mean\", \n pad_token_id=processor.tokenizer.pad_token_id,\n vocab_size=len(processor.tokenizer),\n)\nfrom transformers import TrainingArguments\n\ntraining_args = TrainingArguments(\n output_dir=repo_name,\n group_by_length=True,\n per_device_train_batch_size=32,\n gradient_accumulation_steps=2,\n evaluation_strategy=\"steps\",\n num_train_epochs=50,\n gradient_checkpointing=True,\n fp16=True,\n save_steps=400,\n eval_steps=400,\n logging_steps=400,\n learning_rate=5.5e-05, \n warmup_steps=500,\n save_total_limit=2,\n push_to_hub=True, \n report_to=\"tensorboard\")",
"### Framework versions\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #mt #robust-speech-event #model_for_talk #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# wav2vec2-large-xls-r-1b-cv8-mt-lm\n\nThis model is a fine-tuned version of wav2vec2-large-xls-r-1b-cv8-mt-lm on the common_voice 8 dataset.\nIt achieves the following results on the test set:\n- Loss: 0.2210\n- Wer: 0.1974\n\nNote that the above test results come from the original model without LM (language model) which can be found at URL The results with the LM model can be found on the right side of this model card.",
"## Model description\nModel RuudVelo/wav2vec2-large-xls-r-1b-cv8-mt which has been improved with a KenLM 3-gram.",
"## Intended uses & limitations\nMore information needed",
"## Training and evaluation data\nCommon Voice 8 mt dataset has been used for the model",
"## Training procedure",
"### Training hyperparameters\nThe following config and hyperparameters were used during training:\nmodel = Wav2Vec2ForCTC.from_pretrained(\n \"facebook/wav2vec2-xls-r-1b\", \n attention_dropout=0.05,\n hidden_dropout=0.05,\n feat_proj_dropout=0.05,\n mask_time_prob=0.55,\n mask_feature_prob=0.10,\n layerdrop=0.05,\n ctc_zero_infinity=True,\n ctc_loss_reduction=\"mean\", \n pad_token_id=processor.tokenizer.pad_token_id,\n vocab_size=len(processor.tokenizer),\n)\nfrom transformers import TrainingArguments\n\ntraining_args = TrainingArguments(\n output_dir=repo_name,\n group_by_length=True,\n per_device_train_batch_size=32,\n gradient_accumulation_steps=2,\n evaluation_strategy=\"steps\",\n num_train_epochs=50,\n gradient_checkpointing=True,\n fp16=True,\n save_steps=400,\n eval_steps=400,\n logging_steps=400,\n learning_rate=5.5e-05, \n warmup_steps=500,\n save_total_limit=2,\n push_to_hub=True, \n report_to=\"tensorboard\")",
"### Framework versions\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] | [
122,
133,
42,
12,
18,
3,
334,
40
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #mt #robust-speech-event #model_for_talk #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# wav2vec2-large-xls-r-1b-cv8-mt-lm\n\nThis model is a fine-tuned version of wav2vec2-large-xls-r-1b-cv8-mt-lm on the common_voice 8 dataset.\nIt achieves the following results on the test set:\n- Loss: 0.2210\n- Wer: 0.1974\n\nNote that the above test results come from the original model without LM (language model) which can be found at URL The results with the LM model can be found on the right side of this model card.## Model description\nModel RuudVelo/wav2vec2-large-xls-r-1b-cv8-mt which has been improved with a KenLM 3-gram.## Intended uses & limitations\nMore information needed## Training and evaluation data\nCommon Voice 8 mt dataset has been used for the model## Training procedure"
] | [
-0.10338649153709412,
0.15464036166667938,
-0.003087265184149146,
0.0027615679427981377,
0.04121048003435135,
0.014731991104781628,
0.09154680371284485,
0.16499029099941254,
-0.0013256368692964315,
0.060483887791633606,
0.07770336419343948,
0.10730844736099243,
0.04069172963500023,
0.00881914235651493,
0.05438815802335739,
-0.1891939640045166,
0.042161956429481506,
-0.03724920004606247,
0.007253183517605066,
0.0915936604142189,
0.0969047099351883,
-0.08378545194864273,
0.06383079290390015,
0.03615931048989296,
-0.021981678903102875,
0.03349575400352478,
-0.006452750880271196,
-0.03289927542209625,
0.0984029471874237,
0.0402928963303566,
0.007311217021197081,
0.06866870820522308,
0.07615426927804947,
-0.23236693441867828,
0.008975335396826267,
0.04423978924751282,
0.058039914816617966,
0.06426861137151718,
0.05207732319831848,
-0.042940106242895126,
0.12336008995771408,
-0.0038322587497532368,
0.023520207032561302,
0.07873307913541794,
-0.043853987008333206,
-0.1254619061946869,
-0.11310695856809616,
0.0687553733587265,
0.04451639577746391,
0.11415016651153564,
-0.04104974865913391,
0.12191099673509598,
-0.015179620124399662,
0.05277262255549431,
0.2604486346244812,
-0.2898055613040924,
0.014037935994565487,
0.183170884847641,
0.052585501223802567,
0.0085084717720747,
-0.056593358516693115,
0.07906179130077362,
0.04256097227334976,
-0.009966311044991016,
-0.03626500070095062,
-0.017331348732113838,
0.055324386805295944,
0.06724904477596283,
-0.14294588565826416,
-0.008118453435599804,
0.14839255809783936,
0.01608019322156906,
-0.07977614551782608,
-0.11286409199237823,
0.012928628362715244,
-0.05904817208647728,
0.011955145746469498,
-0.0583767406642437,
0.012574340216815472,
-0.009093869477510452,
-0.037886057049036026,
-0.15518984198570251,
-0.058760304003953934,
-0.096592478454113,
-0.0038888531271368265,
0.1309337317943573,
0.044622521847486496,
0.009539719671010971,
-0.01232654694467783,
0.11327355355024338,
-0.1494496464729309,
-0.06070441007614136,
-0.09618662297725677,
-0.05715138837695122,
-0.04203663766384125,
0.019946295768022537,
-0.06445527821779251,
-0.11659497767686844,
0.03859783336520195,
0.06695037335157394,
0.049801621586084366,
0.035673197358846664,
-0.030315035954117775,
0.017792360857129097,
-0.03518850728869438,
0.17833463847637177,
-0.09455382823944092,
-0.002128130290657282,
0.015449302271008492,
0.1080283671617508,
-0.03393370658159256,
-0.023432297632098198,
-0.06440208852291107,
-0.02118750847876072,
0.006333811674267054,
0.09646371752023697,
-0.01436438038945198,
0.04271246865391731,
-0.025497179478406906,
-0.04055143892765045,
-0.006941198371350765,
-0.16196651756763458,
-0.01790929213166237,
0.003942054696381092,
-0.062366124242544174,
0.06116178631782532,
0.0980808436870575,
-0.008457394316792488,
-0.04445571452379227,
-0.037685826420784,
-0.04083035886287689,
0.0005339766503311694,
-0.056112680584192276,
-0.06954554468393326,
0.0007278394768945873,
-0.019693991169333458,
-0.05409354716539383,
-0.06657368689775467,
-0.1400354653596878,
-0.029907649382948875,
0.009049231186509132,
-0.0027704236563295126,
0.036254145205020905,
-0.045892324298620224,
-0.021027013659477234,
-0.012085503898561,
0.014366725459694862,
0.022608725354075432,
-0.00723577244207263,
0.037637531757354736,
0.02196124941110611,
0.042082544416189194,
0.019718313589692116,
0.049372248351573944,
-0.03810594603419304,
0.03855349123477936,
-0.09050561487674713,
0.08503568172454834,
-0.12695938348770142,
-0.09888331592082977,
-0.11221252381801605,
-0.053997885435819626,
-0.052822038531303406,
0.03244081884622574,
0.07669435441493988,
0.12738341093063354,
-0.17685171961784363,
-0.05973498895764351,
0.12481104582548141,
-0.09691279381513596,
-0.04647201672196388,
0.03020910732448101,
-0.0017655311385169625,
0.029447786509990692,
0.05690973997116089,
0.0978764221072197,
0.1539737433195114,
-0.20719313621520996,
-0.07701556384563446,
-0.0175813939422369,
0.042742229998111725,
0.08318563550710678,
0.08068609237670898,
-0.10567066073417664,
-0.06947311758995056,
0.058594658970832825,
-0.13465654850006104,
-0.06288738548755646,
-0.06992331147193909,
-0.062020443379879,
-0.07093290984630585,
-0.05617453157901764,
0.051003411412239075,
-0.04359085485339165,
0.005863636266440153,
-0.014768874272704124,
-0.12034095823764801,
0.03892006352543831,
0.17473837733268738,
-0.06569399684667587,
0.026753580197691917,
-0.15973520278930664,
0.1006263792514801,
-0.11782878637313843,
-0.014681139029562473,
-0.17553067207336426,
-0.015313391573727131,
0.0072286296635866165,
-0.11226630210876465,
0.09045615792274475,
0.0891721248626709,
0.04725410416722298,
0.033725008368492126,
-0.023690223693847656,
0.006235557142645121,
-0.08421751111745834,
-0.01749740168452263,
-0.029252002015709877,
-0.12638284265995026,
-0.0760379210114479,
-0.014467415399849415,
0.10801135748624802,
-0.15974539518356323,
-0.008376180194318295,
0.01169738918542862,
0.12353330850601196,
0.02050873264670372,
-0.06714238226413727,
0.042306363582611084,
-0.005537179298698902,
-0.019438758492469788,
-0.025362875312566757,
-0.008104762993752956,
-0.014958102256059647,
-0.1285446286201477,
0.10679050534963608,
-0.14062446355819702,
-0.07340117543935776,
0.10651389509439468,
0.05180370807647705,
-0.02394220046699047,
0.060497961938381195,
-0.05665396898984909,
0.017196856439113617,
-0.060291144996881485,
-0.06445864588022232,
0.14560723304748535,
0.04152040183544159,
0.12041974812746048,
-0.07310716062784195,
-0.054258041083812714,
0.0008057474624365568,
-0.0163185466080904,
-0.05157064273953438,
0.05639750510454178,
0.08577094972133636,
-0.00567069835960865,
-0.006336733233183622,
0.051087409257888794,
-0.010432537645101547,
0.18174679577350616,
0.0035145620349794626,
-0.10007358342409134,
-0.07446295768022537,
-0.03236435726284981,
-0.006607301067560911,
0.11806343495845795,
-0.08891349285840988,
0.006303366273641586,
0.05122527852654457,
0.018436240032315254,
0.08570177108049393,
-0.13385821878910065,
0.01657055877149105,
0.06107579916715622,
-0.11105004698038101,
0.0035436125472187996,
0.09691457450389862,
-0.010315933264791965,
0.07094601541757584,
-0.02970951609313488,
0.015990516170859337,
-0.05912870913743973,
-0.04113112762570381,
-0.11125176399946213,
0.10156574100255966,
-0.1047089472413063,
-0.19016654789447784,
-0.17594818770885468,
0.04255883768200874,
-0.12219811975955963,
-0.025459492579102516,
0.012419342063367367,
-0.0910772755742073,
-0.07454291731119156,
-0.11848090589046478,
0.09546010196208954,
-0.054354555904865265,
0.016558347269892693,
0.02647569216787815,
-0.004505532793700695,
0.09466611593961716,
-0.1698758453130722,
-0.009785149246454239,
-0.02629736065864563,
-0.0926714539527893,
-0.051221366971731186,
0.006581861525774002,
0.04927138611674309,
0.14009304344654083,
0.022988956421613693,
0.05200270935893059,
0.013472585007548332,
0.21138077974319458,
-0.04501495510339737,
-0.03790866211056709,
0.16441954672336578,
0.007324902806431055,
0.0074196611531078815,
0.016110507771372795,
0.012418745085597038,
-0.09086193889379501,
0.0038759016897529364,
0.10644149780273438,
-0.03083738125860691,
-0.27713462710380554,
-0.0506853349506855,
-0.014966992661356926,
-0.05761012062430382,
0.004715193063020706,
0.0433463416993618,
0.012893019244074821,
0.026385445147752762,
-0.009391960687935352,
-0.056791599839925766,
0.020565763115882874,
0.032134655863046646,
0.0830981433391571,
-0.05066531524062157,
0.10254127532243729,
-0.06287101656198502,
-0.002296230522915721,
0.06224602833390236,
0.03856044262647629,
0.15399277210235596,
0.008130066096782684,
0.06761658191680908,
0.1283879429101944,
0.1197902038693428,
0.010478287003934383,
0.029016220942139626,
0.004441319964826107,
-0.0032098356168717146,
0.02903142012655735,
-0.1170373186469078,
-0.03237893432378769,
0.024144742637872696,
0.08597014844417572,
-0.0012770779430866241,
-0.098431296646595,
0.07258009910583496,
0.007444649934768677,
0.16912314295768738,
0.07922986149787903,
-0.20890609920024872,
-0.054866254329681396,
-0.00487857311964035,
-0.017494460567831993,
-0.02169797569513321,
0.017606744542717934,
0.05140068009495735,
-0.07682234048843384,
0.07661990076303482,
-0.012713189236819744,
0.06665731221437454,
-0.03641248494386673,
0.012130340561270714,
-0.02213834971189499,
0.07032858580350876,
0.005949192680418491,
0.07207425683736801,
-0.1689814329147339,
0.1700284630060196,
0.0610569603741169,
0.0669277086853981,
-0.05207327380776405,
0.02740272507071495,
0.04796750470995903,
0.018915848806500435,
0.14929458498954773,
0.006657368969172239,
-0.09103642404079437,
-0.10580804198980331,
-0.10000178217887878,
0.028798377141356468,
0.13570474088191986,
-0.07011847198009491,
0.11711012572050095,
-0.06572254002094269,
-0.030939262360334396,
-0.005799328442662954,
0.059267546981573105,
-0.20531494915485382,
-0.09202537685632706,
0.09180156141519547,
0.06448405981063843,
-0.01940341107547283,
-0.0733279138803482,
-0.06607728451490402,
-0.03444531559944153,
0.19772952795028687,
-0.040497470647096634,
0.007656628731638193,
-0.10154394805431366,
0.025517502799630165,
0.19927237927913666,
-0.06400196254253387,
-0.023258095607161522,
0.0028001191094517708,
0.17816227674484253,
-0.0076742819510400295,
-0.12318401038646698,
-0.02210690639913082,
-0.06597481667995453,
-0.16792018711566925,
0.01731991395354271,
0.1993456333875656,
0.009162412025034428,
0.04613424465060234,
0.01991008222103119,
0.019854716956615448,
0.03399442136287689,
-0.07744196802377701,
0.06718084961175919,
0.22864125669002533,
-0.07143377512693405,
0.06170031428337097,
-0.07665663212537766,
0.02910820208489895,
-0.09194622188806534,
-0.03582339361310005,
0.10768526792526245,
0.22470755875110626,
-0.043113864958286285,
0.12295954674482346,
0.056376006454229355,
-0.13426898419857025,
-0.19033657014369965,
0.0015698836650699377,
0.04146214947104454,
0.064154714345932,
0.028597712516784668,
-0.15842919051647186,
0.03341475874185562,
0.04296018183231354,
-0.03229055553674698,
0.014406214468181133,
-0.3284865617752075,
-0.16717927157878876,
0.08764500916004181,
0.028436429798603058,
0.08830004185438156,
-0.04829839989542961,
-0.0631694495677948,
-0.048219967633485794,
-0.10092857480049133,
0.02634846419095993,
-0.03816844895482063,
0.1433022916316986,
-0.003057911992073059,
0.04983166232705116,
0.07353828102350235,
-0.032053474336862564,
0.15436074137687683,
0.02728218398988247,
0.014916744083166122,
-0.013186588883399963,
0.03866884857416153,
0.05680680274963379,
-0.056938108056783676,
0.15947473049163818,
0.007522092200815678,
0.09099949896335602,
-0.16333433985710144,
-0.07366391271352768,
-0.048757147043943405,
0.022849926725029945,
-0.034234266728162766,
-0.03403773158788681,
-0.022830715402960777,
0.03972180560231209,
0.02380692958831787,
-0.005131647456437349,
-0.004542096052318811,
-0.1326705664396286,
0.05882140249013901,
0.11243244260549545,
0.11122118681669235,
0.020591434091329575,
-0.10012255609035492,
0.055484652519226074,
-0.015039469115436077,
0.07126667350530624,
-0.08997312188148499,
0.0674043595790863,
0.11414298415184021,
0.026824047788977623,
0.127548947930336,
0.008951719850301743,
-0.1709105521440506,
-0.005735181272029877,
0.03968946635723114,
-0.10473985970020294,
-0.11623382568359375,
-0.05535309016704559,
0.026166871190071106,
-0.054247066378593445,
0.03382733464241028,
0.21789050102233887,
-0.04273573309183121,
-0.017408929765224457,
-0.007775004021823406,
0.01741264946758747,
-0.06728736311197281,
0.1508069634437561,
0.05868048220872879,
0.06400194764137268,
-0.05037497729063034,
0.10582640767097473,
-0.015408666804432869,
-0.016339870169758797,
0.06429159641265869,
-0.05160578340291977,
-0.03193367272615433,
-0.05058051645755768,
-0.06632230430841446,
0.14252053201198578,
-0.09284146875143051,
-0.053671594709157944,
-0.09071813523769379,
-0.050654150545597076,
0.03891581669449806,
0.08739359676837921,
0.02539902552962303,
0.08244788646697998,
-0.004055498633533716,
-0.03319167345762253,
-0.08812703937292099,
0.08288659155368805,
0.011805757880210876,
0.0013683602446690202,
-0.12283682078123093,
0.08360899239778519,
-0.00021948439825791866,
0.04933934658765793,
-0.021530788391828537,
-0.031633637845516205,
-0.04057082161307335,
0.0016232413472607732,
-0.1539948284626007,
-0.018639158457517624,
-0.029912011697888374,
-0.040288954973220825,
0.04878202825784683,
-0.036301273852586746,
-0.02537059783935547,
0.058206260204315186,
-0.10428167879581451,
-0.05128968879580498,
-0.033471811562776566,
0.03527069836854935,
-0.11318755149841309,
0.037760090082883835,
0.01242231298238039,
-0.091291144490242,
0.05863182991743088,
0.04211542755365372,
-0.010500618256628513,
0.09007170796394348,
-0.05897868052124977,
-0.0067019350826740265,
-0.008620440028607845,
0.06660059094429016,
-0.00816329289227724,
-0.13178479671478271,
0.01789400912821293,
0.009426752105355263,
0.019977042451500893,
0.0064676194451749325,
0.0003811508940998465,
-0.10963032394647598,
-0.10831017792224884,
-0.01981458254158497,
0.07381566613912582,
-0.00618322379887104,
0.046606749296188354,
0.0835176631808281,
0.10456554591655731,
0.08854416012763977,
-0.06103803589940071,
0.062150031328201294,
-0.18770503997802734,
0.011386695317924023,
-0.043227698653936386,
0.016958890482783318,
0.017341608181595802,
-0.02756841853260994,
0.07349354028701782,
-0.07385420799255371,
0.1639757752418518,
-0.031414713710546494,
0.05455774441361427,
0.025517262518405914,
-0.04706455394625664,
0.03403247892856598,
0.01507757417857647,
0.13281160593032837,
0.04600687325000763,
0.03633194416761398,
0.017415886744856834,
-0.022106949239969254,
0.008669554255902767,
0.0346069261431694,
0.13703417778015137,
0.1144845262169838,
0.09185284376144409,
0.06195608526468277,
0.13667163252830505,
-0.06874658912420273,
-0.03960486873984337,
0.04685094580054283,
-0.09002124518156052,
0.11112513393163681,
-0.0508582666516304,
0.019754895940423012,
0.13012954592704773,
-0.14941371977329254,
0.10774517804384232,
-0.015164315700531006,
-0.035167474299669266,
-0.1937488168478012,
-0.1669802963733673,
-0.10493437945842743,
-0.11985175311565399,
0.057099197059869766,
-0.09446270018815994,
0.057400014251470566,
0.023618392646312714,
0.05559435859322548,
0.026410873979330063,
0.10341273993253708,
-0.13425926864147186,
-0.04586675390601158,
0.051511943340301514,
-0.027386434376239777,
-0.046739861369132996,
-0.0034764602314680815,
0.007742597721517086,
0.1459210366010666,
0.028518913313746452,
0.05045772343873978,
-0.0054048700258135796,
-0.0019886570516973734,
0.03126060962677002,
-0.03182905912399292,
-0.05489663779735565,
-0.019097570329904556,
-0.03686199709773064,
0.08904919773340225,
0.06285756081342697,
0.05131865292787552,
-0.04367345944046974,
-0.035383813083171844,
0.24203117191791534,
-0.058002546429634094,
-0.08798706531524658,
-0.15059994161128998,
0.14638526737689972,
0.06342262774705887,
0.041429344564676285,
-0.0029733823612332344,
-0.06997988373041153,
0.02083471044898033,
0.18728193640708923,
0.18189892172813416,
0.019489716738462448,
-0.008579126559197903,
0.0008226012578234076,
-0.000281875254586339,
-0.021345142275094986,
0.0322275273501873,
0.06296170502901077,
0.14923357963562012,
-0.0012013122905045748,
-0.022686120122671127,
-0.04725601151585579,
-0.043965715914964676,
0.03772544115781784,
0.1316749006509781,
-0.037945911288261414,
-0.039733801037073135,
-0.0003007800260093063,
0.04640016704797745,
-0.027586761862039566,
-0.16567689180374146,
-0.0097205126658082,
-0.12007927894592285,
-0.1259789913892746,
-0.04907935485243797,
0.03721843659877777,
0.007405241020023823,
0.041339755058288574,
-0.061618607491254807,
-0.05039249360561371,
0.18054357171058655,
0.0005792139563709497,
-0.10510609298944473,
-0.1261446326971054,
0.05595751106739044,
0.0136403264477849,
0.17570039629936218,
0.03983606398105621,
0.06646915525197983,
0.054741550236940384,
-0.022464772686362267,
-0.09910039603710175,
0.11475284397602081,
0.021949527785182,
-0.10850915312767029,
0.01283248234540224,
0.2048656940460205,
0.00418582558631897,
0.16336548328399658,
0.024310335516929626,
-0.07400760054588318,
0.02553894929587841,
0.0009727508295327425,
-0.07228851318359375,
-0.06935064494609833,
0.023728370666503906,
-0.11696677654981613,
0.13641783595085144,
0.16272836923599243,
-0.06394261121749878,
0.00831810012459755,
-0.0654497817158699,
0.07942850142717361,
0.0004424146900419146,
0.03089783526957035,
0.00956637505441904,
-0.17808064818382263,
0.022621044889092445,
-0.09159328788518906,
-0.013798840343952179,
-0.26141080260276794,
-0.10009226948022842,
0.10017292201519012,
-0.05570657551288605,
0.008023237809538841,
0.11017696559429169,
0.10937091708183289,
0.06887637078762054,
-0.037398409098386765,
-0.041724663227796555,
0.01854831725358963,
0.0913306325674057,
-0.15951941907405853,
-0.08563464879989624
] |
null | null | transformers |
# wav2vec2-large-xls-r-1b-cv8-mt
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2210
- Wer: 0.1974
## Model description
Note: another version of this model is available with a KenLM 3-gram language model; that version performs better than this one. See https://huggingface.co/RuudVelo/wav2vec2-large-xls-r-1b-cv8-mt-lm
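For quick transcription, the model can be loaded through the Transformers ASR pipeline. A minimal sketch is shown below; the file name sample.wav is a hypothetical placeholder for a Maltese recording:

from transformers import pipeline

# Load the fine-tuned Maltese model as an automatic-speech-recognition pipeline
asr = pipeline("automatic-speech-recognition", model="RuudVelo/wav2vec2-large-xls-r-1b-cv8-mt")

# "sample.wav" is a hypothetical placeholder for a 16 kHz mono Maltese recording
print(asr("sample.wav")["text"])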
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following config and hyperparameters were used during training:
from transformers import Wav2Vec2ForCTC

model = Wav2Vec2ForCTC.from_pretrained(
"facebook/wav2vec2-xls-r-1b",
attention_dropout=0.05,
hidden_dropout=0.05,
feat_proj_dropout=0.05,
mask_time_prob=0.55,
mask_feature_prob=0.10,
layerdrop=0.05,
ctc_zero_infinity=True,
ctc_loss_reduction="mean",
pad_token_id=processor.tokenizer.pad_token_id,
vocab_size=len(processor.tokenizer),
)
from transformers import TrainingArguments
training_args = TrainingArguments(
output_dir=repo_name,
group_by_length=True,
per_device_train_batch_size=32,
gradient_accumulation_steps=2,
evaluation_strategy="steps",
num_train_epochs=50,
gradient_checkpointing=True,
fp16=True,
save_steps=400,
eval_steps=400,
logging_steps=400,
learning_rate=5.5e-05,
warmup_steps=500,
save_total_limit=2,
push_to_hub=True,
report_to="tensorboard")
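The card does not show how these objects are wired together; a hedged sketch of the usual pattern follows, where train_dataset, eval_dataset and data_collator are assumed names for the preprocessed Common Voice splits and a CTC padding collator, not part of this card:

from transformers import Trainer

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,  # assumption: preprocessed Common Voice train split
    eval_dataset=eval_dataset,    # assumption: preprocessed Common Voice test split
    data_collator=data_collator,  # assumption: a padding collator for CTC training
    tokenizer=processor.feature_extractor,
)
trainer.train()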
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.4564 | 13.33 | 400 | 0.3783 | 0.3981 |
| 0.7931 | 26.66 | 800 | 0.2377 | 0.2298 |
| 0.5364 | 39.98 | 1200 | 0.2210 | 0.1974 |
Note that the test WER of 19.74 differs from the 17.57 reported above. The discrepancy was due to a bug found while processing files with an older version of the datasets library; the correct library versions are listed below.
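For reference, the WER figures above follow the standard word error rate metric from the datasets library; in the sketch below, predictions and references are assumed lists of decoded and ground-truth transcription strings:

from datasets import load_metric

wer_metric = load_metric("wer")
# predictions and references are assumed lists of strings
wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {100 * wer:.2f}")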
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0
| {"language": ["mt"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer", "mt", "robust-speech-event", "model_for_talk", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_8_0"], "model-index": [{"name": "wav2vec2-large-xls-r-1b-cv8-mt", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "mt"}, "metrics": [{"type": "wer", "value": 17.57, "name": "Test WER"}, {"type": "cer", "value": 3.86, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "mt"}, "metrics": [{"type": "wer", "name": "Test WER"}, {"type": "cer", "name": "Test CER"}]}]}]} | automatic-speech-recognition | RuudVelo/wav2vec2-large-xls-r-1b-cv8-mt | [
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"mt",
"robust-speech-event",
"model_for_talk",
"hf-asr-leaderboard",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"mt"
] | TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #mt #robust-speech-event #model_for_talk #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
| wav2vec2-large-xls-r-1b-cv8-mt
==============================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the common\_voice dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2210
* Wer: 0.1974
Model description
-----------------
Note: another version of this model is available with a KenLM 3-gram language model; that version performs better than this one. See URL
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following config and hyperparameters were used during training:
model = Wav2Vec2ForCTC.from\_pretrained(
"facebook/wav2vec2-xls-r-1b",
attention\_dropout=0.05,
hidden\_dropout=0.05,
feat\_proj\_dropout=0.05,
mask\_time\_prob=0.55,
mask\_feature\_prob=0.10,
layerdrop=0.05,
ctc\_zero\_infinity=True,
ctc\_loss\_reduction="mean",
pad\_token\_id=processor.tokenizer.pad\_token\_id,
vocab\_size=len(processor.tokenizer),
)
from transformers import TrainingArguments
training\_args = TrainingArguments(
output\_dir=repo\_name,
group\_by\_length=True,
per\_device\_train\_batch\_size=32,
gradient\_accumulation\_steps=2,
evaluation\_strategy="steps",
num\_train\_epochs=50,
gradient\_checkpointing=True,
fp16=True,
save\_steps=400,
eval\_steps=400,
logging\_steps=400,
learning\_rate=5.5e-05,
warmup\_steps=500,
save\_total\_limit=2,
push\_to\_hub=True,
report\_to="tensorboard")
### Training results
Note that the test WER of 19.74 is different than the above reported 17.57. This was due to a bug which was found while processing files with an older version of the datasets library. The right library is listed below.
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.3
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following config and hyperparameters were used during training:\n\n\nmodel = Wav2Vec2ForCTC.from\\_pretrained(\n\"facebook/wav2vec2-xls-r-1b\",\nattention\\_dropout=0.05,\nhidden\\_dropout=0.05,\nfeat\\_proj\\_dropout=0.05,\nmask\\_time\\_prob=0.55,\nmask\\_feature\\_prob=0.10,\nlayerdrop=0.05,\nctc\\_zero\\_infinity=True,\nctc\\_loss\\_reduction=\"mean\",\npad\\_token\\_id=processor.tokenizer.pad\\_token\\_id,\nvocab\\_size=len(processor.tokenizer),\n)\n\n\nfrom transformers import TrainingArguments\n\n\ntraining\\_args = TrainingArguments(\noutput\\_dir=repo\\_name,\ngroup\\_by\\_length=True,\nper\\_device\\_train\\_batch\\_size=32,\ngradient\\_accumulation\\_steps=2,\nevaluation\\_strategy=\"steps\",\nnum\\_train\\_epochs=50,\ngradient\\_checkpointing=True,\nfp16=True,\nsave\\_steps=400,\neval\\_steps=400,\nlogging\\_steps=400,\nlearning\\_rate=5.5e-05,\nwarmup\\_steps=500,\nsave\\_total\\_limit=2,\npush\\_to\\_hub=True,\nreport\\_to=\"tensorboard\")",
"### Training results\n\n\n\nNote that the test WER of 19.74 is different than the above reported 17.57. This was due to a bug which was found while processing files with an older version of the datasets library. The right library is listed below.",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #mt #robust-speech-event #model_for_talk #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following config and hyperparameters were used during training:\n\n\nmodel = Wav2Vec2ForCTC.from\\_pretrained(\n\"facebook/wav2vec2-xls-r-1b\",\nattention\\_dropout=0.05,\nhidden\\_dropout=0.05,\nfeat\\_proj\\_dropout=0.05,\nmask\\_time\\_prob=0.55,\nmask\\_feature\\_prob=0.10,\nlayerdrop=0.05,\nctc\\_zero\\_infinity=True,\nctc\\_loss\\_reduction=\"mean\",\npad\\_token\\_id=processor.tokenizer.pad\\_token\\_id,\nvocab\\_size=len(processor.tokenizer),\n)\n\n\nfrom transformers import TrainingArguments\n\n\ntraining\\_args = TrainingArguments(\noutput\\_dir=repo\\_name,\ngroup\\_by\\_length=True,\nper\\_device\\_train\\_batch\\_size=32,\ngradient\\_accumulation\\_steps=2,\nevaluation\\_strategy=\"steps\",\nnum\\_train\\_epochs=50,\ngradient\\_checkpointing=True,\nfp16=True,\nsave\\_steps=400,\neval\\_steps=400,\nlogging\\_steps=400,\nlearning\\_rate=5.5e-05,\nwarmup\\_steps=500,\nsave\\_total\\_limit=2,\npush\\_to\\_hub=True,\nreport\\_to=\"tensorboard\")",
"### Training results\n\n\n\nNote that the test WER of 19.74 is different than the above reported 17.57. This was due to a bug which was found while processing files with an older version of the datasets library. The right library is listed below.",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0"
] | [
122,
377,
55,
38
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #mt #robust-speech-event #model_for_talk #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following config and hyperparameters were used during training:\n\n\nmodel = Wav2Vec2ForCTC.from\\_pretrained(\n\"facebook/wav2vec2-xls-r-1b\",\nattention\\_dropout=0.05,\nhidden\\_dropout=0.05,\nfeat\\_proj\\_dropout=0.05,\nmask\\_time\\_prob=0.55,\nmask\\_feature\\_prob=0.10,\nlayerdrop=0.05,\nctc\\_zero\\_infinity=True,\nctc\\_loss\\_reduction=\"mean\",\npad\\_token\\_id=processor.tokenizer.pad\\_token\\_id,\nvocab\\_size=len(processor.tokenizer),\n)\n\n\nfrom transformers import TrainingArguments\n\n\ntraining\\_args = TrainingArguments(\noutput\\_dir=repo\\_name,\ngroup\\_by\\_length=True,\nper\\_device\\_train\\_batch\\_size=32,\ngradient\\_accumulation\\_steps=2,\nevaluation\\_strategy=\"steps\",\nnum\\_train\\_epochs=50,\ngradient\\_checkpointing=True,\nfp16=True,\nsave\\_steps=400,\neval\\_steps=400,\nlogging\\_steps=400,\nlearning\\_rate=5.5e-05,\nwarmup\\_steps=500,\nsave\\_total\\_limit=2,\npush\\_to\\_hub=True,\nreport\\_to=\"tensorboard\")"
] | [
-0.09495615214109421,
0.0676240548491478,
-0.006055687088519335,
0.055215898901224136,
0.08299581706523895,
0.02501375786960125,
0.11957112699747086,
0.16183869540691376,
-0.05864749848842621,
0.14134126901626587,
0.06896041333675385,
0.11193617433309555,
0.0819912850856781,
0.13539749383926392,
0.0025034432765096426,
-0.22309936583042145,
0.05295225977897644,
-0.06705227494239807,
-0.11388608068227768,
0.10952650755643845,
0.04472414031624794,
-0.08663381636142731,
0.016940979287028313,
0.009350135922431946,
-0.05070681497454643,
-0.028432466089725494,
-0.025364596396684647,
-0.06760940700769424,
0.08704189956188202,
0.03275257349014282,
0.07001427561044693,
0.03544914349913597,
0.07801257073879242,
-0.27861911058425903,
0.0046618967317044735,
0.12421248853206635,
0.0005465928697958589,
0.060048557817935944,
0.15650254487991333,
-0.028574218973517418,
0.07447045296430588,
-0.09506581723690033,
0.07669058442115784,
0.04756489768624306,
-0.15301965177059174,
-0.25202420353889465,
-0.08503887057304382,
0.0652475506067276,
0.11490684002637863,
0.09272883087396622,
-0.04697602987289429,
0.08699136972427368,
-0.11007274687290192,
0.07198967784643173,
0.18459486961364746,
-0.2384553998708725,
-0.0670849159359932,
0.0038801615592092276,
0.022565549239516258,
-0.061564214527606964,
-0.1141248494386673,
-0.009158884175121784,
-0.014062341302633286,
-0.0027527762576937675,
0.08138266205787659,
-0.003699011169373989,
0.030095761641860008,
-0.020231030881404877,
-0.11534033715724945,
-0.05682113766670227,
0.027577079832553864,
0.07506866008043289,
0.000907032226677984,
-0.17748600244522095,
-0.0373072475194931,
-0.16435329616069794,
-0.04339980334043503,
-0.004291622433811426,
0.008193884044885635,
-0.0261521153151989,
-0.015578795224428177,
0.1321936547756195,
-0.02444375306367874,
-0.06360757350921631,
0.07526740431785583,
0.01742262952029705,
0.0359639972448349,
-0.06280220299959183,
-0.03982806205749512,
0.08378343284130096,
0.10618571937084198,
-0.17230133712291718,
-0.005875702947378159,
-0.0162519421428442,
-0.17675717175006866,
-0.020031537860631943,
0.007262540981173515,
0.014476465992629528,
0.08640135824680328,
0.181597501039505,
0.05059674382209778,
0.13763371109962463,
-0.0003607611870393157,
0.01241330523043871,
-0.04647049680352211,
0.12467683851718903,
-0.10215075314044952,
-0.16772642731666565,
-0.07226858288049698,
0.10670837014913559,
-0.04128184914588928,
-0.015194929204881191,
0.013770987279713154,
0.040572650730609894,
0.08136371523141861,
0.03429824858903885,
0.05325377359986305,
0.0903252363204956,
-0.08204536139965057,
-0.0112157566472888,
-0.021047361195087433,
-0.13992270827293396,
0.03776410594582558,
0.08036013692617416,
-0.06766258180141449,
-0.007768384180963039,
0.0062822080217301846,
-0.030830848962068558,
-0.051657114177942276,
0.11681283265352249,
-0.04909604415297508,
-0.0379631407558918,
-0.07073580473661423,
-0.11750522255897522,
0.01806008815765381,
-0.10048049688339233,
-0.020671198144555092,
-0.036783453077077866,
-0.08160341531038284,
-0.12063528597354889,
0.09346316009759903,
-0.07666405290365219,
0.02113589458167553,
-0.06341111660003662,
-0.08884760737419128,
0.05310628190636635,
0.011906282976269722,
0.09476462006568909,
-0.07134842127561569,
0.04781089723110199,
-0.029697727411985397,
0.0662217065691948,
0.1881687492132187,
0.050115201622247696,
-0.02344600483775139,
0.057327739894390106,
-0.21979176998138428,
0.10701397806406021,
-0.1122177466750145,
0.07128861546516418,
-0.16828487813472748,
-0.08395754545927048,
-0.010846687480807304,
0.013964264653623104,
0.10156545788049698,
0.13362878561019897,
-0.18422120809555054,
-0.06105677783489227,
0.21266284584999084,
-0.05483682081103325,
-0.06931377202272415,
0.09618476033210754,
-0.03758940100669861,
0.018754124641418457,
0.0034699407406151295,
0.168864443898201,
0.08858472108840942,
-0.09489128738641739,
0.037085648626089096,
-0.09385930001735687,
0.04931672662496567,
0.12981072068214417,
0.00605010986328125,
-0.07825269550085068,
0.0936136245727539,
-0.0014827352715656161,
0.0003695962077472359,
0.004511215258389711,
-0.058172743767499924,
-0.05920536071062088,
0.02630634233355522,
-0.026035260409116745,
-0.001130738528445363,
0.04997749254107475,
-0.0017620286671444774,
-0.056370995938777924,
-0.14044836163520813,
-0.02192559838294983,
0.05431142449378967,
-0.09687720239162445,
-0.008883717469871044,
-0.09019878506660461,
0.05726655572652817,
0.059071172028779984,
0.021112050861120224,
-0.16967983543872833,
-0.024098465219140053,
-0.006216457579284906,
-0.08715566247701645,
-0.02610768750309944,
0.04001699388027191,
0.08895662426948547,
0.018118947744369507,
0.008065306581556797,
-0.03687256947159767,
-0.022628363221883774,
0.013087849132716656,
-0.032448239624500275,
-0.26734793186187744,
-0.048554468899965286,
-0.016904963180422783,
0.1902751624584198,
-0.2061932533979416,
0.023135956376791,
0.059663038700819016,
0.1264859139919281,
0.025176504626870155,
-0.059692490845918655,
0.02735665999352932,
0.055333372205495834,
-0.0006667558336630464,
-0.0722971260547638,
0.03721921890974045,
-0.0037629297003149986,
-0.05104472115635872,
-0.02661995403468609,
-0.17641623318195343,
0.03736623749136925,
0.09079355746507645,
-0.017923889681696892,
-0.16060391068458557,
0.013223126530647278,
-0.04047686606645584,
-0.05873705446720123,
-0.026025306433439255,
0.030252331867814064,
0.17205242812633514,
0.05327014625072479,
0.08041604608297348,
-0.03354775160551071,
-0.04451094940304756,
0.036147426813840866,
0.019243644550442696,
0.019228344783186913,
0.1594831347465515,
0.0265570767223835,
-0.019455401226878166,
0.07272407412528992,
0.03231600672006607,
0.00623522512614727,
0.1267228126525879,
-0.055113524198532104,
-0.06860483437776566,
-0.04636883735656738,
0.047493163496255875,
0.05361931025981903,
0.09755577892065048,
-0.12947361171245575,
0.0022004269994795322,
0.03550277650356293,
0.029776081442832947,
-0.006924728862941265,
-0.12203630805015564,
0.009521673433482647,
0.04206375032663345,
-0.04578665643930435,
-0.029127230867743492,
0.0020384977106004953,
0.01805567927658558,
0.06531029939651489,
0.01525332685559988,
-0.06374753266572952,
-0.03914278373122215,
-0.054220136255025864,
-0.08117293566465378,
0.17620079219341278,
-0.0787685215473175,
-0.14587660133838654,
-0.07179200649261475,
-0.006251845043152571,
0.012132757343351841,
-0.0245947428047657,
0.025915957987308502,
-0.08826557546854019,
-0.07708162814378738,
-0.06288935244083405,
0.022972004488110542,
0.035460393875837326,
-0.008143693208694458,
0.053971294313669205,
0.02316875010728836,
0.09220170974731445,
-0.09524158388376236,
0.01688206009566784,
-0.020382292568683624,
-0.004901941400021315,
0.038675952702760696,
0.03990984708070755,
0.08235079050064087,
0.13336040079593658,
0.05354626849293709,
0.03934874013066292,
-0.03225133940577507,
0.15901726484298706,
-0.1289856731891632,
0.03099939413368702,
0.09772970527410507,
-0.030429782345891,
0.030547963455319405,
0.18121875822544098,
0.02884451486170292,
-0.10475751757621765,
0.000362959923222661,
0.06603207439184189,
-0.007792152464389801,
-0.2005224972963333,
-0.0065576001070439816,
-0.07530622184276581,
0.006365633569657803,
0.1185545101761818,
0.016444610431790352,
0.016352403908967972,
0.016632018610835075,
0.009447590447962284,
-0.018545735627412796,
0.07428782433271408,
0.05299888551235199,
0.04326506704092026,
0.06166592985391617,
0.11391075700521469,
-0.0032778242602944374,
-0.006934278178960085,
0.00852882768958807,
-0.050702016800642014,
0.1803431659936905,
-0.03157973662018776,
0.15747413039207458,
0.094048872590065,
0.1296404004096985,
-0.0014339386252686381,
0.05133018642663956,
0.011365151032805443,
0.0059426273219287395,
0.03285003453493118,
-0.06737875193357468,
-0.06403766572475433,
-0.0025730696506798267,
0.03724401816725731,
-0.03200355917215347,
-0.06667965650558472,
0.019091367721557617,
0.09499204158782959,
0.2837945520877838,
0.1047305315732956,
-0.2698324918746948,
-0.02768981084227562,
-0.031224651262164116,
-0.022000323981046677,
-0.04930330812931061,
-0.000488258374389261,
0.06513302773237228,
-0.1261497288942337,
0.06770509481430054,
-0.06509171426296234,
0.09520254284143448,
-0.100670725107193,
0.024703843519091606,
0.08216342329978943,
0.058942269533872604,
0.00458159064874053,
0.01219231728464365,
-0.19482076168060303,
0.2476627677679062,
-0.023438915610313416,
0.09038247913122177,
-0.057220350950956345,
0.0757937952876091,
0.0231951791793108,
-0.08886906504631042,
0.11353771388530731,
-0.03326034173369408,
-0.13691486418247223,
-0.16196295619010925,
-0.059643667191267014,
-0.008054043166339397,
0.11620018631219864,
-0.06792048364877701,
0.12247220426797867,
-0.020964397117495537,
-0.06248380243778229,
0.02441445365548134,
-0.05991236865520477,
-0.1303163319826126,
-0.094911128282547,
0.05645068734884262,
-0.07722972333431244,
0.09415276348590851,
-0.0636991411447525,
-0.04075618088245392,
-0.1607195883989334,
0.18389655649662018,
-0.14710482954978943,
-0.013948415406048298,
-0.15483365952968597,
0.018827257677912712,
0.13683293759822845,
-0.057225313037633896,
0.06641994416713715,
0.012295180931687355,
0.04544895514845848,
0.03771651163697243,
-0.03477237746119499,
0.15222269296646118,
-0.08105720579624176,
-0.19894854724407196,
-0.08678222447633743,
0.11093656718730927,
0.05032319203019142,
0.023305634036660194,
-0.002833153586834669,
0.04163501784205437,
0.016208462417125702,
-0.10148856788873672,
0.07068272680044174,
0.030115235596895218,
0.044813018292188644,
-0.008842399343848228,
-0.03515122830867767,
-0.045219942927360535,
-0.062092069536447525,
-0.0009146056836470962,
0.042993366718292236,
0.24077677726745605,
-0.10037285834550858,
0.07327298074960709,
0.04869312793016434,
-0.05494080111384392,
-0.16833288967609406,
0.04161135107278824,
0.09846549481153488,
0.022589582949876785,
0.0003863904858008027,
-0.17887645959854126,
0.04715948551893234,
0.09066291898488998,
0.007915448397397995,
0.08324388414621353,
-0.3193844258785248,
-0.12843908369541168,
0.08970151841640472,
0.04889789968729019,
-0.15758322179317474,
-0.150347039103508,
-0.04176061227917671,
-0.01317165233194828,
-0.15420234203338623,
0.008975030854344368,
-0.051782604306936264,
0.08662235736846924,
0.012325490824878216,
-0.072822704911232,
0.0171493012458086,
-0.08007648587226868,
0.12317071855068207,
0.01876465044915676,
0.07658067345619202,
-0.05438775569200516,
0.04346726834774017,
0.03869534283876419,
-0.0676145926117897,
-0.019184865057468414,
-0.048138029873371124,
0.0030365418642759323,
-0.12552720308303833,
-0.024507567286491394,
-0.0788559839129448,
0.004628688562661409,
-0.07728967815637589,
-0.028223825618624687,
0.008776072412729263,
0.07299469411373138,
0.11184149235486984,
-0.0028879649471491575,
0.033749278634786606,
-0.08405670523643494,
0.14883321523666382,
0.11084331572055817,
0.04758882895112038,
0.0767834410071373,
-0.15203696489334106,
0.021918712183833122,
0.030897170305252075,
0.02761721797287464,
-0.09918312728404999,
0.06177729368209839,
0.1326649934053421,
0.025905510410666466,
0.14986015856266022,
0.05183197557926178,
-0.059565868228673935,
-0.013158234767615795,
0.06361667811870575,
-0.09686432778835297,
-0.06074261665344238,
0.051958441734313965,
-0.08723821491003036,
-0.08366899192333221,
-0.04896852374076843,
0.13187925517559052,
-0.004129593726247549,
0.020532185211777687,
0.04313836619257927,
0.06941274553537369,
-0.015150545164942741,
0.23650874197483063,
-0.027829976752400398,
0.1032777950167656,
-0.08684524893760681,
0.07083427160978317,
0.05237666517496109,
-0.10908569395542145,
0.059676192700862885,
0.0789739266037941,
-0.08495603501796722,
-0.032487787306308746,
0.07747001945972443,
0.1241876631975174,
0.1347712278366089,
-0.017936071380972862,
-0.1093224510550499,
-0.12913693487644196,
0.07849258184432983,
0.11491566151380539,
0.00810079276561737,
0.03572162985801697,
0.00974308978766203,
0.006574364844709635,
-0.08095370978116989,
0.07703308761119843,
0.11742908507585526,
0.021085985004901886,
-0.06365608423948288,
0.12824749946594238,
0.0025259065441787243,
-0.05974491685628891,
0.006527131423354149,
-0.003427876392379403,
-0.18499203026294708,
0.023402506485581398,
-0.06064416095614433,
0.003837763098999858,
-0.07478725910186768,
-0.011431722901761532,
0.009272672235965729,
0.010905728675425053,
-0.02434934675693512,
-0.012639016844332218,
-0.09565208852291107,
-0.06745490431785583,
-0.007315525785088539,
0.08236026763916016,
-0.1384122371673584,
0.005900419782847166,
0.022330552339553833,
-0.1432456225156784,
0.08692225068807602,
0.026263250038027763,
0.0022418394219130278,
-0.013146961107850075,
-0.11075949668884277,
-0.007533936761319637,
-0.003721315646544099,
-0.029943814501166344,
0.04289861395955086,
-0.19671763479709625,
0.015966489911079407,
-0.08186902850866318,
-0.03342070430517197,
0.005767500959336758,
-0.014405309222638607,
-0.13481347262859344,
0.05922625586390495,
-0.03116825595498085,
-0.017056917771697044,
-0.04592045024037361,
0.026108618825674057,
0.07978376746177673,
-0.004176303278654814,
0.12769527733325958,
-0.06881958246231079,
0.10781904309988022,
-0.21013116836547852,
-0.045799095183610916,
0.03156496211886406,
-0.029343925416469574,
0.04501209408044815,
-0.011962619610130787,
0.13501869142055511,
-0.07559336721897125,
0.047425415366888046,
-0.01485519204288721,
0.027684777975082397,
0.02324998751282692,
-0.10943304002285004,
-0.05831845477223396,
0.07116106152534485,
0.0698203593492508,
0.045912906527519226,
-0.017600351944565773,
0.04409945756196976,
-0.012371497228741646,
0.016276082023978233,
0.07620028406381607,
0.1930963695049286,
0.18408453464508057,
0.04711524024605751,
0.04355601966381073,
0.03105746954679489,
-0.1602877527475357,
-0.10471395403146744,
0.18927110731601715,
-0.11646363139152527,
0.16953173279762268,
-0.06962957978248596,
0.047715507447719574,
0.07477451115846634,
-0.16663844883441925,
0.07962927967309952,
-0.06383518129587173,
-0.08860360085964203,
-0.07865961641073227,
-0.09517046064138412,
-0.05286562815308571,
-0.09904559701681137,
0.031206708401441574,
-0.058646731078624725,
0.04735120013356209,
0.0967404916882515,
0.06721243262290955,
0.055228859186172485,
0.11327998340129852,
-0.0025264392606914043,
-0.006001367699354887,
0.11896965652704239,
0.04639492928981781,
-0.045828867703676224,
-0.055944036692380905,
-0.061536166816949844,
-0.012868507765233517,
0.006522911135107279,
0.06906993687152863,
-0.016950948163866997,
-0.07485128939151764,
0.050441812723875046,
-0.007691279519349337,
-0.0955343246459961,
0.03193850442767143,
-0.016025485470891,
0.002984740538522601,
0.11021403968334198,
0.06278108805418015,
-0.005795537959784269,
-0.01768614538013935,
0.18403376638889313,
-0.0752963051199913,
-0.08777698874473572,
-0.16383352875709534,
0.1412479728460312,
0.012546916492283344,
0.015405010432004929,
-0.008800238370895386,
-0.08030413836240768,
0.009031054563820362,
0.22056397795677185,
0.1720363199710846,
-0.01955370232462883,
0.004338938742876053,
0.04123546555638313,
-0.001057151355780661,
-0.0031919165048748255,
0.08107852190732956,
0.0638069286942482,
0.08117271214723587,
-0.02749350108206272,
0.008799596689641476,
-0.008579682558774948,
-0.1115553081035614,
-0.015810182318091393,
0.07435186952352524,
0.03675530478358269,
-0.0007648933096788824,
-0.020241035148501396,
0.125777930021286,
-0.10232634842395782,
-0.2390630841255188,
0.12045981734991074,
-0.16243141889572144,
-0.17913757264614105,
-0.04458281397819519,
0.056622326374053955,
0.026030568405985832,
0.07153034210205078,
0.028921054676175117,
-0.05410238355398178,
0.14361824095249176,
0.021731287240982056,
0.024885283783078194,
-0.11254677176475525,
0.007464244030416012,
-0.04313606396317482,
0.16223391890525818,
-0.03571108356118202,
0.008871285244822502,
0.12674225866794586,
0.05491909757256508,
-0.08738286793231964,
0.01300795003771782,
0.08815636485815048,
-0.14556530117988586,
0.023297671228647232,
0.129500612616539,
-0.03253323584794998,
0.0934821143746376,
0.0702151209115982,
-0.06593409180641174,
0.06367742270231247,
-0.09204263985157013,
-0.012040229514241219,
-0.059923212975263596,
0.014590865932404995,
-0.013866066001355648,
0.13677415251731873,
0.23304618895053864,
-0.04646865651011467,
0.019534945487976074,
-0.04527166485786438,
0.01174931600689888,
-0.024509593844413757,
0.080905020236969,
-0.08181320875883102,
-0.23060274124145508,
0.06820765137672424,
-0.010915585793554783,
0.03690555691719055,
-0.12289902567863464,
-0.10669150203466415,
0.07508813589811325,
-0.07065895944833755,
-0.061131108552217484,
0.12988369166851044,
0.044175442308187485,
0.06784294545650482,
-0.0433293953537941,
-0.11621039360761642,
0.008107060566544533,
0.18660637736320496,
-0.11781089752912521,
-0.06751316785812378
] |
null | null | transformers |
# wav2vec2-large-xls-r-1b-nl-lm
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the common_voice 8 dataset.
It achieves the following results on the test set:
- Loss: 0.1479
- Wer: 0.1156
Note that the above test results come from the original model without a language model (LM), which can be found at https://huggingface.co/RuudVelo/wav2vec2-large-xls-r-1b-nl. The results with the LM can be found on the right side of this model card.
## Model description
This is the RuudVelo/wav2vec2-large-xls-r-1b-nl model augmented with a KenLM 5-gram language model.
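A minimal decoding sketch with the bundled LM is shown below; it assumes pyctcdecode and kenlm are installed and that speech is a 16 kHz mono waveform loaded beforehand:

import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2ProcessorWithLM

processor = Wav2Vec2ProcessorWithLM.from_pretrained("RuudVelo/wav2vec2-large-xls-r-1b-nl-lm")
model = Wav2Vec2ForCTC.from_pretrained("RuudVelo/wav2vec2-large-xls-r-1b-nl-lm")

# speech is an assumed 16 kHz mono float array (e.g. loaded with soundfile)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# batch_decode runs beam search against the KenLM 5-gram shipped with this repo
transcription = processor.batch_decode(logits.numpy()).text[0]
print(transcription)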
## Intended uses & limitations
More information needed
## Training and evaluation data
The Common Voice 8 Dutch (nl) dataset has been used for this model
## Training procedure
### Training hyperparameters
Parameters can be found in the run.sh file at https://huggingface.co/RuudVelo/wav2vec2-large-xls-r-1b-nl
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0 | {"language": ["nl"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer", "nl", "robust-speech-event", "model_for_talk", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_8_0"], "model-index": [{"name": "wav2vec2-large-xls-r-1b-nl-lm", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "nl"}, "metrics": [{"type": "wer", "value": 9.73, "name": "Test WER"}, {"type": "cer", "value": 2.89, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "nl"}, "metrics": [{"type": "wer", "value": 27.27, "name": "Test WER"}, {"type": "cer", "value": 13.23, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "nl"}, "metrics": [{"type": "wer", "value": 27.67, "name": "Test WER"}]}]}]} | automatic-speech-recognition | RuudVelo/wav2vec2-large-xls-r-1b-nl-lm | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"nl",
"robust-speech-event",
"model_for_talk",
"hf-asr-leaderboard",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"nl"
] | TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #nl #robust-speech-event #model_for_talk #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# wav2vec2-large-xls-r-1b-nl-lm
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the common_voice 8 dataset.
It achieves the following results on the test set:
- Loss: 0.1479
- Wer: 0.1156
Note that the above test results come from the original model without a language model (LM), which can be found at URL. The results with the LM can be found on the right side of this model card.
## Model description
This is the RuudVelo/wav2vec2-large-xls-r-1b-nl model augmented with a KenLM 5-gram language model.
## Intended uses & limitations
More information needed
## Training and evaluation data
The Common Voice 8 Dutch (nl) dataset has been used for this model
## Training procedure
### Training hyperparameters
Parameters can be found in the URL file at URL
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0 | [
"# wav2vec2-large-xls-r-1b-nl-lm\n\nThis model is a fine-tuned version of wav2vec2-large-xls-r-1b-nl-lm on the common_voice 8 dataset.\nIt achieves the following results on the test set:\n- Loss: 0.1479\n- Wer: 0.1156\n\nNote that the above test results come from the original model without LM (language model) which can be found at URL The results with the LM model can be found on the right side of this model card.",
"## Model description\nModel RuudVelo/wav2vec2-large-xls-r-1b-nl which has been improved with a KenLM 5-gram.",
"## Intended uses & limitations\nMore information needed",
"## Training and evaluation data\nCommon Voice 8 nl dataset has been used for the model",
"## Training procedure",
"### Training hyperparameters\nParameters can be found in the URL file at URL",
"### Framework versions\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #nl #robust-speech-event #model_for_talk #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# wav2vec2-large-xls-r-1b-nl-lm\n\nThis model is a fine-tuned version of wav2vec2-large-xls-r-1b-nl-lm on the common_voice 8 dataset.\nIt achieves the following results on the test set:\n- Loss: 0.1479\n- Wer: 0.1156\n\nNote that the above test results come from the original model without LM (language model) which can be found at URL The results with the LM model can be found on the right side of this model card.",
"## Model description\nModel RuudVelo/wav2vec2-large-xls-r-1b-nl which has been improved with a KenLM 5-gram.",
"## Intended uses & limitations\nMore information needed",
"## Training and evaluation data\nCommon Voice 8 nl dataset has been used for the model",
"## Training procedure",
"### Training hyperparameters\nParameters can be found in the URL file at URL",
"### Framework versions\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] | [
117,
122,
37,
12,
18,
3,
19,
40
] | [
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #nl #robust-speech-event #model_for_talk #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# wav2vec2-large-xls-r-1b-nl-lm\n\nThis model is a fine-tuned version of wav2vec2-large-xls-r-1b-nl-lm on the common_voice 8 dataset.\nIt achieves the following results on the test set:\n- Loss: 0.1479\n- Wer: 0.1156\n\nNote that the above test results come from the original model without LM (language model) which can be found at URL The results with the LM model can be found on the right side of this model card.## Model description\nModel RuudVelo/wav2vec2-large-xls-r-1b-nl which has been improved with a KenLM 5-gram.## Intended uses & limitations\nMore information needed## Training and evaluation data\nCommon Voice 8 nl dataset has been used for the model## Training procedure### Training hyperparameters\nParameters can be found in the URL file at URL### Framework versions\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.18.3\n- Tokenizers 0.11.0"
] | [
-0.0907697081565857,
0.20084209740161896,
-0.002857782645151019,
0.046323709189891815,
0.10616680234670639,
0.010908756405115128,
0.08095788955688477,
0.1623026430606842,
-0.06024245545268059,
0.05186144635081291,
0.016259338706731796,
0.07695142179727554,
0.07279407978057861,
0.002156089525669813,
0.07421943545341492,
-0.21570992469787598,
-0.006830073893070221,
-0.02690798230469227,
0.022499902173876762,
0.08425185084342957,
0.08243153989315033,
-0.07046650350093842,
0.03919924423098564,
0.0370064377784729,
-0.07580573111772537,
0.026625068858265877,
-0.028477564454078674,
-0.042703770101070404,
0.09038712829351425,
0.04061143100261688,
0.026624646037817,
0.02247708849608898,
0.09577162563800812,
-0.27261561155319214,
0.0005832175957038999,
0.07486024498939514,
0.03631005808711052,
0.04799140617251396,
0.029388166964054108,
-0.06084074079990387,
0.12236779928207397,
-0.09032221883535385,
0.0350111648440361,
0.09318238496780396,
-0.03981899470090866,
-0.1309688240289688,
-0.10093934088945389,
0.11771434545516968,
0.04767192527651787,
0.12722204625606537,
-0.012717817910015583,
0.11872849613428116,
-0.03166639432311058,
0.08415783941745758,
0.23668503761291504,
-0.26338011026382446,
0.011983363889157772,
0.12463764101266861,
0.025036804378032684,
0.021332548931241035,
-0.07704082876443863,
0.07125760614871979,
0.05009618028998375,
-0.009458489716053009,
0.006621368229389191,
-0.022121256217360497,
-0.04713268578052521,
0.024102088063955307,
-0.12177331000566483,
-0.014843364246189594,
0.1632164865732193,
0.04083259031176567,
-0.05486110597848892,
-0.10880274325609207,
0.022210197523236275,
-0.039510514587163925,
0.003009447129443288,
-0.04339178279042244,
0.008972926065325737,
-0.02464430034160614,
-0.020093439146876335,
-0.10919755697250366,
-0.0691862553358078,
-0.07651285082101822,
0.05185641348361969,
0.12556451559066772,
0.027900293469429016,
-0.002850761404260993,
-0.02011268585920334,
0.10132262110710144,
-0.11486474424600601,
-0.08267422765493393,
-0.07237721979618073,
-0.0226107407361269,
-0.06091595068573952,
-0.0064916121773421764,
-0.043136902153491974,
-0.09730610251426697,
0.027886154130101204,
0.041788697242736816,
0.02133249305188656,
0.031698305159807205,
-0.013677997514605522,
0.01449194923043251,
0.011472801677882671,
0.11723703145980835,
-0.09666026383638382,
-0.038468137383461,
0.026197150349617004,
0.07060667872428894,
-0.003794335527345538,
-0.021423175930976868,
-0.07483799010515213,
-0.016488758847117424,
0.015104145742952824,
0.09054255485534668,
0.02320701628923416,
0.030774401500821114,
-0.04143136739730835,
-0.031885068863630295,
0.02184976451098919,
-0.1628967523574829,
0.010112980380654335,
0.005203696433454752,
-0.08334163576364517,
0.03291713446378708,
0.10698994994163513,
-0.0007423002971336246,
-0.05621512234210968,
-0.0011109848273918033,
-0.02342280186712742,
0.012914799153804779,
-0.04592626914381981,
-0.08636648207902908,
0.0342990942299366,
-0.05345778167247772,
-0.01539913471788168,
-0.10331036895513535,
-0.20411594212055206,
-0.04988672956824303,
0.00900952983647585,
-0.02288176491856575,
0.01543678343296051,
-0.030844828113913536,
-0.04859985038638115,
-0.016404416412115097,
-0.0017137089744210243,
0.022552143782377243,
-0.021843744441866875,
0.06355903297662735,
0.009608794935047626,
0.0390753373503685,
0.06010586395859718,
0.039445988833904266,
-0.031169269233942032,
0.016474924981594086,
-0.07133514434099197,
0.1377754807472229,
-0.13023021817207336,
-0.018676625564694405,
-0.12319950014352798,
-0.05865108594298363,
-0.050691135227680206,
0.010107402689754963,
0.10505452007055283,
0.12839826941490173,
-0.15244358777999878,
-0.0538174994289875,
0.1472932994365692,
-0.08426278829574585,
-0.04728579893708229,
0.052321236580610275,
-0.012589015066623688,
0.06441069394350052,
0.08606429398059845,
0.117552749812603,
0.22724106907844543,
-0.191705122590065,
-0.10103809833526611,
0.001109508448280394,
-0.0029434149619191885,
0.031504228711128235,
0.037502221763134,
-0.03515346720814705,
-0.038905639201402664,
0.05635169893503189,
-0.06734777241945267,
0.015039859339594841,
-0.05988305062055588,
-0.07794806361198425,
-0.045727383345365524,
-0.08870027959346771,
0.05440344661474228,
0.011271162889897823,
0.004074799362570047,
-0.027826303616166115,
-0.1393091082572937,
0.09878003597259521,
0.1558840274810791,
-0.022440219298005104,
0.012274504639208317,
-0.12008863687515259,
0.05431707575917244,
-0.08714424818754196,
0.0005523108411580324,
-0.1905323714017868,
-0.035200681537389755,
0.0034780465066432953,
-0.05639345571398735,
0.07355067133903503,
0.03442395105957985,
0.0571114681661129,
0.04394640401005745,
-0.035548608750104904,
-0.006954534910619259,
-0.09303683787584305,
-0.004880872089415789,
-0.03552797809243202,
-0.12300316989421844,
-0.07018190622329712,
-0.03447391465306282,
0.12001986056566238,
-0.1648590862751007,
0.018970411270856857,
0.037371646612882614,
0.13000240921974182,
0.001817397540435195,
-0.06506089866161346,
0.07350362837314606,
0.02377069927752018,
0.004367178771644831,
-0.03408770635724068,
0.004612155258655548,
0.012298500165343285,
-0.09447745233774185,
0.017020123079419136,
-0.12232204526662827,
-0.06730566918849945,
0.09767026454210281,
0.047635868191719055,
-0.06838905811309814,
0.041866008192300797,
-0.04715370386838913,
-0.0077369967475533485,
-0.06774277985095978,
-0.07150929421186447,
0.14275570213794708,
0.036751747131347656,
0.0978585034608841,
-0.04219721257686615,
-0.05051521211862564,
0.016382206231355667,
-0.0038394429720938206,
-0.03176718205213547,
0.06174398586153984,
0.051528941839933395,
-0.017192417755723,
0.028666537255048752,
-0.02664855495095253,
-0.013455110602080822,
0.1636405736207962,
0.010259930975735188,
-0.08095879852771759,
-0.057145994156599045,
-0.003139244159683585,
0.005050336942076683,
0.11263158917427063,
-0.09426825493574142,
-0.017296606674790382,
0.04146391898393631,
0.009693441912531853,
0.05201253667473793,
-0.14652962982654572,
0.0024107983335852623,
0.05381234735250473,
-0.07091611623764038,
-0.013984897173941135,
0.03439708799123764,
-0.01952679269015789,
0.06785262376070023,
-0.0028954443987458944,
0.045382071286439896,
-0.03990722447633743,
-0.034781794995069504,
-0.11825000494718552,
0.10768069326877594,
-0.12237108498811722,
-0.22165575623512268,
-0.13992823660373688,
0.04286099970340729,
-0.08081831783056259,
-0.034498222172260284,
0.05058623105287552,
-0.12392567098140717,
-0.05238917097449303,
-0.0738428607583046,
0.02506655640900135,
-0.04762899503111839,
-0.012779515236616135,
0.033714815974235535,
0.04021574184298515,
0.07136453688144684,
-0.17017823457717896,
0.010219761170446873,
-0.0073661780916154385,
-0.05843500792980194,
-0.04064108431339264,
0.015936139971017838,
0.0890766903758049,
0.1489829421043396,
0.011936363764107227,
0.04402003437280655,
-0.006873630452901125,
0.18970604240894318,
-0.053837910294532776,
0.004228453151881695,
0.15426650643348694,
0.04467516019940376,
0.03442729264497757,
0.05597034841775894,
0.034951597452163696,
-0.07366473972797394,
0.012433847412467003,
0.09217729419469833,
-0.055636174976825714,
-0.26436954736709595,
-0.06939329206943512,
-0.019246943295001984,
-0.061130084097385406,
-0.0005225720233283937,
0.06261095404624939,
0.025537921115756035,
0.03559788316488266,
-0.0006704193656332791,
-0.015579435043036938,
0.030317939817905426,
0.07000220566987991,
0.06641603261232376,
-0.04556369408965111,
0.08046668022871017,
-0.06936684250831604,
0.021725129336118698,
0.04880841448903084,
0.04987888038158417,
0.16721539199352264,
-0.0328446663916111,
0.0881449356675148,
0.10062103718519211,
0.12535856664180756,
-0.023884477093815804,
0.011638692580163479,
0.01577153615653515,
0.00864002387970686,
0.05034222826361656,
-0.11156947165727615,
-0.05365750193595886,
0.05964295566082001,
0.041384387761354446,
0.012725629843771458,
-0.07479941844940186,
0.0510982982814312,
0.016395576298236847,
0.1842101812362671,
0.06608486920595169,
-0.217209130525589,
-0.08458865433931351,
-0.008407614193856716,
0.015079000033438206,
-0.015350724570453167,
-0.002659579971805215,
0.07157520204782486,
-0.10553517192602158,
0.060701269656419754,
0.009518972598016262,
0.09368651360273361,
-0.06793920695781708,
-0.018175732344388962,
-0.040980610996484756,
0.07756239175796509,
0.013129686005413532,
0.09164292365312576,
-0.15952317416667938,
0.1544787883758545,
0.05921454727649689,
0.11073029786348343,
-0.057391662150621414,
0.03680279850959778,
0.03861096873879433,
0.03510164096951485,
0.167084202170372,
0.015308491885662079,
-0.017929906025528908,
-0.18586432933807373,
-0.06476036459207535,
-0.0031331204809248447,
0.1307218223810196,
-0.03692542016506195,
0.0833500325679779,
-0.012189025990664959,
-0.03756779059767723,
0.004309730138629675,
-0.02488011308014393,
-0.2520417273044586,
-0.11965040862560272,
0.055102813988924026,
0.0655701756477356,
-0.015225396491587162,
-0.05570785328745842,
-0.07656297087669373,
-0.06317649036645889,
0.2196723371744156,
-0.012831083498895168,
-0.05030369013547897,
-0.11889231204986572,
0.0494319386780262,
0.12937010824680328,
-0.06686456501483917,
-0.0285351425409317,
0.022985914722085,
0.1426006257534027,
-0.006867476738989353,
-0.0821211040019989,
-0.004347884561866522,
-0.08441396802663803,
-0.11072586476802826,
-0.01603773981332779,
0.18799807131290436,
0.06873831897974014,
0.04977962002158165,
0.022975023835897446,
0.0036552278324961662,
0.05208811163902283,
-0.08765900880098343,
0.06154218316078186,
0.2352590560913086,
-0.02811305969953537,
0.06991535425186157,
-0.08862407505512238,
0.02565629780292511,
-0.09852311760187149,
-0.003787073539569974,
0.14379119873046875,
0.16611158847808838,
-0.07484715431928635,
0.12244729697704315,
0.08630122989416122,
-0.1389203816652298,
-0.17442156374454498,
0.03312395140528679,
0.07453586906194687,
0.044554196298122406,
0.014491226524114609,
-0.1901031732559204,
0.06673635542392731,
0.020593533292412758,
-0.022857187315821648,
0.02994558960199356,
-0.30936479568481445,
-0.13448145985603333,
0.049965232610702515,
0.06515560299158096,
0.014652237296104431,
-0.09115655720233917,
-0.062068622559309006,
-0.0500374436378479,
-0.20214198529720306,
0.05591090768575668,
-0.08019717782735825,
0.11676198989152908,
0.028099030256271362,
0.04846550524234772,
0.05090264603495598,
-0.03935772553086281,
0.11665178835391998,
0.06144654005765915,
0.006882825866341591,
-0.05118885263800621,
0.04400254413485527,
0.10597135871648788,
-0.06875883042812347,
0.17157061398029327,
0.00432271882891655,
0.07307137548923492,
-0.18760859966278076,
-0.061313699930906296,
-0.06620553880929947,
0.06403001397848129,
-0.039433885365724564,
-0.06712783128023148,
-0.022213298827409744,
0.055865578353405,
0.0588507205247879,
-0.009438852779567242,
-0.04570872709155083,
-0.07300831377506256,
0.06852298974990845,
0.14294083416461945,
0.09547797590494156,
0.033729348331689835,
-0.12750747799873352,
0.021244056522846222,
-0.03426715359091759,
0.07118399441242218,
-0.15106122195720673,
0.03426135703921318,
0.1049501821398735,
0.07689899951219559,
0.12203569710254669,
-0.01145975198596716,
-0.15194156765937805,
0.008607255294919014,
0.02599821239709854,
-0.06153405085206032,
-0.14397184550762177,
-0.056879378855228424,
-0.00009755017526913434,
-0.06155060976743698,
0.043218664824962616,
0.15848882496356964,
-0.07654308527708054,
-0.020824242383241653,
-0.011218257248401642,
0.017030440270900726,
-0.07267515361309052,
0.15289011597633362,
0.07733689993619919,
0.08656206727027893,
-0.04815161973237991,
0.0969749465584755,
0.027162304148077965,
-0.025212282314896584,
0.0949995294213295,
0.008239900693297386,
-0.0655340701341629,
-0.06339146941900253,
-0.0574863962829113,
0.1048501580953598,
-0.0256475992500782,
-0.061764396727085114,
-0.07101763039827347,
-0.03439415618777275,
0.025621840730309486,
0.055229611694812775,
0.018308935686945915,
0.008019745349884033,
-0.020355695858597755,
-0.0005138583364896476,
-0.13483889400959015,
0.08662082254886627,
0.03007843904197216,
0.0031438947189599276,
-0.13941608369350433,
0.0825730562210083,
0.04974149540066719,
0.08385512232780457,
-0.01087331771850586,
-0.05403393134474754,
-0.05532130226492882,
0.029188185930252075,
-0.1604887694120407,
-0.019448883831501007,
-0.07539332658052444,
0.00501299137249589,
0.002768032718449831,
-0.04982592165470123,
-0.0146157406270504,
0.07934501767158508,
-0.08062442392110825,
-0.07687383145093918,
-0.014050356112420559,
0.06699085235595703,
-0.13419966399669647,
-0.010638114996254444,
0.007560142781585455,
-0.07874231040477753,
0.05336504429578781,
0.030882295221090317,
-0.015243652276694775,
0.07672759890556335,
-0.051881566643714905,
-0.01770170032978058,
0.023943135514855385,
0.052063774317502975,
0.019561171531677246,
-0.1204032301902771,
0.023272601887583733,
-0.014977015554904938,
0.025874394923448563,
0.005778566934168339,
0.00784572958946228,
-0.10742358863353729,
-0.07057106494903564,
-0.06224970519542694,
0.026974381878972054,
-0.03900831937789917,
0.06959356367588043,
0.10461010783910751,
0.06193755194544792,
0.08132915943861008,
-0.05215459689497948,
0.10490760952234268,
-0.19024144113063812,
-0.026466630399227142,
-0.03553398698568344,
0.018390599638223648,
0.0077049643732607365,
-0.02318033203482628,
0.08670888841152191,
-0.055198848247528076,
0.10506770759820938,
-0.010886299423873425,
0.09266111999750137,
0.025276964530348778,
-0.0574895516037941,
0.004995683208107948,
-0.008580980822443962,
0.11622782796621323,
0.05054859071969986,
0.02369001694023609,
0.04848981648683548,
-0.01666419208049774,
0.03220663219690323,
-0.020309684798121452,
0.10763660073280334,
0.12319067120552063,
0.08496659249067307,
0.06532370299100876,
0.13701249659061432,
-0.08927211910486221,
-0.10347775369882584,
0.025114361196756363,
-0.06905216723680496,
0.09999176114797592,
-0.07853231579065323,
0.04726795852184296,
0.12365862727165222,
-0.12703396379947662,
0.10522569715976715,
-0.008026246912777424,
-0.07404177635908127,
-0.12752127647399902,
-0.16184963285923004,
-0.07155787199735641,
-0.10084109753370285,
0.05266539379954338,
-0.07768086344003677,
0.0004136633942835033,
0.022881023585796356,
0.027710776776075363,
0.02478407882153988,
0.11608166247606277,
-0.06812451779842377,
-0.0022366084158420563,
0.04783518239855766,
-0.0380793996155262,
-0.03327499330043793,
-0.048804838210344315,
0.023749442771077156,
0.12289566546678543,
0.039266008883714676,
0.05108284577727318,
0.01954210177063942,
-0.005505923181772232,
0.04081626981496811,
-0.02678236924111843,
-0.07171326875686646,
-0.005982724949717522,
0.01524301152676344,
0.07978761941194534,
0.07922008633613586,
0.07304468750953674,
-0.0483417883515358,
-0.03427748382091522,
0.2276710420846939,
-0.06532703340053558,
-0.07220330089330673,
-0.1449747234582901,
0.1427629292011261,
0.028338612988591194,
0.007314485497772694,
0.0014271888649091125,
-0.11387549340724945,
0.015935709699988365,
0.19847075641155243,
0.17612458765506744,
0.03538941219449043,
0.010572907514870167,
-0.017557833343744278,
-0.006935912650078535,
-0.030135689303278923,
0.0375111922621727,
0.1042538583278656,
0.15880827605724335,
-0.02810622565448284,
0.00831831619143486,
-0.02947068028151989,
-0.07587143033742905,
-0.014537107199430466,
0.06332069635391235,
-0.032219234853982925,
-0.017782915383577347,
-0.01745741255581379,
0.09734201431274414,
-0.03875187411904335,
-0.18567371368408203,
0.03675118461251259,
-0.1133139505982399,
-0.13504184782505035,
-0.041343338787555695,
0.034414805471897125,
0.03002273105084896,
0.035835158079862595,
-0.03150942549109459,
-0.013070638291537762,
0.19489584863185883,
0.0006469045183621347,
-0.11620784550905228,
-0.10240455716848373,
0.019071044400334358,
0.03157937526702881,
0.17168690264225006,
0.005430551245808601,
0.0897955670952797,
0.09368041902780533,
0.01599368266761303,
-0.1489095687866211,
0.07235110551118851,
0.030275896191596985,
-0.10674908012151718,
0.014721356332302094,
0.18136602640151978,
-0.033552706241607666,
0.13681048154830933,
0.041370682418346405,
-0.10109511017799377,
0.006244666874408722,
-0.01944781467318535,
0.01008813176304102,
-0.12892885506153107,
0.03732170909643173,
-0.07968427240848541,
0.14752203226089478,
0.15320973098278046,
-0.05920008197426796,
-0.013118643313646317,
-0.07631238549947739,
0.041785914450883865,
0.012358120642602444,
0.01222261879593134,
-0.016353994607925415,
-0.15978440642356873,
0.026387464255094528,
-0.04570017382502556,
0.02761729434132576,
-0.22654524445533752,
-0.11584973335266113,
0.11919447779655457,
-0.06554437428712845,
0.010634574107825756,
0.10101152956485748,
0.11481177061796188,
0.05818894878029823,
-0.04107426106929779,
-0.15811453759670258,
0.00822033453732729,
0.12148675322532654,
-0.17860914766788483,
-0.0791718065738678
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-1b-nl
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - NL dataset. A variant of this model with a language model, which improves these results, is available at https://huggingface.co/RuudVelo/wav2vec2-large-xls-r-1b-nl-lm; that model reaches a Common Voice 8 Dutch test WER of 9.73.
It achieves the following results on the evaluation set:
- Loss: 0.1479
- Wer: 0.1156
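As a minimal usage sketch (not part of the original card; the audio filename below is a placeholder), the checkpoint can be loaded through the transformers ASR pipeline:

    from transformers import pipeline

    # load this checkpoint for inference through the ASR pipeline
    asr = pipeline("automatic-speech-recognition", model="RuudVelo/wav2vec2-large-xls-r-1b-nl")

    # transcribe a local 16 kHz mono recording (placeholder filename)
    print(asr("dutch_sample.wav")["text"])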
## Model description
Model fine-tuned using the wav2vec2-xls-r-1b model architecture
## Intended uses & limitations
More information needed
## Training and evaluation data
Model has been trained on Common Voice 8 Dutch
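For reference, the training data can be loaded as sketched below; this is an illustration only, and Common Voice 8 is a gated dataset, so an authenticated Hugging Face account is assumed:

    from datasets import load_dataset

    # Common Voice 8 requires accepting the dataset terms;
    # use_auth_token matches the Datasets 1.x API listed under Framework versions
    cv8_nl = load_dataset("mozilla-foundation/common_voice_8_0", "nl", split="train", use_auth_token=True)
    print(cv8_nl[0]["sentence"])  # transcript of the first training clip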
## Training procedure
### Training hyperparameters
Model parameters can be found under Files and versions in the run.sh file.
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 1.2223 | 0.52 | 500 | 0.3866 | 0.3425 |
| 1.0748 | 1.03 | 1000 | 0.2574 | 0.2169 |
| 1.0416 | 1.55 | 1500 | 0.2177 | 0.1946 |
| 0.9951 | 2.06 | 2000 | 0.2008 | 0.1760 |
| 0.975 | 2.58 | 2500 | 0.1961 | 0.1751 |
| 0.9461 | 3.1 | 3000 | 0.1989 | 0.1782 |
| 0.9381 | 3.61 | 3500 | 0.1928 | 0.1699 |
| 0.934 | 4.13 | 4000 | 0.1923 | 0.1633 |
| 0.9322 | 4.64 | 4500 | 0.1871 | 0.1634 |
| 0.9012 | 5.16 | 5000 | 0.1890 | 0.1702 |
| 0.9045 | 5.68 | 5500 | 0.1882 | 0.1740 |
| 0.8826 | 6.19 | 6000 | 0.1856 | 0.1575 |
| 0.8848 | 6.71 | 6500 | 0.1861 | 0.1617 |
| 0.8723 | 7.22 | 7000 | 0.1927 | 0.1646 |
| 0.8725 | 7.74 | 7500 | 0.1798 | 0.1531 |
| 0.8573 | 8.26 | 8000 | 0.1781 | 0.1587 |
| 0.8633 | 8.77 | 8500 | 0.1852 | 0.1628 |
| 0.8603 | 9.29 | 9000 | 0.1833 | 0.1601 |
| 0.8421 | 9.8 | 9500 | 0.1788 | 0.1543 |
| 0.8404 | 10.32 | 10000 | 0.1844 | 0.1556 |
| 0.8342 | 10.84 | 10500 | 0.1770 | 0.1538 |
| 0.8161 | 11.35 | 11000 | 0.1821 | 0.1567 |
| 0.8371 | 11.87 | 11500 | 0.1909 | 0.1629 |
| 0.8083 | 12.38 | 12000 | 0.1778 | 0.1498 |
| 0.806 | 12.9 | 12500 | 0.1802 | 0.1547 |
| 0.8013 | 13.42 | 13000 | 0.1859 | 0.1584 |
| 0.7913 | 13.93 | 13500 | 0.1875 | 0.1517 |
| 0.8063 | 14.45 | 14000 | 0.1799 | 0.1571 |
| 0.7991 | 14.96 | 14500 | 0.1792 | 0.1538 |
| 0.7843 | 15.48 | 15000 | 0.1753 | 0.1464 |
| 0.7905 | 16.0 | 15500 | 0.1784 | 0.1508 |
| 0.7808 | 16.51 | 16000 | 0.1771 | 0.1485 |
| 0.7743 | 17.03 | 16500 | 0.1795 | 0.1491 |
| 0.7833 | 17.54 | 17000 | 0.1722 | 0.1484 |
| 0.7763 | 18.06 | 17500 | 0.1767 | 0.1518 |
| 0.7698 | 18.58 | 18000 | 0.1720 | 0.1460 |
| 0.7571 | 19.09 | 18500 | 0.1735 | 0.1478 |
| 0.7673 | 19.61 | 19000 | 0.1817 | 0.1511 |
| 0.7415 | 20.12 | 19500 | 0.1763 | 0.1481 |
| 0.751 | 20.64 | 20000 | 0.1742 | 0.1484 |
| 0.7563 | 21.16 | 20500 | 0.1810 | 0.1611 |
| 0.7423 | 21.67 | 21000 | 0.1817 | 0.1557 |
| 0.7242 | 22.19 | 21500 | 0.1690 | 0.1446 |
| 0.7251 | 22.7 | 22000 | 0.1684 | 0.1446 |
| 0.7302 | 23.22 | 22500 | 0.1735 | 0.1430 |
| 0.733 | 23.74 | 23000 | 0.1720 | 0.1454 |
| 0.7128 | 24.25 | 23500 | 0.1668 | 0.1383 |
| 0.7184 | 24.77 | 24000 | 0.1635 | 0.1377 |
| 0.7015 | 25.28 | 24500 | 0.1646 | 0.1389 |
| 0.7198 | 25.8 | 25000 | 0.1775 | 0.1462 |
| 0.7178 | 26.32 | 25500 | 0.1705 | 0.1419 |
| 0.7199 | 26.83 | 26000 | 0.1649 | 0.1416 |
| 0.6981 | 27.35 | 26500 | 0.1724 | 0.1418 |
| 0.6886 | 27.86 | 27000 | 0.1633 | 0.1382 |
| 0.6922 | 28.38 | 27500 | 0.1698 | 0.1420 |
| 0.6833 | 28.9 | 28000 | 0.1611 | 0.1351 |
| 0.6798 | 29.41 | 28500 | 0.1639 | 0.1365 |
| 0.6711 | 29.93 | 29000 | 0.1668 | 0.1358 |
| 0.6762 | 30.44 | 29500 | 0.1682 | 0.1355 |
| 0.6594 | 30.96 | 30000 | 0.1629 | 0.1345 |
| 0.6664 | 31.48 | 30500 | 0.1625 | 0.1321 |
| 0.6838 | 31.99 | 31000 | 0.1597 | 0.1372 |
| 0.6603 | 32.51 | 31500 | 0.1583 | 0.1302 |
| 0.6468 | 33.02 | 32000 | 0.1595 | 0.1322 |
| 0.6464 | 33.54 | 32500 | 0.1609 | 0.1315 |
| 0.6623 | 34.06 | 33000 | 0.1622 | 0.1366 |
| 0.6414 | 34.57 | 33500 | 0.1587 | 0.1330 |
| 0.6242 | 35.09 | 34000 | 0.1614 | 0.1337 |
| 0.632 | 35.6 | 34500 | 0.1568 | 0.1272 |
| 0.6346 | 36.12 | 35000 | 0.1583 | 0.1274 |
| 0.6143 | 36.64 | 35500 | 0.1576 | 0.1264 |
| 0.6208 | 37.15 | 36000 | 0.1621 | 0.1263 |
| 0.6185 | 37.67 | 36500 | 0.1623 | 0.1270 |
| 0.6128 | 38.18 | 37000 | 0.1604 | 0.1268 |
| 0.6151 | 38.7 | 37500 | 0.1593 | 0.1246 |
| 0.6082 | 39.22 | 38000 | 0.1532 | 0.1238 |
| 0.6 | 39.73 | 38500 | 0.1524 | 0.1224 |
| 0.6032 | 40.25 | 39000 | 0.1521 | 0.1212 |
| 0.6016 | 40.76 | 39500 | 0.1551 | 0.1215 |
| 0.6009 | 41.28 | 40000 | 0.1523 | 0.1215 |
| 0.5875 | 41.8 | 40500 | 0.1541 | 0.1216 |
| 0.608 | 42.31 | 41000 | 0.1536 | 0.1209 |
| 0.5876 | 42.83 | 41500 | 0.1567 | 0.1211 |
| 0.5714 | 43.34 | 42000 | 0.1532 | 0.1217 |
| 0.5756 | 43.86 | 42500 | 0.1516 | 0.1196 |
| 0.5719 | 44.38 | 43000 | 0.1491 | 0.1191 |
| 0.5829 | 44.89 | 43500 | 0.1497 | 0.1193 |
| 0.5664 | 45.41 | 44000 | 0.1487 | 0.1173 |
| 0.5707 | 45.92 | 44500 | 0.1470 | 0.1164 |
| 0.5696 | 46.44 | 45000 | 0.1479 | 0.1161 |
| 0.5767 | 46.96 | 45500 | 0.1492 | 0.1175 |
| 0.5573 | 47.47 | 46000 | 0.1471 | 0.1165 |
| 0.5625 | 47.99 | 46500 | 0.1484 | 0.1168 |
| 0.5671 | 48.5 | 47000 | 0.1474 | 0.1162 |
| 0.5484 | 49.02 | 47500 | 0.1479 | 0.1158 |
| 0.555 | 49.54 | 48000 | 0.1477 | 0.1157 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0
| {"language": ["nl"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer", "nl", "robust-speech-event", "model_for_talk", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_8_0"], "model-index": [{"name": "wav2vec2-large-xls-r-1b-nl", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "nl"}, "metrics": [{"type": "wer", "value": 11.12, "name": "Test WER"}, {"type": "cer", "value": 3.2, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "nl"}, "metrics": [{"type": "wer", "value": 31.92, "name": "Test WER"}, {"type": "cer", "value": 13.87, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "nl"}, "metrics": [{"type": "wer", "value": 32.17, "name": "Test WER"}]}]}]} | automatic-speech-recognition | RuudVelo/wav2vec2-large-xls-r-1b-nl | [
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"nl",
"robust-speech-event",
"model_for_talk",
"hf-asr-leaderboard",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"nl"
] | TAGS
#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #nl #robust-speech-event #model_for_talk #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - NL dataset. A variant of this model with a language model, which improves these results, is available at URL; that model reaches a Common Voice 8 Dutch test WER of 9.73.
It achieves the following results on the evaluation set:
* Loss: 0.1479
* Wer: 0.1156
Model description
-----------------
Model fine-tuned using the wav2vec2-xls-r-1b model architecture
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
Model has been trained on Common Voice 8 Dutch
Training procedure
------------------
### Training hyperparameters
Model parameters can be found under Files and versions in the URL file.
### Training results
### Framework versions
* Transformers 4.17.0.dev0
* Pytorch 1.10.2+cu102
* Datasets 1.18.3
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nModel parameters can be found under Files and versions in the URL file.",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #nl #robust-speech-event #model_for_talk #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nModel parameters can be found under Files and versions in the URL file.",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0"
] | [
121,
24,
4,
38
] | [
"passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #nl #robust-speech-event #model_for_talk #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nModel parameters can be found under Files and versions in the URL file.### Training results### Framework versions\n\n\n* Transformers 4.17.0.dev0\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0"
] | [
-0.150462806224823,
0.1037806048989296,
-0.0025302921421825886,
0.014728493057191372,
0.18645618855953217,
-0.024865319952368736,
0.13055437803268433,
0.10583333671092987,
-0.06792597472667694,
0.012849612161517143,
0.010503734461963177,
0.10114362090826035,
0.0451023168861866,
-0.006821698043495417,
0.006962021347135305,
-0.21597354114055634,
0.003708522068336606,
0.0181499645113945,
-0.007795989979058504,
0.06894928216934204,
0.09008702635765076,
-0.05944662168622017,
0.045888930559158325,
0.08796911686658859,
-0.14649251103401184,
0.06831672042608261,
0.020604228600859642,
-0.10040963441133499,
0.10338230431079865,
0.04411173239350319,
0.01931222341954708,
-0.001839998411014676,
0.07730469107627869,
-0.1466548591852188,
0.03925070911645889,
0.07755422592163086,
0.01652524620294571,
0.06609445810317993,
0.03624364361166954,
-0.05000843480229378,
0.0844128206372261,
0.03389985114336014,
-0.03489934653043747,
0.11036268621683121,
-0.009178089909255505,
-0.2068113535642624,
-0.05345708876848221,
0.04660347104072571,
0.02379314973950386,
0.11173724383115768,
0.0013422912452369928,
0.09460582584142685,
-0.05259747803211212,
0.09578762203454971,
0.25217968225479126,
-0.15505605936050415,
-0.0087488554418087,
-0.007745847571641207,
-0.03319133073091507,
0.044790782034397125,
-0.05623438581824303,
0.036784447729587555,
0.04000619426369667,
0.00944054126739502,
0.012049468234181404,
-0.06021979823708534,
-0.14951245486736298,
-0.03254542499780655,
-0.11684007197618484,
0.040270451456308365,
0.1525331735610962,
0.014666585251688957,
-0.013666187413036823,
-0.04422347620129585,
-0.04860672727227211,
0.045372407883405685,
-0.04126936197280884,
0.06647680699825287,
0.01411924883723259,
-0.01046013180166483,
0.012252653948962688,
-0.07860957831144333,
-0.06966670602560043,
-0.10143473744392395,
0.02144583687186241,
0.18948829174041748,
0.006109739188104868,
0.03651014342904091,
-0.09510283172130585,
0.02907354198396206,
0.01937372423708439,
-0.12113407999277115,
-0.021614179015159607,
0.02366570755839348,
0.004664392210543156,
0.04499657079577446,
-0.040854956954717636,
-0.11208631098270416,
0.10085581988096237,
-0.057117950171232224,
-0.01399159338325262,
0.02887418493628502,
-0.09197875112295151,
0.08720064163208008,
-0.06856352835893631,
0.10461411625146866,
-0.12623728811740875,
0.024660522118210793,
0.08479534089565277,
0.06038441136479378,
0.07904443144798279,
-0.03419838473200798,
-0.14811833202838898,
-0.009455339051783085,
0.004063909407705069,
0.045962829142808914,
-0.018538085743784904,
0.0526156947016716,
0.0025589801371097565,
-0.04037797078490257,
-0.04587665572762489,
-0.14287683367729187,
0.00016657329979352653,
0.0774252861738205,
-0.03321471065282822,
0.11320216953754425,
0.07641057670116425,
0.003918870817869902,
-0.11356567591428757,
-0.017476722598075867,
-0.05756637081503868,
0.07600092142820358,
-0.02856707014143467,
-0.08077593892812729,
0.04905430227518082,
-0.04876083880662918,
-0.014678671024739742,
-0.11994689702987671,
-0.08507777750492096,
-0.024978382512927055,
0.003337332746013999,
-0.006378944031894207,
-0.01507119182497263,
-0.05562214553356171,
-0.10627125948667526,
-0.006494767032563686,
-0.03945512697100639,
0.02347947098314762,
-0.04691549390554428,
0.05792445316910744,
0.048864178359508514,
0.05637502297759056,
-0.05419904366135597,
0.0836167186498642,
-0.02047053351998329,
-0.03460172191262245,
-0.12439137697219849,
0.09462343156337738,
-0.13311424851417542,
0.07343917340040207,
-0.06446922570466995,
-0.06356674432754517,
-0.023566056042909622,
0.0287078395485878,
0.08516941219568253,
0.08428355306386948,
-0.10933668911457062,
-0.12067501246929169,
0.16938219964504242,
-0.09497341513633728,
-0.06825751066207886,
0.12540441751480103,
0.015336471609771252,
0.018537070602178574,
0.06074284389615059,
0.23952284455299377,
0.08474116027355194,
-0.11218509078025818,
0.03345290571451187,
0.041891828179359436,
0.03662428632378578,
-0.10921453684568405,
0.028768660500645638,
-0.07554006576538086,
0.04912290722131729,
0.029129454866051674,
-0.0051442100666463375,
0.09786566346883774,
-0.03775717318058014,
-0.06088841333985329,
-0.0381401926279068,
-0.11082334071397781,
0.0032172328792512417,
0.042617738246917725,
0.031117672100663185,
-0.0174421276897192,
-0.071770079433918,
0.05695571005344391,
0.11693985760211945,
-0.11993866413831711,
0.07456385344266891,
-0.1009211391210556,
0.09540338069200516,
-0.10130231082439423,
0.001459332648664713,
-0.14379937946796417,
0.07614670693874359,
-0.01744488999247551,
-0.0059156259521842,
0.022775977849960327,
-0.03841667249798775,
0.037611838430166245,
0.02678973786532879,
-0.026443203911185265,
0.004618474282324314,
-0.011811569333076477,
0.014857584610581398,
-0.021256839856505394,
-0.22879308462142944,
-0.03743739798665047,
-0.02337043732404709,
0.13123123347759247,
-0.13160456717014313,
0.019386043772101402,
0.09873760491609573,
0.1295771449804306,
-0.009929221123456955,
-0.020486051216721535,
0.07437874376773834,
0.05928001180291176,
-0.014321363531053066,
-0.0357666015625,
0.036447469145059586,
0.03312196582555771,
-0.08377429842948914,
0.13603034615516663,
-0.1284129023551941,
0.07723814249038696,
0.15256381034851074,
-0.0990438312292099,
0.040519654750823975,
0.07593595236539841,
-0.04280773550271988,
-0.0231137964874506,
-0.024132095277309418,
-0.04760995879769325,
0.15898193418979645,
-0.017656667158007622,
0.17094679176807404,
-0.09001137316226959,
0.0016472518909722567,
0.048708464950323105,
-0.029425088316202164,
0.036339230835437775,
0.06760239601135254,
0.08985864371061325,
-0.009277625940740108,
0.06454924494028091,
-0.07884678989648819,
-0.15299999713897705,
0.2286175936460495,
-0.046450186520814896,
-0.07903335988521576,
0.047970499843358994,
-0.03565402701497078,
0.008016655221581459,
0.13874641060829163,
-0.26178109645843506,
-0.07499341666698456,
0.0240046214312315,
-0.004651952534914017,
0.08119095861911774,
-0.19040609896183014,
0.03047618828713894,
-0.03029177337884903,
-0.08619879931211472,
-0.06847082078456879,
0.07426703721284866,
-0.028867250308394432,
0.0805070772767067,
-0.06180566921830177,
-0.1229952722787857,
-0.0004168061423115432,
-0.0640908032655716,
-0.138876274228096,
0.10428604483604431,
-0.09022553265094757,
-0.2275613397359848,
-0.15743976831436157,
-0.019968995824456215,
-0.088506318628788,
0.0070512983947992325,
0.07104672491550446,
-0.11552266776561737,
-0.026996396481990814,
-0.05294264480471611,
0.030651498585939407,
0.004533695988357067,
0.00952430721372366,
0.09518808126449585,
0.018575290217995644,
0.056533075869083405,
-0.18108344078063965,
0.0004321947635617107,
-0.07467998564243317,
-0.031184718012809753,
-0.05709196999669075,
0.013511224649846554,
0.06677927821874619,
0.18198423087596893,
0.054025061428546906,
0.056401096284389496,
-0.018113834783434868,
0.1251162886619568,
-0.0966658741235733,
-0.029474806040525436,
0.1709383875131607,
-0.005570170935243368,
0.037674516439437866,
0.08158726245164871,
0.07094678282737732,
-0.025295741856098175,
-0.018156886100769043,
0.009270211681723595,
-0.05920577794313431,
-0.21958614885807037,
-0.0846901461482048,
-0.11167886853218079,
-0.06001290678977966,
-0.0292389914393425,
0.048523131757974625,
0.08903580158948898,
-0.0003605806559789926,
-0.008211522363126278,
-0.00026985324802808464,
0.03397615998983383,
0.028787748888134956,
0.11421695351600647,
0.017989613115787506,
0.07719412446022034,
-0.06497536599636078,
-0.006076403893530369,
0.021788468584418297,
0.009776411578059196,
0.17060047388076782,
0.001578176161274314,
0.1627635657787323,
0.05822288617491722,
0.09207814186811447,
0.0453558973968029,
0.0559341236948967,
-0.03467324748635292,
0.012686161324381828,
0.05138503760099411,
-0.11025319993495941,
-0.03223862126469612,
0.028567662462592125,
0.037692099809646606,
0.07313457131385803,
-0.05324668809771538,
0.020697636529803276,
0.03495040163397789,
0.28424689173698425,
0.01102434005588293,
-0.21557749807834625,
-0.08379004150629044,
-0.01922590285539627,
-0.03182551637291908,
0.0071211750619113445,
0.057122498750686646,
0.1383916288614273,
-0.051596954464912415,
0.06650014966726303,
0.005988872144371271,
0.09231264889240265,
-0.04420094937086105,
0.034618329256772995,
-0.0015229409327730536,
0.15375319123268127,
0.02319764904677868,
0.059590477496385574,
-0.24324019253253937,
0.21031861007213593,
0.03988067805767059,
0.10483969748020172,
-0.046047892421483994,
0.037265844643116,
0.0898822769522667,
0.1217084527015686,
0.06720862537622452,
0.019547121599316597,
0.07060703635215759,
-0.07226671278476715,
-0.03656111657619476,
0.019686175510287285,
0.07632242888212204,
0.07548634707927704,
0.012355426326394081,
-0.02367134392261505,
-0.05748771131038666,
0.044403351843357086,
-0.06980332732200623,
-0.17793437838554382,
-0.10375051945447922,
0.05263270437717438,
0.22587938606739044,
0.03197765350341797,
-0.05500872805714607,
-0.11254213750362396,
-0.06623219698667526,
0.08288779109716415,
-0.11007320880889893,
-0.045689910650253296,
-0.10445044934749603,
-0.03239905834197998,
0.0636914074420929,
-0.050203606486320496,
0.0015561239561066031,
0.019282599911093712,
0.14004114270210266,
0.0063363066874444485,
-0.09941042214632034,
0.07663822919130325,
-0.11746218800544739,
-0.11601289361715317,
-0.010579010471701622,
0.20654802024364471,
0.013133654370903969,
0.03359975665807724,
0.013706684112548828,
-0.012383012101054192,
0.0002668607048690319,
-0.06379110366106033,
0.0789271742105484,
0.1821146160364151,
-0.07457451522350311,
0.04721316695213318,
-0.06025708094239235,
-0.044134676456451416,
-0.04996689409017563,
-0.01062522642314434,
0.17390447854995728,
0.03678128868341446,
-0.05410786718130112,
0.11482568830251694,
0.17639832198619843,
-0.10523291677236557,
-0.24557028710842133,
0.04182499274611473,
0.07215897738933563,
0.014224842190742493,
-0.10118991136550903,
-0.1884530484676361,
0.10993988066911697,
-0.018308214843273163,
-0.0390271358191967,
0.10239968448877335,
-0.2750193774700165,
-0.12420602887868881,
0.12526266276836395,
0.030475126579403877,
0.11402972042560577,
-0.11715956777334213,
-0.051935095340013504,
-0.09870612621307373,
-0.12599259614944458,
0.0414191409945488,
-0.22562135756015778,
0.12002887576818466,
0.021634820848703384,
0.0874350517988205,
-0.012624346651136875,
-0.03991902247071266,
0.06549974530935287,
0.06511212140321732,
-0.03760010004043579,
-0.02083946391940117,
0.05988878756761551,
0.1588754653930664,
0.013362770900130272,
0.0838126465678215,
-0.052420876920223236,
0.01615891419351101,
-0.09515485912561417,
-0.025186818093061447,
-0.09113845974206924,
0.08080815523862839,
0.01582467183470726,
-0.021576711907982826,
0.05521281808614731,
-0.04159530624747276,
0.07031712681055069,
0.03664717450737953,
0.10946016758680344,
-0.08485671877861023,
0.14401628077030182,
0.180405855178833,
0.15913988649845123,
-0.17191597819328308,
-0.07785790413618088,
-0.023316148668527603,
-0.019219979643821716,
0.11446475237607956,
-0.06977198272943497,
0.06561795622110367,
0.04090244695544243,
0.07888387143611908,
0.07371875643730164,
0.06425442546606064,
-0.10464023053646088,
0.023526625707745552,
0.023378627374768257,
-0.09881583601236343,
-0.12778538465499878,
-0.058288898319005966,
0.008497733622789383,
-0.016728250309824944,
0.1821703165769577,
0.1786663681268692,
-0.07710754871368408,
-0.003943463321775198,
0.011314334347844124,
0.012713956646621227,
-0.11996541917324066,
0.15577496588230133,
0.12298278510570526,
0.05783124268054962,
-0.13797755539417267,
0.08606749027967453,
-0.013804582878947258,
0.025441104546189308,
0.057272154837846756,
0.10828934609889984,
-0.06934589892625809,
-0.07595322281122208,
-0.09369760006666183,
0.09710343927145004,
0.03424014151096344,
-0.11773510277271271,
-0.14269696176052094,
-0.095390185713768,
0.01060726772993803,
0.17160314321517944,
0.04674316942691803,
-0.019513744860887527,
-0.09055502712726593,
-0.01743411459028721,
-0.10878512263298035,
0.042391661554574966,
-0.022802161052823067,
0.040113452821969986,
-0.13176678121089935,
0.06383652985095978,
0.028237322345376015,
0.05999660864472389,
-0.04158008471131325,
-0.0561995692551136,
-0.06534741073846817,
0.059361182153224945,
-0.1505303829908371,
0.009026896208524704,
-0.07050780951976776,
0.014326753094792366,
0.0063913906924426556,
-0.08362370729446411,
-0.05191190913319588,
0.12069735676050186,
-0.10681957006454468,
-0.0003706739516928792,
0.0027894689701497555,
0.05199011042714119,
-0.11888589709997177,
0.03580614551901817,
-0.027392728254199028,
-0.05485206097364426,
0.10424891114234924,
0.09583891183137894,
-0.12136008590459824,
0.08679036796092987,
-0.20992113649845123,
-0.10656877607107162,
0.08052723109722137,
0.04455661401152611,
0.04951712116599083,
-0.01772192120552063,
-0.01226109080016613,
0.06281719356775284,
0.07702573388814926,
-0.035673338919878006,
0.09156206995248795,
-0.0791083350777626,
-0.018541868776082993,
-0.07362545281648636,
0.008619344793260098,
-0.04504593834280968,
0.06141871586441994,
0.10728133469820023,
0.08315090835094452,
0.12059531360864639,
-0.09279459714889526,
0.04902735725045204,
-0.10995051264762878,
0.022342205047607422,
-0.005917540285736322,
-0.07662241160869598,
-0.06880081444978714,
-0.011823993176221848,
0.09130078554153442,
-0.06578133255243301,
0.1302805244922638,
0.012318169698119164,
0.0637129694223404,
0.02247324399650097,
-0.07482114434242249,
0.006096269004046917,
0.021686013787984848,
0.1714044064283371,
0.052457813173532486,
0.019303085282444954,
0.06059303879737854,
-0.03451967239379883,
0.10510031878948212,
0.09487929940223694,
0.07718442380428314,
0.0659760981798172,
0.08873177319765091,
0.1234913170337677,
0.1991775929927826,
-0.04979062080383301,
-0.08695864677429199,
0.01840648613870144,
-0.03574531897902489,
0.10718465596437454,
-0.09421442449092865,
0.17778435349464417,
0.16471868753433228,
-0.013412325643002987,
0.06340959668159485,
-0.011457019485533237,
-0.057706642895936966,
-0.1423535794019699,
-0.11603788286447525,
-0.043202124536037445,
-0.21275117993354797,
0.016627324745059013,
-0.06994067132472992,
0.026198524981737137,
0.05594766139984131,
0.036861907690763474,
0.020870249718427658,
0.1190686970949173,
0.08266811072826385,
-0.038129206746816635,
0.10059542208909988,
-0.05897584930062294,
-0.014033695682883263,
-0.08417607843875885,
-0.05144748091697693,
0.1023651733994484,
0.012741019949316978,
0.05075349658727646,
0.012822097167372704,
-0.09662772715091705,
0.06318949908018112,
-0.04373641312122345,
-0.08544455468654633,
0.04514220356941223,
-0.008786896243691444,
0.027376236394047737,
0.04553568735718727,
0.07711969316005707,
-0.05611278861761093,
0.011717806570231915,
0.16846323013305664,
-0.10363587737083435,
-0.10022285580635071,
-0.08527467399835587,
0.1471216380596161,
0.011336194351315498,
0.0084039606153965,
-0.04248766973614693,
-0.1017451137304306,
-0.050038278102874756,
0.23976343870162964,
0.2437252253293991,
-0.008488720282912254,
0.035047344863414764,
-0.03312680497765541,
-0.003737029153853655,
-0.058567654341459274,
0.057187024503946304,
0.15215688943862915,
0.11858738213777542,
0.024572007358074188,
-0.04076479747891426,
-0.02675553224980831,
-0.0807061493396759,
-0.0923071801662445,
0.046904731541872025,
-0.05179787799715996,
-0.04752330109477043,
-0.02382609434425831,
0.06675359606742859,
-0.11908215284347534,
-0.19864411652088165,
-0.08840221166610718,
-0.12589190900325775,
-0.09282465279102325,
-0.06794685125350952,
0.07409138977527618,
0.09354259073734283,
0.023558003827929497,
-0.032167140394449234,
-0.016788464039564133,
0.149523064494133,
0.011637252755463123,
-0.10380292683839798,
-0.08745656907558441,
0.030546342954039574,
-0.2804034352302551,
0.06412788480520248,
-0.06023092195391655,
0.052625928074121475,
0.07236990332603455,
0.10663042962551117,
-0.029749024659395218,
0.049092430621385574,
0.028003334999084473,
-0.1623605340719223,
-0.0012246596161276102,
0.1940702199935913,
-0.03117494471371174,
0.09553824365139008,
-0.03398190066218376,
-0.1180775985121727,
-0.0005946806049905717,
-0.09906492382287979,
-0.05269902944564819,
-0.05351322889328003,
-0.0475650280714035,
-0.06330965459346771,
0.08253038674592972,
0.07987961173057556,
-0.06402482092380524,
-0.03854572772979736,
-0.06417325139045715,
-0.0192670077085495,
0.04271450638771057,
-0.12002116441726685,
-0.05412696301937103,
-0.2403395175933838,
-0.0018701300723478198,
-0.05946986749768257,
0.02009378746151924,
-0.1537095308303833,
-0.017701605334877968,
-0.000811650010291487,
-0.07551901042461395,
-0.04676779359579086,
-0.0035672029480338097,
0.12465829402208328,
0.04671744257211685,
-0.03684438392519951,
-0.10979893803596497,
0.02965736947953701,
0.1430930346250534,
-0.1885184496641159,
-0.10078413039445877
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-300m-cv8-nl
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset. In addition, a 6-gram KenLM language model was trained and used for decoding; it was built on the Common Voice 8 train+validation splits.
It achieves the results shown on the right side of the model card (Common Voice 8 test set).
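As a hedged sketch of how a 6-gram KenLM is typically combined with wav2vec2 CTC output via pyctcdecode (the `.arpa` path below is a placeholder, not a file shipped with this repository):

    from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC
    from pyctcdecode import build_ctcdecoder

    model_id = "RuudVelo/wav2vec2-large-xls-r-300m-cv8-nl"
    processor = Wav2Vec2Processor.from_pretrained(model_id)
    model = Wav2Vec2ForCTC.from_pretrained(model_id)

    # order the vocabulary by token id so it lines up with the CTC logit columns
    vocab = [tok for tok, _ in sorted(processor.tokenizer.get_vocab().items(), key=lambda kv: kv[1])]
    decoder = build_ctcdecoder(vocab, kenlm_model_path="nl_6gram.arpa")  # placeholder LM path

    # given `logits` as a (time, vocab) numpy array from model(**inputs).logits[0].numpy():
    # transcription = decoder.decode(logits)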
## Model description
Dutch wav2vec2-xls-r-300m model using Common Voice 8 dataset
## Intended uses & limitations
More information needed
## Training and evaluation data
The model was trained on Dutch Common Voice 8 for 75 epochs. The training set was the Common Voice 8 train split and the evaluation set was the Common Voice 8 validation split. The reported WER is on the Common Voice 8 test split, which was part of neither training nor validation.
## Training procedure
### Training hyperparameters
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.18.1
- Tokenizers 0.11.0
| {"language": ["nl"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "mozilla-foundation/common_voice_8_0", "generated_from_trainer", "nl", "robust-speech-event", "model_for_talk", "hf-asr-leaderboard"], "datasets": ["mozilla-foundation/common_voice_8_0"], "model-index": [{"name": "wav2vec2-large-xls-r-300m-cv8-nl", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Common Voice 8", "type": "mozilla-foundation/common_voice_8_0", "args": "nl"}, "metrics": [{"type": "wer", "value": 14.53, "name": "Test WER"}, {"type": "cer", "value": 4.7, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "nl"}, "metrics": [{"type": "wer", "value": 33.7, "name": "Test WER"}, {"type": "cer", "value": 15.64, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "nl"}, "metrics": [{"type": "wer", "value": 35.19, "name": "Test WER"}]}]}]} | automatic-speech-recognition | RuudVelo/wav2vec2-large-xls-r-300m-cv8-nl | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"mozilla-foundation/common_voice_8_0",
"generated_from_trainer",
"nl",
"robust-speech-event",
"model_for_talk",
"hf-asr-leaderboard",
"dataset:mozilla-foundation/common_voice_8_0",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"nl"
] | TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #nl #robust-speech-event #model_for_talk #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
# wav2vec2-large-xls-r-300m-cv8-nl
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common_voice dataset. In addition, a 6-gram KenLM language model was trained and used for decoding; it was built on the Common Voice 8 train+validation splits.
It achieves the results shown on the right side of the model card (Common Voice 8 test set).
## Model description
Dutch wav2vec2-xls-r-300m model using Common Voice 8 dataset
## Intended uses & limitations
More information needed
## Training and evaluation data
The model was trained on Dutch Common Voice 8 for 75 epochs. The training set was the Common Voice 8 train split and the evaluation set was the Common Voice 8 validation split. The reported WER is on the Common Voice 8 test split, which was part of neither training nor validation.
## Training procedure
### Training hyperparameters
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.18.1
- Tokenizers 0.11.0
| [
"# wav2vec2-large-xls-r-300m-cv8-nl\n\nThis model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common_voice dataset. In addition a 6gram KenLM model was trained and used. The KenLM model was based on train+validation Common Voice 8\nIt achieves results depicted on the rigth side on the model card (testset CV8)",
"## Model description\n\nDutch wav2vec2-xls-r-300m model using Common Voice 8 dataset",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nThe model was trained on Dutch common voice 8 with 75 epochs. The train set consisted of the common voice 8 train set and evaluation set was the common voice 8 validation set. The WER reported is on the common voice 8 test set which was not part of training nor validation (eval)",
"## Training procedure",
"### Training hyperparameters",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.18.1\n- Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #nl #robust-speech-event #model_for_talk #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"# wav2vec2-large-xls-r-300m-cv8-nl\n\nThis model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common_voice dataset. In addition a 6gram KenLM model was trained and used. The KenLM model was based on train+validation Common Voice 8\nIt achieves results depicted on the rigth side on the model card (testset CV8)",
"## Model description\n\nDutch wav2vec2-xls-r-300m model using Common Voice 8 dataset",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nThe model was trained on Dutch common voice 8 with 75 epochs. The train set consisted of the common voice 8 train set and evaluation set was the common voice 8 validation set. The WER reported is on the common voice 8 test set which was not part of training nor validation (eval)",
"## Training procedure",
"### Training hyperparameters",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.18.1\n- Tokenizers 0.11.0"
] | [
117,
103,
22,
12,
71,
3,
7,
38
] | [
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #mozilla-foundation/common_voice_8_0 #generated_from_trainer #nl #robust-speech-event #model_for_talk #hf-asr-leaderboard #dataset-mozilla-foundation/common_voice_8_0 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# wav2vec2-large-xls-r-300m-cv8-nl\n\nThis model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common_voice dataset. In addition a 6gram KenLM model was trained and used. The KenLM model was based on train+validation Common Voice 8\nIt achieves results depicted on the rigth side on the model card (testset CV8)## Model description\n\nDutch wav2vec2-xls-r-300m model using Common Voice 8 dataset## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nThe model was trained on Dutch common voice 8 with 75 epochs. The train set consisted of the common voice 8 train set and evaluation set was the common voice 8 validation set. The WER reported is on the common voice 8 test set which was not part of training nor validation (eval)## Training procedure### Training hyperparameters### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.1+cu102\n- Datasets 1.18.1\n- Tokenizers 0.11.0"
] | [
-0.11757257580757141,
0.10258704423904419,
-0.004565280396491289,
-0.004500807728618383,
0.034526459872722626,
0.003411882556974888,
0.14583176374435425,
0.1254976987838745,
-0.024238351732492447,
0.06112528592348099,
0.0005320091149769723,
-0.053626351058483124,
0.09529378265142441,
-0.005911106243729591,
0.0661824643611908,
-0.2442784160375595,
0.09018205106258392,
-0.0439939983189106,
0.04949904978275299,
0.07218223810195923,
0.1391286551952362,
-0.05022423714399338,
0.035226382315158844,
0.0670771673321724,
-0.08525889366865158,
0.027433043345808983,
-0.0031873760744929314,
-0.0395987331867218,
0.1033683493733406,
0.03605053946375847,
0.057688158005476,
0.030931593850255013,
0.08286222815513611,
-0.24677243828773499,
0.0002464373828843236,
0.027309855446219444,
-0.008650255389511585,
0.03075576014816761,
0.0972253680229187,
-0.031749971210956573,
0.08953717350959778,
0.03881405293941498,
0.040603555738925934,
0.07364442944526672,
-0.02330034412443638,
-0.24900060892105103,
-0.10272271931171417,
0.06454414129257202,
0.05320223793387413,
0.11120329797267914,
-0.036760225892066956,
0.026275089010596275,
-0.05003880336880684,
0.09406980872154236,
0.17831438779830933,
-0.21704094111919403,
-0.03628973290324211,
0.10442229360342026,
0.07446557283401489,
0.032876309007406235,
-0.09556757658720016,
-0.0038301507011055946,
0.02601991966366768,
0.030404118821024895,
-0.019771799445152283,
-0.0159935113042593,
-0.0886346623301506,
-0.028095869347453117,
-0.09313841164112091,
-0.028251072391867638,
0.18479210138320923,
0.015619320794939995,
-0.07378924638032913,
-0.1066928282380104,
0.016862617805600166,
-0.09275127202272415,
0.04398247227072716,
-0.06131158769130707,
0.0020813345909118652,
-0.011447766795754433,
-0.0038883513770997524,
-0.016950175166130066,
-0.0736885592341423,
-0.049153465777635574,
0.02246355265378952,
0.06516986340284348,
0.03517729043960571,
0.013436338864266872,
-0.016102259978652,
0.08343641459941864,
-0.14290112257003784,
-0.1069321259856224,
-0.044909168034791946,
0.0174553282558918,
-0.0764518454670906,
0.010922765359282494,
-0.00545324245467782,
-0.15629436075687408,
0.02776188775897026,
0.07708319276571274,
0.001847701845690608,
0.058736417442560196,
-0.07935326546430588,
0.026930952444672585,
0.030664484947919846,
0.1802084892988205,
-0.06671911478042603,
-0.049199458211660385,
-0.04538431018590927,
0.013958514668047428,
0.007647753227502108,
-0.0209996048361063,
-0.09864702820777893,
0.021555356681346893,
0.14360280334949493,
0.06864717602729797,
-0.015052326023578644,
-0.05983635410666466,
-0.02252119779586792,
-0.017562413588166237,
0.0994592234492302,
-0.1316230148077011,
0.05956529080867767,
0.03260378539562225,
0.015802476555109024,
0.06010245904326439,
0.018421702086925507,
0.006137864198535681,
-0.0491211824119091,
0.09485358744859695,
0.0021608304232358932,
0.020615573972463608,
-0.06599681079387665,
-0.10070591419935226,
0.044051337987184525,
-0.042443688958883286,
-0.04255295917391777,
-0.03577210009098053,
-0.09234703332185745,
-0.0964806079864502,
0.005081797484308481,
-0.0673140361905098,
-0.02213558927178383,
-0.07089363783597946,
-0.08299897611141205,
0.02255215309560299,
0.004413592629134655,
0.02086338773369789,
-0.004187951795756817,
0.015016419813036919,
-0.025004714727401733,
0.007149395067244768,
0.022690612822771072,
0.045025791972875595,
-0.041442275047302246,
0.01561631541699171,
-0.09008096903562546,
0.10115980356931686,
-0.1093074306845665,
-0.12585903704166412,
-0.05344890058040619,
-0.08687388151884079,
-0.02650594525039196,
0.032125912606716156,
0.06362831592559814,
0.035284750163555145,
-0.26088619232177734,
-0.07131495326757431,
0.17079438269138336,
-0.15903863310813904,
-0.010819670744240284,
0.14641928672790527,
-0.010305674746632576,
0.015827365219593048,
0.04502319172024727,
0.13090679049491882,
0.1901191920042038,
-0.14360228180885315,
-0.04328775405883789,
-0.023983217775821686,
0.059465356171131134,
0.11195634305477142,
0.03674573078751564,
-0.07857976108789444,
0.0036146207712590694,
0.04544443637132645,
-0.03476974368095398,
-0.05316530168056488,
-0.047501057386398315,
-0.031201614066958427,
0.0039000422693789005,
-0.021609308198094368,
0.0794229656457901,
0.013232276774942875,
-0.061820097267627716,
-0.08256731182336807,
-0.12620213627815247,
0.07919865101575851,
0.1246427446603775,
-0.07317527383565903,
0.028600191697478294,
-0.08737064898014069,
0.035801324993371964,
-0.055311258882284164,
-0.05132367089390755,
-0.12775053083896637,
0.005222270730882883,
0.06540576368570328,
-0.06723493337631226,
0.041973285377025604,
0.12748143076896667,
0.04251328483223915,
0.022874338552355766,
-0.09009844809770584,
-0.04826727509498596,
-0.0927014946937561,
-0.008890757337212563,
-0.031651005148887634,
-0.156911700963974,
-0.07087141275405884,
-0.04204367473721504,
0.09032787382602692,
-0.09996870160102844,
-0.019735252484679222,
0.11694011092185974,
0.12213215231895447,
0.0026105688884854317,
-0.06428487598896027,
0.01528371311724186,
0.07967501878738403,
0.002925547305494547,
-0.0011082717683166265,
-0.04678073525428772,
0.045504696667194366,
-0.02533852495253086,
0.0984501987695694,
-0.1730777472257614,
-0.06671225279569626,
0.03276881203055382,
0.03928486257791519,
-0.04951752722263336,
0.09114447236061096,
-0.02932159975171089,
-0.02999565750360489,
-0.09892470389604568,
-0.09794473648071289,
0.14305230975151062,
-0.00536729721352458,
0.08229462802410126,
-0.09308670461177826,
-0.05094219371676445,
0.028500964865088463,
-0.01826014183461666,
-0.03696605563163757,
0.070892333984375,
-0.06510426849126816,
0.08152271062135696,
0.07564155012369156,
-0.06249017268419266,
-0.034536637365818024,
0.19223783910274506,
-0.0012877138797193766,
-0.042592182755470276,
-0.08733990043401718,
-0.03230307251214981,
-0.006956469267606735,
0.1508755385875702,
-0.16836698353290558,
0.0018168456153944135,
0.039836298674345016,
0.074766606092453,
0.050277456641197205,
-0.12576808035373688,
0.05165553092956543,
0.034717317670583725,
-0.10411840677261353,
0.0666772648692131,
0.03744399547576904,
-0.010359121486544609,
0.04855150356888771,
-0.06376897543668747,
-0.06056944653391838,
-0.015363084152340889,
-0.055859167128801346,
-0.14038391411304474,
0.10661422461271286,
-0.1395115852355957,
-0.24645191431045532,
-0.1957431435585022,
0.06151124835014343,
-0.09517980366945267,
-0.016196461394429207,
0.07649445533752441,
-0.07938284426927567,
-0.041740305721759796,
-0.07467392832040787,
-0.014288729056715965,
-0.09822018444538116,
-0.018442368134856224,
-0.024751855060458183,
-0.007885793223977089,
0.06973780691623688,
-0.14684247970581055,
-0.009623352438211441,
-0.01514363195747137,
-0.09912669658660889,
-0.0011073247296735644,
0.026015331968665123,
0.01866418682038784,
0.1593858003616333,
0.013698180206120014,
0.019804099574685097,
-0.009332442656159401,
0.14844422042369843,
-0.0929255560040474,
0.06820561736822128,
0.11156084388494492,
-0.03009842522442341,
0.030360255390405655,
0.1107623502612114,
0.025221185758709908,
-0.031011776998639107,
-0.01725653000175953,
0.03571580722928047,
-0.04818568006157875,
-0.30764198303222656,
-0.07488246262073517,
-0.03981207683682442,
-0.005399184767156839,
-0.056123677641153336,
0.0608113631606102,
0.14192460477352142,
0.007678365334868431,
-0.04140780493617058,
-0.07032322883605957,
0.07363726198673248,
0.04269033670425415,
0.08620817214250565,
0.009792209602892399,
0.07390619814395905,
-0.04413488879799843,
0.029852859675884247,
0.06108410656452179,
-0.0200046319514513,
0.11505551636219025,
0.06549757719039917,
0.07635224610567093,
0.09152229875326157,
0.10717575252056122,
0.05706636980175972,
0.026777738705277443,
-0.02948106825351715,
0.01676860824227333,
0.04451395943760872,
-0.09351059049367905,
0.004349111579358578,
0.026489030569791794,
0.051933977752923965,
0.022035956382751465,
-0.03843316063284874,
0.01814858987927437,
0.021947674453258514,
0.2054453045129776,
0.09104672819375992,
-0.11101437360048294,
-0.07507804036140442,
-0.031244952231645584,
-0.042755573987960815,
-0.07538606971502304,
-0.006357365287840366,
0.15621323883533478,
-0.15455937385559082,
0.09527837485074997,
0.010033907368779182,
0.05929439514875412,
-0.07893653213977814,
-0.0029261379968374968,
0.018703820183873177,
0.07882285863161087,
-0.0008887003641575575,
0.10235241055488586,
-0.2186720371246338,
0.16528251767158508,
0.0034236270003020763,
0.0737362802028656,
-0.040743470191955566,
0.031358879059553146,
-0.025924742221832275,
-0.0024288923013955355,
0.09964101016521454,
0.055558860301971436,
-0.132771298289299,
-0.007962487637996674,
-0.05930337309837341,
-0.012049704790115356,
0.10151199251413345,
-0.03211052343249321,
0.03599882870912552,
0.008647498674690723,
-0.04832788184285164,
-0.016439251601696014,
-0.007144318427890539,
-0.1461014300584793,
-0.07722592353820801,
0.05976049229502678,
0.08133724331855774,
0.053495049476623535,
-0.09346266090869904,
-0.08826666325330734,
-0.11051326245069504,
0.15976069867610931,
-0.17197149991989136,
-0.028535259887576103,
-0.09150125086307526,
-0.010132875293493271,
0.13725268840789795,
-0.0607878603041172,
0.00030222369241528213,
0.028951866552233696,
0.20728111267089844,
-0.032680824398994446,
0.0005401652888394892,
0.033652618527412415,
-0.10722308605909348,
-0.2194744348526001,
-0.039620593190193176,
0.24201518297195435,
0.09017471224069595,
0.08614880591630936,
0.0022528087720274925,
0.03648258000612259,
0.04941529780626297,
-0.06768016517162323,
0.047431573271751404,
0.1777389794588089,
-0.1493002325296402,
0.08366905152797699,
0.0034221382811665535,
-0.06153038144111633,
-0.06664520502090454,
-0.0619339644908905,
0.08781497180461884,
0.0588020496070385,
-0.07631581276655197,
0.1439906507730484,
0.12925444543361664,
-0.15394385159015656,
-0.19278544187545776,
0.003424472641199827,
0.11757775396108627,
0.06797116994857788,
-0.007067952770739794,
-0.17213961482048035,
0.06456208229064941,
0.073113352060318,
-0.03701126575469971,
-0.06681561470031738,
-0.23730690777301788,
-0.15426279604434967,
0.07074738293886185,
-0.07555700093507767,
-0.005302969366312027,
-0.023157693445682526,
-0.07973382622003555,
-0.06082174926996231,
-0.12105152010917664,
-0.040765341371297836,
-0.012293614447116852,
0.043898630887269974,
0.0037819924764335155,
0.0300394669175148,
0.038112252950668335,
-0.00843203254044056,
0.12919971346855164,
0.12311798334121704,
0.04805118963122368,
-0.03732725977897644,
0.06945054978132248,
0.05496221035718918,
-0.01983337290585041,
0.14301148056983948,
-0.017997216433286667,
0.015100602060556412,
-0.14044173061847687,
-0.03211890161037445,
-0.10022711753845215,
0.033721212297677994,
-0.06370758265256882,
-0.012964334338903427,
-0.05381208658218384,
0.02194729447364807,
0.019623305648565292,
-0.0035057186614722013,
-0.008830469101667404,
-0.07152561843395233,
0.0743103101849556,
0.21125878393650055,
0.16981731355190277,
0.012278524227440357,
-0.1485935002565384,
0.00711521040648222,
-0.025381961837410927,
0.0410640724003315,
0.02699442394077778,
0.039637550711631775,
0.08468867838382721,
0.02610202133655548,
0.16310319304466248,
-0.02356800064444542,
-0.19339951872825623,
-0.006839302834123373,
0.01795957237482071,
-0.04755980893969536,
-0.19173848628997803,
-0.05051747336983681,
0.10118347406387329,
-0.09691798686981201,
0.0034958089236170053,
0.14801238477230072,
-0.04036356881260872,
-0.0321955569088459,
-0.018735399469733238,
0.05617021769285202,
-0.04467716068029404,
0.1669192612171173,
0.042146384716033936,
0.10768328607082367,
-0.08070466667413712,
0.09054473042488098,
0.047879885882139206,
-0.01476321928203106,
0.07623603940010071,
0.037614189088344574,
-0.041870974004268646,
-0.027636554092168808,
0.006024354137480259,
0.1653684377670288,
-0.008171135559678078,
-0.1403607279062271,
-0.05248238518834114,
-0.16433997452259064,
0.007881594821810722,
0.1041034534573555,
0.03127152472734451,
0.05205223336815834,
0.01898459903895855,
-0.007362238131463528,
-0.09525953978300095,
0.12657193839550018,
0.05332905054092407,
0.07534729689359665,
-0.08212386816740036,
0.03133571147918701,
0.041243504732847214,
0.040265101939439774,
-0.0029749071691185236,
-0.05468693748116493,
-0.0089534567669034,
0.018062923103570938,
-0.13753068447113037,
0.0009873755043372512,
-0.031121281906962395,
-0.03554842248558998,
0.02536752074956894,
-0.07155705988407135,
0.010891900397837162,
0.07024557888507843,
-0.06762965023517609,
-0.0026220555882900953,
-0.045235972851514816,
0.03491277992725372,
-0.08593657612800598,
0.031840141862630844,
0.037582769989967346,
-0.10448016971349716,
0.08726513385772705,
0.05502016469836235,
-0.018312968313694,
0.07517072558403015,
-0.10721209645271301,
-0.018121549859642982,
-0.01856295019388199,
0.04143095016479492,
0.01324150338768959,
-0.15579088032245636,
-0.012746211141347885,
0.05625924468040466,
-0.009910657070577145,
0.011587361805140972,
0.026168234646320343,
-0.09354065358638763,
0.02549775317311287,
-0.02535400725901127,
-0.06269370764493942,
-0.012692810967564583,
0.07544871419668198,
0.0879913941025734,
0.03571662679314613,
0.104093536734581,
-0.09867187589406967,
0.09514330327510834,
-0.17143388092517853,
0.0017607973422855139,
-0.013750585727393627,
0.021824870258569717,
-0.035716746002435684,
-0.018154073506593704,
0.0663883239030838,
-0.03345859795808792,
0.09377925097942352,
0.03916619345545769,
0.0222234595566988,
0.00840004999190569,
-0.06037766486406326,
-0.04776309058070183,
0.029984816908836365,
-0.0021171346306800842,
-0.010275877080857754,
-0.015743404626846313,
-0.008177521638572216,
-0.09894068539142609,
-0.028613219037652016,
0.03498590365052223,
0.09394283592700958,
0.07332323491573334,
0.1344643533229828,
0.039877187460660934,
0.15827569365501404,
-0.04730089381337166,
-0.06195426359772682,
-0.01574336364865303,
-0.10008852928876877,
0.07878459990024567,
-0.0051060570403933525,
0.12320616841316223,
0.1283445805311203,
-0.1734035313129425,
0.1653304547071457,
0.012816041707992554,
-0.08989495038986206,
-0.11849958449602127,
-0.18140295147895813,
-0.10500390827655792,
-0.12567143142223358,
0.016021601855754852,
-0.11085538566112518,
0.03659335896372795,
0.07395651191473007,
0.05609133839607239,
0.00262266444042325,
0.10747429728507996,
-0.06907352060079575,
-0.016601504758000374,
0.07906684279441833,
-0.0038926047272980213,
0.030780820176005363,
-0.008291502483189106,
0.04278542846441269,
0.10825128108263016,
0.1037510558962822,
0.06538963317871094,
0.0406816266477108,
0.06378934532403946,
-0.022936375811696053,
0.006472076755017042,
-0.0877402126789093,
0.04083746671676636,
-0.08596234023571014,
0.04157107695937157,
0.12805692851543427,
0.07439159601926804,
-0.023775294423103333,
0.0022573918104171753,
0.10033223778009415,
-0.06050889566540718,
-0.11147657781839371,
-0.12629173696041107,
0.12342385202646255,
0.0008233192493207753,
0.0348224975168705,
0.03869403526186943,
-0.06888101994991302,
0.020017603412270546,
0.10533406585454941,
0.1962822526693344,
0.12864987552165985,
0.006564149167388678,
0.0032310890965163708,
-0.025796856731176376,
0.004648213740438223,
-0.015389894135296345,
0.03840438276529312,
0.20762845873832703,
0.005055975168943405,
0.10347210615873337,
-0.04676392674446106,
-0.09251850843429565,
0.008616888895630836,
0.03795831277966499,
-0.06242799386382103,
-0.07311786711215973,
-0.02578515186905861,
0.10671461373567581,
-0.006071510724723339,
-0.188373863697052,
0.038080934435129166,
-0.0993470549583435,
-0.12633183598518372,
0.010870973579585552,
0.11906114965677261,
0.09005444496870041,
0.017124375328421593,
-0.00012909767974633723,
-0.022206656634807587,
0.22856643795967102,
-0.0045040203258395195,
-0.032597459852695465,
-0.11961415410041809,
0.036388520151376724,
-0.13738298416137695,
0.15633930265903473,
0.02253023348748684,
0.1426582634449005,
0.07439295947551727,
0.04872288927435875,
-0.06778863817453384,
0.07461054623126984,
0.029280956834554672,
-0.05585845559835434,
-0.0035741771571338177,
0.23765414953231812,
0.005992955528199673,
0.11923903226852417,
0.048240650445222855,
-0.07260402292013168,
0.007745479699224234,
-0.05738253891468048,
-0.039339594542980194,
-0.06552565097808838,
0.09279511123895645,
-0.057174839079380035,
0.09539178758859634,
0.1484559029340744,
-0.0706244483590126,
-0.028117096051573753,
-0.04951663687825203,
-0.00425585824996233,
-0.011977623216807842,
0.09334670007228851,
0.03587048873305321,
-0.2092314213514328,
0.06414096057415009,
-0.028432302176952362,
0.025372402742505074,
-0.21619993448257446,
-0.08939297497272491,
0.057696450501680374,
-0.07267611473798752,
-0.009729334153234959,
0.07052791118621826,
0.007735304534435272,
0.04494139179587364,
-0.039534684270620346,
0.08771862089633942,
0.004446785897016525,
0.11285889893770218,
-0.11125510931015015,
-0.04194438084959984
] |
null | null | transformers |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-300m-nl
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
It achieves the following results on the test set:
- Loss: 0.3923
- Wer: 0.1748
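For quick inference with this checkpoint, a minimal sketch is shown below (not part of the original card; the audio path is hypothetical, and a 16 kHz mono clip is assumed since that is the rate XLS-R models expect):
```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_name = "RuudVelo/wav2vec2-large-xls-r-300m-nl"
processor = Wav2Vec2Processor.from_pretrained(model_name)
model = Wav2Vec2ForCTC.from_pretrained(model_name)

# Load a local clip (hypothetical path) and resample to 16 kHz
speech, rate = torchaudio.load("example_nl.wav")
speech = torchaudio.transforms.Resample(orig_freq=rate, new_freq=16_000)(speech).squeeze(0)

inputs = processor(speech.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```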
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 7.5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 50
- mixed_precision_training: Native AMP
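These values map directly onto Hugging Face `TrainingArguments`; a hypothetical reconstruction is sketched below (the output directory and any argument not listed above are assumptions, not taken from the original run):
```python
from transformers import TrainingArguments

# Sketch of the run configuration listed above; only the stated values
# are authoritative. Adam betas=(0.9, 0.999) and epsilon=1e-08 are the
# optimizer defaults, so they need no explicit arguments here.
training_args = TrainingArguments(
    output_dir="./wav2vec2-large-xls-r-300m-nl",  # assumed
    learning_rate=7.5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=50,
    fp16=True,  # mixed precision ("Native AMP")
)
```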
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 1.5787 | 0.89 | 400 | 0.6354 | 0.5643 |
| 0.3036 | 1.78 | 800 | 0.3690 | 0.3552 |
| 0.188 | 2.67 | 1200 | 0.3239 | 0.2958 |
| 0.1434 | 3.56 | 1600 | 0.3093 | 0.2515 |
| 0.1245 | 4.44 | 2000 | 0.3024 | 0.2433 |
| 0.1095 | 5.33 | 2400 | 0.3249 | 0.2643 |
| 0.0979 | 6.22 | 2800 | 0.3191 | 0.2281 |
| 0.0915 | 7.11 | 3200 | 0.3152 | 0.2216 |
| 0.0829 | 8.0 | 3600 | 0.3419 | 0.2218 |
| 0.0777 | 8.89 | 4000 | 0.3432 | 0.2132 |
| 0.073 | 9.78 | 4400 | 0.3223 | 0.2131 |
| 0.0688 | 10.67 | 4800 | 0.3094 | 0.2152 |
| 0.0647 | 11.56 | 5200 | 0.3411 | 0.2152 |
| 0.0639 | 12.44 | 5600 | 0.3762 | 0.2135 |
| 0.0599 | 13.33 | 6000 | 0.3790 | 0.2137 |
| 0.0572 | 14.22 | 6400 | 0.3693 | 0.2118 |
| 0.0563 | 15.11 | 6800 | 0.3495 | 0.2139 |
| 0.0521 | 16.0 | 7200 | 0.3800 | 0.2023 |
| 0.0508 | 16.89 | 7600 | 0.3678 | 0.2033 |
| 0.0513 | 17.78 | 8000 | 0.3845 | 0.1987 |
| 0.0476 | 18.67 | 8400 | 0.3511 | 0.2037 |
| 0.045 | 19.56 | 8800 | 0.3794 | 0.1994 |
| 0.044 | 20.44 | 9200 | 0.3525 | 0.2050 |
| 0.043 | 21.33 | 9600 | 0.4082 | 0.2007 |
| 0.0409 | 22.22 | 10000 | 0.3866 | 0.2004 |
| 0.0393 | 23.11 | 10400 | 0.3899 | 0.2008 |
| 0.0382 | 24.0 | 10800 | 0.3626 | 0.1951 |
| 0.039 | 24.89 | 11200 | 0.3936 | 0.1953 |
| 0.0361 | 25.78 | 11600 | 0.4262 | 0.1928 |
| 0.0362 | 26.67 | 12000 | 0.3796 | 0.1934 |
| 0.033 | 27.56 | 12400 | 0.3616 | 0.1934 |
| 0.0321 | 28.44 | 12800 | 0.3742 | 0.1933 |
| 0.0325 | 29.33 | 13200 | 0.3582 | 0.1869 |
| 0.0309 | 30.22 | 13600 | 0.3717 | 0.1874 |
| 0.029 | 31.11 | 14000 | 0.3814 | 0.1894 |
| 0.0296 | 32.0 | 14400 | 0.3698 | 0.1877 |
| 0.0281 | 32.89 | 14800 | 0.3976 | 0.1899 |
| 0.0275 | 33.78 | 15200 | 0.3854 | 0.1858 |
| 0.0264 | 34.67 | 15600 | 0.4021 | 0.1889 |
| 0.0261 | 35.56 | 16000 | 0.3850 | 0.1830 |
| 0.0242 | 36.44 | 16400 | 0.4091 | 0.1878 |
| 0.0245 | 37.33 | 16800 | 0.4012 | 0.1846 |
| 0.0243 | 38.22 | 17200 | 0.3996 | 0.1833 |
| 0.0223 | 39.11 | 17600 | 0.3962 | 0.1815 |
| 0.0223 | 40.0 | 18000 | 0.3898 | 0.1832 |
| 0.0219 | 40.89 | 18400 | 0.4019 | 0.1822 |
| 0.0211 | 41.78 | 18800 | 0.4035 | 0.1809 |
| 0.021 | 42.67 | 19200 | 0.3915 | 0.1826 |
| 0.0208 | 43.56 | 19600 | 0.3934 | 0.1784 |
| 0.0188 | 44.44 | 20000 | 0.3912 | 0.1787 |
| 0.0195 | 45.33 | 20400 | 0.3989 | 0.1766 |
| 0.0186 | 46.22 | 20800 | 0.3887 | 0.1773 |
| 0.0188 | 47.11 | 21200 | 0.3982 | 0.1758 |
| 0.0175 | 48.0 | 21600 | 0.3933 | 0.1755 |
| 0.0172 | 48.89 | 22000 | 0.3921 | 0.1749 |
| 0.0187 | 49.78 | 22400 | 0.3923 | 0.1748 |
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
| {"language": ["nl"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "common_voice", "generated_from_trainer", "hf-asr-leaderboard", "model_for_talk", "nl", "robust-speech-event"], "datasets": ["common_voice"], "model-index": [{"name": "wav2vec2-large-xls-r-300m-nl", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "common_voice", "type": "common_voice", "args": "nl"}, "metrics": [{"type": "wer", "value": 17.17, "name": "Test WER"}, {"type": "cer", "value": 5.13, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "nl"}, "metrics": [{"type": "wer", "value": 35.76, "name": "Test WER"}, {"type": "cer", "value": 13.99, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "nl"}, "metrics": [{"type": "wer", "value": 37.19, "name": "Test WER"}]}]}]} | automatic-speech-recognition | RuudVelo/wav2vec2-large-xls-r-300m-nl | [
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"common_voice",
"generated_from_trainer",
"hf-asr-leaderboard",
"model_for_talk",
"nl",
"robust-speech-event",
"dataset:common_voice",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"nl"
] | TAGS
#transformers #pytorch #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #hf-asr-leaderboard #model_for_talk #nl #robust-speech-event #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
| wav2vec2-large-xls-r-300m-nl
============================
This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the common\_voice dataset.
It achieves the following results on the test set:
* Loss: 0.3923
* Wer: 0.1748
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 7.5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 8
* seed: 42
* gradient\_accumulation\_steps: 2
* total\_train\_batch\_size: 32
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 50
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.16.0.dev0
* Pytorch 1.10.1+cu102
* Datasets 1.17.1.dev0
* Tokenizers 0.11.0
| [
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 50\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
"TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #hf-asr-leaderboard #model_for_talk #nl #robust-speech-event #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 50\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
97,
159,
4,
41
] | [
"passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #common_voice #generated_from_trainer #hf-asr-leaderboard #model_for_talk #nl #robust-speech-event #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 50\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.17.1.dev0\n* Tokenizers 0.11.0"
] | [
-0.12411770969629288,
0.13828715682029724,
-0.005255791824311018,
0.06461717933416367,
0.08853042125701904,
0.004980088677257299,
0.08617793023586273,
0.1520177572965622,
-0.05570570379495621,
0.10699853301048279,
0.0866941437125206,
0.07255372405052185,
0.06973950564861298,
0.13898125290870667,
-0.019859077408909798,
-0.2825040817260742,
0.012120923958718777,
-0.033878132700920105,
-0.09232766926288605,
0.08834287524223328,
0.08248298615217209,
-0.09760065376758575,
0.027103355154395103,
0.013586536981165409,
-0.04714685305953026,
-0.012306475080549717,
-0.04982582479715347,
-0.05315126106142998,
0.09796606749296188,
0.02869976870715618,
0.029763806611299515,
0.020171675831079483,
0.10374853014945984,
-0.2599635124206543,
0.006985626649111509,
0.05259283632040024,
0.037637196481227875,
0.06633137166500092,
0.09795695543289185,
-0.00443910202011466,
0.10887343436479568,
-0.06076936796307564,
0.06655629724264145,
0.047920506447553635,
-0.09482213854789734,
-0.24779638648033142,
-0.07032153755426407,
0.04187903553247452,
0.14434662461280823,
0.0744185596704483,
-0.03085635043680668,
0.025835566222667694,
-0.08735417574644089,
0.0957978367805481,
0.19060207903385162,
-0.22176524996757507,
-0.04442534223198891,
-0.029282178729772568,
0.020446112379431725,
0.04641316831111908,
-0.09706559032201767,
-0.041682105511426926,
-0.001506212749518454,
0.024243837222456932,
0.09370987862348557,
-0.001065156189724803,
0.019863462075591087,
0.0023528789170086384,
-0.1454254537820816,
-0.04359658062458038,
0.0984262228012085,
0.0743684321641922,
-0.0026217757258564234,
-0.1237700805068016,
-0.038618847727775574,
-0.19447696208953857,
-0.056342195719480515,
0.019065245985984802,
0.023964745923876762,
-0.042129408568143845,
-0.04285011813044548,
0.040876563638448715,
-0.03821786865592003,
-0.08048578351736069,
0.059167325496673584,
0.10993678122758865,
0.04293300211429596,
-0.036673061549663544,
0.008004934526979923,
0.08948632329702377,
0.044904496520757675,
-0.1624433845281601,
-0.006316433195024729,
0.043477728962898254,
-0.11017151921987534,
0.004127958323806524,
0.007302715443074703,
0.00206603086553514,
0.048806119710206985,
0.11470410972833633,
-0.015046443790197372,
0.10374531149864197,
0.03870970755815506,
-0.005282965488731861,
-0.06824902445077896,
0.17083747684955597,
-0.05401446297764778,
-0.10044301301240921,
-0.04576418176293373,
0.13617242872714996,
0.0016816381830722094,
-0.006745876744389534,
-0.07650698721408844,
0.0259848665446043,
0.0923113077878952,
0.050950948148965836,
-0.011333059519529343,
0.015017719939351082,
-0.06292125582695007,
-0.021166473627090454,
0.05145492032170296,
-0.13463136553764343,
0.03961199149489403,
0.06528492271900177,
-0.07437869161367416,
-0.03241356834769249,
0.0030073216184973717,
-0.002725853817537427,
-0.05141858756542206,
0.09848953038454056,
-0.047281887382268906,
-0.019297288730740547,
-0.0553659163415432,
-0.08506137877702713,
0.02393305115401745,
-0.04341249540448189,
-0.0059003704227507114,
-0.033011648803949356,
-0.08491474390029907,
-0.07105077803134918,
0.03546289727091789,
-0.06734902411699295,
-0.07443635910749435,
-0.09916319698095322,
-0.08327528834342957,
0.05471248924732208,
-0.021991552785038948,
0.1315782219171524,
-0.06675588339567184,
0.07737653702497482,
0.028596019372344017,
0.057600513100624084,
0.10837728530168533,
0.07781872898340225,
-0.023117486387491226,
0.05388995260000229,
-0.13970190286636353,
0.12094231694936752,
-0.10935472697019577,
0.03870747983455658,
-0.13523495197296143,
-0.10375240445137024,
-0.0009646888938732445,
0.0015567797236144543,
0.09986045956611633,
0.12396339327096939,
-0.17459119856357574,
-0.0910661369562149,
0.18535448610782623,
-0.03683631122112274,
-0.07744043320417404,
0.12491806596517563,
-0.019724085927009583,
-0.06199992075562477,
0.02181527577340603,
0.19135254621505737,
0.11848031729459763,
-0.09932991117238998,
0.015816310420632362,
-0.026802346110343933,
0.10665250569581985,
0.03333066403865814,
0.06631476432085037,
-0.06271335482597351,
0.07175516337156296,
-0.00011120849376311526,
-0.03347206115722656,
0.0511600561439991,
-0.06292302161455154,
-0.07668497413396835,
-0.0030148858204483986,
-0.07248397171497345,
-0.0053406888619065285,
0.043975669890642166,
0.0032651990186423063,
-0.05639491602778435,
-0.13114310801029205,
-0.012849207036197186,
0.10707864165306091,
-0.10961253196001053,
0.01579609140753746,
-0.07403295487165451,
0.041437771171331406,
-0.00487833796069026,
0.0019207980949431658,
-0.13920946419239044,
0.006789847277104855,
0.03636178374290466,
-0.076225146651268,
0.023232897743582726,
-0.017194518819451332,
0.08158908784389496,
0.048728011548519135,
-0.02622021548449993,
-0.07079887390136719,
-0.019659124314785004,
0.00055742880795151,
-0.05930222570896149,
-0.2376314103603363,
-0.06836612522602081,
-0.019315915182232857,
0.1632077544927597,
-0.21093860268592834,
-0.0016504325903952122,
0.03824324160814285,
0.12201949954032898,
0.016398150473833084,
-0.05023616552352905,
0.036440081894397736,
0.05504639074206352,
-0.0032499213702976704,
-0.07905749976634979,
0.020634939894080162,
-0.0011224483605474234,
-0.11068765819072723,
0.013547915033996105,
-0.1678377240896225,
0.042229488492012024,
0.07220757007598877,
0.04899135231971741,
-0.07101897150278091,
-0.029746944084763527,
-0.04812140762805939,
-0.05933690443634987,
-0.009432215243577957,
0.006831411272287369,
0.20510348677635193,
0.043822746723890305,
0.1166362538933754,
-0.06159990653395653,
-0.057378899306058884,
0.03145993873476982,
0.01801760494709015,
0.022008005529642105,
0.16917946934700012,
0.04749694839119911,
-0.007704040966928005,
0.07888523489236832,
0.009103295393288136,
-0.049696993082761765,
0.13943526148796082,
-0.06836576014757156,
-0.09289330989122391,
-0.026062406599521637,
0.021451054140925407,
0.017236661165952682,
0.10690136253833771,
-0.1607041358947754,
-0.020493242889642715,
0.0324113555252552,
0.023531736806035042,
0.010412247851490974,
-0.17697229981422424,
0.00616546580567956,
0.0411493219435215,
-0.09303776174783707,
-0.02912122942507267,
0.0028177774511277676,
-0.005701088346540928,
0.07452033460140228,
0.0024309135042130947,
-0.11106525361537933,
-0.03841965273022652,
-0.04576825723052025,
-0.08601581305265427,
0.16600771248340607,
-0.08898723870515823,
-0.14420679211616516,
-0.11329568922519684,
-0.018388865515589714,
-0.012386603280901909,
-0.015997804701328278,
0.02887517772614956,
-0.11045730113983154,
-0.05484522506594658,
-0.06520139425992966,
0.03445541486144066,
-0.020049016922712326,
0.011324113234877586,
0.020545998588204384,
-0.009479868225753307,
0.06820110976696014,
-0.11557234823703766,
0.010534917935729027,
-0.019442187622189522,
-0.0049476949498057365,
0.004596307408064604,
0.03968667611479759,
0.08170897513628006,
0.17876441776752472,
0.059728287160396576,
0.03988396003842354,
-0.013853764161467552,
0.19435107707977295,
-0.13287357985973358,
0.00680138636380434,
0.0852961614727974,
-0.013326323591172695,
0.048648588359355927,
0.1568422168493271,
0.03822560980916023,
-0.07049399614334106,
0.0020611509680747986,
0.028720727190375328,
-0.01615491323173046,
-0.2246817648410797,
-0.04491829872131348,
-0.06852685660123825,
-0.007721785921603441,
0.10495580732822418,
0.037722934037446976,
-0.013276072219014168,
-0.0005236163851805031,
-0.001745216315612197,
-0.022373542189598083,
0.04410965368151665,
0.04127389192581177,
0.10301701724529266,
0.03697822615504265,
0.10655437409877777,
-0.003939391113817692,
-0.03960232436656952,
0.026277784258127213,
-0.0045390366576612,
0.19662541151046753,
-0.01488959789276123,
0.2000606805086136,
0.02572645992040634,
0.12305459380149841,
-0.006726990453898907,
0.05027579143643379,
0.005592747591435909,
0.01062733307480812,
0.030208483338356018,
-0.05005326494574547,
-0.03709414601325989,
0.026944037526845932,
0.1082981750369072,
-0.0035124935675412416,
-0.08056455850601196,
0.04753792658448219,
0.022553205490112305,
0.32897162437438965,
0.08487212657928467,
-0.2824353873729706,
-0.07161079347133636,
-0.0024392399936914444,
-0.08162937313318253,
-0.03556676581501961,
0.0349515862762928,
0.10596327483654022,
-0.07536333799362183,
0.07444294542074203,
-0.04301317408680916,
0.08438228815793991,
-0.09814323484897614,
0.006891326978802681,
0.07052696496248245,
0.10217129439115524,
0.007531171664595604,
0.046439748257398605,
-0.22490403056144714,
0.2513498067855835,
-0.011891119182109833,
0.07131042331457138,
-0.035804685205221176,
0.042652059346437454,
0.0150595149025321,
-0.05373215675354004,
0.08762086182832718,
-0.0018115362618118525,
-0.08577518165111542,
-0.15669824182987213,
-0.11574895679950714,
0.015366784296929836,
0.10984740406274796,
-0.027745507657527924,
0.11388306319713593,
-0.02754388563334942,
-0.058638304471969604,
0.04319670423865318,
-0.12607407569885254,
-0.07207363098859787,
-0.09598393738269806,
0.043206650763750076,
0.02048778347671032,
0.05480581149458885,
-0.05907663702964783,
-0.07855144888162613,
-0.06336042284965515,
0.10420796275138855,
-0.1209893673658371,
-0.0397697314620018,
-0.12114136666059494,
0.02980031818151474,
0.18630142509937286,
-0.07084107398986816,
0.045293938368558884,
0.02689702808856964,
0.10936667770147324,
0.03471679985523224,
-0.014164258725941181,
0.10107813775539398,
-0.07699641585350037,
-0.207847461104393,
-0.042129334062337875,
0.19484135508537292,
0.04879932105541229,
0.07016920298337936,
-0.032786138355731964,
0.03536992520093918,
-0.019915759563446045,
-0.07857819646596909,
0.0825423002243042,
0.032619722187519073,
-0.023670589551329613,
0.029780780896544456,
-0.01667718030512333,
0.012649190612137318,
-0.07289654016494751,
-0.030390169471502304,
0.11287658661603928,
0.2642216086387634,
-0.08181691914796829,
0.07900678366422653,
0.04498066380620003,
-0.057264409959316254,
-0.13945740461349487,
-0.03966584801673889,
0.12887674570083618,
0.039314284920692444,
-0.015508700162172318,
-0.2067086547613144,
0.04602184146642685,
0.061420850455760956,
-0.018744591623544693,
0.09639628976583481,
-0.3313200771808624,
-0.13824740052223206,
0.13063597679138184,
0.023831134662032127,
-0.0450003519654274,
-0.13836140930652618,
-0.05408240482211113,
-0.016944140195846558,
-0.09997472912073135,
0.02130017802119255,
0.013955451548099518,
0.1154133528470993,
-0.011422308161854744,
0.0744607076048851,
0.027155866846442223,
-0.04023375362157822,
0.11965696513652802,
0.03833978995680809,
0.029349518939852715,
-0.00719652883708477,
0.01767921820282936,
-0.04819408059120178,
-0.06904251873493195,
0.0408162958920002,
-0.0816793441772461,
0.023692339658737183,
-0.13551753759384155,
-0.024993252009153366,
-0.08552119880914688,
-0.00048161024460569024,
-0.032666150480508804,
-0.016264237463474274,
-0.004826987162232399,
0.022012533619999886,
0.10472704470157623,
0.011021163314580917,
0.1013132780790329,
-0.05946670472621918,
0.08271948993206024,
0.1208212822675705,
0.10530652105808258,
-0.0055052246898412704,
-0.13036739826202393,
0.00017005469999276102,
0.013819818384945393,
0.02295154705643654,
-0.11029694974422455,
0.06202833727002144,
0.1489730030298233,
0.05083537846803665,
0.1487673670053482,
0.04679868742823601,
-0.0788647010922432,
-0.014935863204300404,
0.04862810671329498,
-0.082651786506176,
-0.15663067996501923,
0.0007719698478467762,
-0.017695719376206398,
-0.12473397701978683,
-0.010765881277620792,
0.10374081879854202,
-0.04291224852204323,
-0.004376685246825218,
0.015342933125793934,
0.06763976812362671,
-0.03228878602385521,
0.2428293526172638,
0.04291878268122673,
0.08913362771272659,
-0.10711639374494553,
0.07937008142471313,
0.04955080524086952,
-0.09195173531770706,
0.05424528941512108,
0.1064838096499443,
-0.049600061029195786,
-0.011952334083616734,
0.029364148154854774,
0.08468018472194672,
0.03964369744062424,
-0.04409104213118553,
-0.11820706725120544,
-0.13964590430259705,
0.09636790305376053,
0.06638038903474808,
0.01770363189280033,
0.025195837020874023,
-0.037458449602127075,
0.03378501161932945,
-0.09662995487451553,
0.11760478466749191,
0.10691842436790466,
0.054025039076805115,
-0.11259648203849792,
0.11165086179971695,
-0.006228629499673843,
-0.007988004013895988,
0.0005280586774460971,
-0.008199797943234444,
-0.10137109458446503,
0.026521362364292145,
-0.10170421749353409,
-0.0030616160947829485,
-0.056940313428640366,
0.014426477253437042,
0.0014868280850350857,
-0.042846329510211945,
-0.03745429962873459,
0.030123606324195862,
-0.1126706451177597,
-0.04704224318265915,
-0.037231385707855225,
0.05780581384897232,
-0.11255726218223572,
-0.013409893028438091,
0.022543517872691154,
-0.11968480795621872,
0.09395075589418411,
0.04277920350432396,
-0.008472541347146034,
0.020226282998919487,
-0.07622752338647842,
-0.013484712690114975,
0.04523633047938347,
0.014621392823755741,
0.05018416792154312,
-0.17629319429397583,
-0.020878400653600693,
-0.027177385985851288,
0.003000356489792466,
-0.018883658573031425,
0.02489287219941616,
-0.1256120353937149,
0.008625350892543793,
-0.05419960990548134,
-0.03822416439652443,
-0.056661467999219894,
0.06758695840835571,
0.05951680615544319,
0.04050847515463829,
0.15672746300697327,
-0.07632048428058624,
0.06578655540943146,
-0.21582156419754028,
-0.006608310621231794,
-0.002939368598163128,
-0.06057744473218918,
-0.0413358211517334,
-0.005348568316549063,
0.11226701736450195,
-0.0651288703083992,
0.09154453128576279,
-0.015899643301963806,
0.04022439569234848,
0.019714079797267914,
-0.08891560137271881,
0.0000936743090278469,
0.04448696970939636,
0.130374476313591,
0.03983684629201889,
-0.03309521824121475,
0.07613953202962875,
-0.014507027342915535,
0.059223275631666183,
0.13234181702136993,
0.1411234587430954,
0.11810514330863953,
0.06013641133904457,
0.06902766972780228,
0.08733630180358887,
-0.1515394151210785,
-0.13923828303813934,
0.14378583431243896,
-0.07544227689504623,
0.1539146900177002,
-0.033233799040317535,
0.21913832426071167,
0.0682264119386673,
-0.18335694074630737,
0.08388067036867142,
-0.03855350986123085,
-0.10331230610609055,
-0.11576275527477264,
-0.0684526264667511,
-0.0763988271355629,
-0.16685925424098969,
0.015987373888492584,
-0.10438675433397293,
0.07095172256231308,
0.048431605100631714,
0.044427551329135895,
0.037539221346378326,
0.09841457009315491,
0.04293854534626007,
0.003048664191737771,
0.09837113320827484,
0.022808194160461426,
-0.012587003409862518,
-0.04169962927699089,
-0.06447871029376984,
0.05097978562116623,
-0.015949267894029617,
0.06832572817802429,
-0.04413348436355591,
-0.10746757686138153,
0.06043872982263565,
0.006755453534424305,
-0.09351763874292374,
0.025201182812452316,
-0.013809151947498322,
0.04684533178806305,
0.09158074855804443,
0.05440346151590347,
-0.012735530734062195,
-0.016177862882614136,
0.20033805072307587,
-0.09674285352230072,
-0.07018793374300003,
-0.12091311067342758,
0.19597625732421875,
0.016252847388386726,
-0.006638700608164072,
0.023068295791745186,
-0.07282281666994095,
-0.023217160254716873,
0.14061754941940308,
0.16022226214408875,
-0.030105015262961388,
-0.021215783432126045,
0.0021211940329521894,
0.00013774476246908307,
-0.017279967665672302,
0.06111270189285278,
0.1298036277294159,
0.09355724602937698,
-0.029305750504136086,
-0.014851661399006844,
-0.02871731109917164,
-0.0767974779009819,
-0.017429057508707047,
0.07425791025161743,
0.022001508623361588,
-0.00201138062402606,
-0.02444099448621273,
0.0963345617055893,
-0.09302913397550583,
-0.17152811586856842,
-0.015139324590563774,
-0.17475317418575287,
-0.18257491290569305,
-0.03966446965932846,
0.046403899788856506,
0.05673670396208763,
0.046791329979896545,
0.008361268788576126,
-0.040986090898513794,
0.1334073394536972,
0.00598478689789772,
-0.054140765219926834,
-0.08562561869621277,
0.06400953978300095,
-0.12335307151079178,
0.1657114028930664,
-0.024615898728370667,
0.042696110904216766,
0.097438283264637,
0.06535553187131882,
-0.0808069109916687,
0.039931509643793106,
0.08577553182840347,
-0.12682747840881348,
0.05045568570494652,
0.18596698343753815,
-0.04092039540410042,
0.11985474079847336,
0.035082653164863586,
-0.07659333944320679,
0.0031910669058561325,
-0.06830518692731857,
-0.05824708938598633,
-0.05366811901330948,
-0.019655829295516014,
-0.024917496368288994,
0.1339040994644165,
0.21567122638225555,
-0.07109907269477844,
-0.010771474801003933,
-0.052735585719347,
0.001879132236354053,
0.006624891422688961,
0.11224211752414703,
-0.05272047221660614,
-0.23887987434864044,
0.027526166290044785,
-0.009195062331855297,
0.021526413038372993,
-0.1680796593427658,
-0.07622415572404861,
0.02147371508181095,
-0.05513066053390503,
-0.055601704865694046,
0.1256210207939148,
0.04264139384031296,
0.05479511618614197,
-0.05543183162808418,
-0.034501053392887115,
-0.019135795533657074,
0.17471422255039215,
-0.18273033201694489,
-0.058089662343263626
] |
null | null | transformers |
## Evaluation on Common Voice Frisian Test
```python
import torchaudio
from datasets import load_dataset, load_metric
from transformers import (
    Wav2Vec2ForCTC,
    Wav2Vec2Processor,
)
import torch
import re

model_name = "RuudVelo/wav2vec2-large-xlsr-53-frisian"
device = "cuda"
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\‘\'\”\�]'

model = Wav2Vec2ForCTC.from_pretrained(model_name).to(device)
processor = Wav2Vec2Processor.from_pretrained(model_name)

ds = load_dataset("common_voice", "fy-NL", split="test", data_dir="./cv-corpus-6.1-2020-12-11")

# Common Voice audio is 48 kHz; the model expects 16 kHz input
resampler = torchaudio.transforms.Resample(orig_freq=48_000, new_freq=16_000)

def map_to_array(batch):
    speech, _ = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech.squeeze(0)).numpy()
    batch["sampling_rate"] = resampler.new_freq
    # Strip punctuation and lowercase the reference transcription
    batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower() + " "
    return batch

ds = ds.map(map_to_array)

def map_to_pred(batch):
    features = processor(batch["speech"], sampling_rate=batch["sampling_rate"][0], padding=True, return_tensors="pt")
    input_values = features.input_values.to(device)
    attention_mask = features.attention_mask.to(device)
    with torch.no_grad():
        logits = model(input_values, attention_mask=attention_mask).logits
    # Greedy CTC decoding over the batch
    pred_ids = torch.argmax(logits, dim=-1)
    batch["predicted"] = processor.batch_decode(pred_ids)
    batch["target"] = batch["sentence"]
    return batch

result = ds.map(map_to_pred, batched=True, batch_size=16, remove_columns=list(ds.features.keys()))

wer = load_metric("wer")
print(wer.compute(predictions=result["predicted"], references=result["target"]))
```
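Before the aggregate score below, a quick single-utterance sanity check can reuse the `model`, `processor`, and `device` objects from the script above (the clip path is hypothetical; a mono recording is assumed):
```python
# Hypothetical spot-check on one local file, continuing from the script above
speech, rate = torchaudio.load("sample_fy.wav")  # assumed local clip
speech = torchaudio.transforms.Resample(orig_freq=rate, new_freq=16_000)(speech).squeeze(0)

inputs = processor(speech.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values.to(device)).logits
print(processor.batch_decode(torch.argmax(logits, dim=-1))[0])
```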
**Result**: 18.73 % | {"language": "fy-NL", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "model-index": [{"name": "wav2vec2-large-xlsr-53-frisian by RuudVelo", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice fy-NL", "type": "common_voice", "args": "fy-NL"}, "metrics": [{"type": "wer", "value": 18.73, "name": "Test WER"}]}]}]} | automatic-speech-recognition | RuudVelo/wav2vec2-large-xlsr-53-frisian | [
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [
"fy-NL"
] | TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #license-apache-2.0 #model-index #endpoints_compatible #region-us
|
## Evaluation on Common Voice Frisian Test
Result: 18.73 % | [
"## Evaluation on Common Voice Frisian Test\n\n\n\nResult: 18.73 %"
] | [
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #license-apache-2.0 #model-index #endpoints_compatible #region-us \n",
"## Evaluation on Common Voice Frisian Test\n\n\n\nResult: 18.73 %"
] | [
69,
14
] | [
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #license-apache-2.0 #model-index #endpoints_compatible #region-us \n## Evaluation on Common Voice Frisian Test\n\n\n\nResult: 18.73 %"
] | [
-0.20635606348514557,
0.05325823649764061,
-0.005255421157926321,
-0.030978383496403694,
-0.0012512679677456617,
-0.07202351093292236,
0.052014268934726715,
0.11539166420698166,
0.03170552849769592,
0.04078655317425728,
0.017787577584385872,
0.12564072012901306,
0.031133459880948067,
-0.06423401832580566,
-0.13749505579471588,
-0.09464409947395325,
0.042347028851509094,
0.0710139349102974,
0.10177798569202423,
0.0933329239487648,
0.1062919870018959,
-0.010273924097418785,
0.00033246955717913806,
0.1212373822927475,
-0.08852478116750717,
0.007171424571424723,
0.10454606264829636,
-0.135179340839386,
0.12672913074493408,
0.07682005316019058,
-0.03263338282704353,
0.06315244734287262,
0.002081229817122221,
-0.18038906157016754,
0.017036793753504753,
-0.022916128858923912,
0.055464114993810654,
0.01675720140337944,
0.07465420663356781,
0.005586589686572552,
0.058340221643447876,
0.20401327311992645,
-0.05116325616836548,
0.09097025543451309,
0.002541886642575264,
-0.2220952957868576,
-0.07263936847448349,
0.008088369853794575,
0.06557939201593399,
0.10175169259309769,
-0.048477161675691605,
0.12719637155532837,
-0.13038696348667145,
0.11547288298606873,
0.12320650368928909,
-0.22750891745090485,
0.001490598893724382,
-0.011749456636607647,
0.0417293906211853,
-0.08962482213973999,
-0.05633885785937309,
0.02172299101948738,
0.0951119214296341,
0.032375507056713104,
-0.14334727823734283,
-0.06360228359699249,
-0.16525951027870178,
-0.024675723165273666,
-0.09149660170078278,
-0.027284422889351845,
0.2067531943321228,
0.06594572961330414,
-0.09455148875713348,
-0.06994316726922989,
0.016524240374565125,
-0.0006651892326772213,
-0.0307286586612463,
-0.07202021032571793,
0.0051923636347055435,
0.025176310911774635,
0.029714154079556465,
0.027727441862225533,
-0.09895289689302444,
-0.12040230631828308,
-0.1102745309472084,
0.12445235252380371,
0.039784811437129974,
0.010520684532821178,
-0.02990550547838211,
-0.0036243186332285404,
-0.13915391266345978,
-0.06103762239217758,
-0.07040107995271683,
0.05958666279911995,
-0.0016926629468798637,
-0.012484660372138023,
-0.048446107655763626,
-0.08143600076436996,
0.16268977522850037,
0.07544634491205215,
-0.028752420097589493,
0.0018682675436139107,
-0.09193491190671921,
0.030713317915797234,
-0.00967585388571024,
0.10578525811433792,
0.015898345038294792,
0.021182939410209656,
0.046592045575380325,
0.017260320484638214,
0.0022146757692098618,
0.02769557572901249,
-0.002985833678394556,
-0.04845023527741432,
0.07533017545938492,
0.054830875247716904,
-0.03599701076745987,
-0.08837220817804337,
-0.032313454896211624,
0.025805974379181862,
0.04051332175731659,
-0.11388765275478363,
-0.011076917871832848,
0.06932875514030457,
0.04884899780154228,
0.12128036469221115,
-0.0035202112048864365,
0.05112980306148529,
-0.0684504583477974,
0.01939711533486843,
0.06211407855153084,
0.021463099867105484,
0.08596248924732208,
-0.018071945756673813,
0.05597814917564392,
-0.06101687252521515,
0.03719376027584076,
-0.05106515437364578,
-0.025910476222634315,
-0.08473610877990723,
-0.032960448414087296,
-0.01647563837468624,
-0.16476160287857056,
-0.0313020758330822,
-0.07103879749774933,
0.00972471758723259,
-0.07775598764419556,
0.03157590329647064,
-0.07037875056266785,
0.09755842387676239,
0.12426401674747467,
-0.03599133715033531,
-0.030423466116189957,
0.07359514385461807,
-0.04585782811045647,
-0.03353375196456909,
0.06880113482475281,
0.07465123385190964,
-0.08934517949819565,
-0.04446861520409584,
-0.07087197154760361,
-0.08437662571668625,
-0.09731939435005188,
0.05608992278575897,
-0.006469521671533585,
0.07346448302268982,
-0.13542218506336212,
-0.11853289604187012,
0.09942906349897385,
-0.10046771168708801,
-0.0750749409198761,
0.22104662656784058,
0.05481981858611107,
-0.03662021458148956,
0.12369269132614136,
0.2252441644668579,
0.10240296274423599,
-0.20180930197238922,
0.020942311733961105,
0.10015304386615753,
0.019737452268600464,
-0.06858228147029877,
0.07630376517772675,
-0.11351848393678665,
-0.08360729366540909,
0.033482957631349564,
-0.013874949887394905,
0.050148338079452515,
-0.04598373547196388,
-0.03542475029826164,
-0.03970634192228317,
-0.07417532801628113,
-0.0718747079372406,
0.04793313518166542,
0.024554630741477013,
-0.1172277182340622,
-0.022736916318535805,
-0.0439266711473465,
0.10168705135583878,
-0.05524701252579689,
0.03456553816795349,
-0.11577264964580536,
0.17422610521316528,
-0.11751712113618851,
-0.05780049040913582,
-0.13770395517349243,
0.2906495928764343,
-0.00900034885853529,
0.03922948241233826,
0.0940585657954216,
0.09610462933778763,
0.0163105558604002,
-0.08307065069675446,
-0.004806399345397949,
-0.009868397377431393,
0.11974704265594482,
-0.011741134338080883,
-0.03258654847741127,
-0.15807166695594788,
0.08345984667539597,
-0.04171864315867424,
0.020620904862880707,
-0.09413183480501175,
-0.05623791739344597,
0.023776954039931297,
0.03444157913327217,
-0.03303127363324165,
0.026737339794635773,
0.031727612018585205,
0.028903095051646233,
0.00017740089970175177,
0.06519568711519241,
0.020444540306925774,
0.0275319442152977,
-0.009119559079408646,
0.2061825841665268,
-0.03682461380958557,
0.18387338519096375,
0.12205197662115097,
-0.07926148176193237,
0.01482164952903986,
0.12721794843673706,
-0.03364994004368782,
-0.007661544252187014,
-0.07698402553796768,
-0.01781182922422886,
0.21339301764965057,
-0.046214837580919266,
0.12555836141109467,
-0.11234015971422195,
-0.024154556915163994,
0.05101602524518967,
-0.06790889799594879,
0.004447164013981819,
0.11818776279687881,
-0.00021975752315483987,
-0.043391164392232895,
0.03950028494000435,
0.05423871427774429,
-0.1408272385597229,
0.11130025237798691,
-0.12023911625146866,
-0.10150978714227676,
0.05370813608169556,
0.0348467156291008,
-0.03500727191567421,
0.06441625207662582,
-0.1872153878211975,
-0.005013986956328154,
0.062159109860658646,
0.03626054897904396,
0.06959463655948639,
-0.1808214783668518,
0.04412581026554108,
0.021521732211112976,
-0.11704359203577042,
-0.11377393454313278,
0.11858309805393219,
-0.02319793589413166,
0.04956933856010437,
-0.15294399857521057,
-0.2368048131465912,
0.026253754273056984,
-0.036963075399398804,
-0.15645579993724823,
0.04849744215607643,
-0.012631640769541264,
-0.14424744248390198,
-0.10901928693056107,
-0.021864289417862892,
-0.032363735139369965,
-0.017298603430390358,
0.13631510734558105,
-0.0804361030459404,
-0.01892259158194065,
-0.05066360905766487,
-0.010415557771921158,
-0.046955641359090805,
0.049305111169815063,
-0.049851931631565094,
0.006936587858945131,
0.09490417689085007,
-0.12874193489551544,
-0.020514875650405884,
-0.058874569833278656,
0.015905002132058144,
-0.005628455430269241,
-0.02584565430879593,
0.005880026146769524,
0.17186491191387177,
0.07865185290575027,
0.0007972922758199275,
-0.011409263126552105,
0.20545917749404907,
-0.09227089583873749,
-0.09538832306861877,
0.19554439187049866,
-0.018770532682538033,
-0.0581730380654335,
0.1260923445224762,
0.010189010761678219,
-0.06509031355381012,
-0.057620301842689514,
0.022692682221531868,
-0.02024765871465206,
-0.2988431453704834,
-0.05440624803304672,
-0.09736953675746918,
0.0033652090933173895,
-0.08150245994329453,
0.05682315677404404,
0.12225249409675598,
-0.04827131703495979,
-0.0012113593984395266,
-0.11609559506177902,
0.03840382397174835,
-0.06160375475883484,
0.2648748457431793,
-0.019850729033350945,
0.0967794731259346,
-0.06439945101737976,
-0.032850395888090134,
0.04790211841464043,
0.0077423653565347195,
0.05909267067909241,
0.10108549892902374,
0.06525415182113647,
0.03423817828297615,
0.14170832931995392,
0.07832134515047073,
0.02240917459130287,
-0.02752157673239708,
-0.016005119308829308,
0.024895623326301575,
-0.022636983543634415,
0.005090703256428242,
0.014102877117693424,
0.2225910723209381,
0.011526587419211864,
-0.06545327603816986,
-0.10070563852787018,
0.004256827291101217,
0.2097887098789215,
0.14646555483341217,
-0.0931028351187706,
-0.03852942958474159,
-0.036673419177532196,
-0.16834886372089386,
-0.0067026037722826,
0.09974955767393112,
-0.019023654982447624,
-0.057340107858181,
0.07813483476638794,
0.035469383001327515,
0.10745622962713242,
0.05296981334686279,
0.017892757430672646,
-0.0911383181810379,
-0.016909949481487274,
0.03837801516056061,
0.06199394538998604,
-0.251453697681427,
0.20522071421146393,
0.02202954702079296,
0.1344255954027176,
0.018053606152534485,
0.038710981607437134,
0.047717317938804626,
0.05728330463171005,
0.1319657415151596,
-0.04804724082350731,
-0.0471460185945034,
-0.0026275033596903086,
-0.0748191624879837,
0.07859469205141068,
-0.004072331823408604,
0.1174406036734581,
-0.025147227570414543,
-0.00813575554639101,
-0.023663844913244247,
0.03792962804436684,
-0.10974837839603424,
-0.16466739773750305,
0.018618963658809662,
-0.004432507790625095,
0.17652831971645355,
-0.0024673971347510815,
-0.030390610918402672,
-0.10614246875047684,
-0.2996082007884979,
0.060071852058172226,
-0.05291273444890976,
0.02555112913250923,
-0.03271191567182541,
-0.05873013660311699,
0.14904268085956573,
-0.027551425620913506,
-0.025423910468816757,
-0.010585414245724678,
0.0010222949786111712,
0.00970476120710373,
-0.04118110239505768,
0.05593809112906456,
-0.09138541668653488,
-0.1027100458741188,
-0.01201997697353363,
0.3209523856639862,
-0.03493582084774971,
0.1155320554971695,
0.05653456598520279,
0.0055455961264669895,
0.0010847983649000525,
-0.004607556387782097,
0.0641067624092102,
-0.003285990795120597,
-0.17806124687194824,
0.011107825674116611,
0.08006129413843155,
-0.16663920879364014,
-0.12841829657554626,
-0.05837957188487053,
0.1825900375843048,
0.12399808317422867,
-0.0043745143339037895,
0.11332628130912781,
0.15761934220790863,
-0.05235243961215019,
-0.1732092946767807,
-0.07398056983947754,
0.06605562567710876,
0.09557673335075378,
-0.06756032258272171,
-0.06119774654507637,
0.16046249866485596,
0.016179176047444344,
-0.09526556730270386,
0.034277983009815216,
-0.17822661995887756,
-0.09444190561771393,
0.2665586471557617,
-0.12037156522274017,
0.16165626049041748,
-0.044377103447914124,
-0.08896023035049438,
0.026742497459053993,
-0.11105065047740936,
-0.033666353672742844,
-0.04129191115498543,
0.07246603071689606,
0.028663117438554764,
-0.02605251595377922,
0.038382675498723984,
0.008588101714849472,
0.10639151185750961,
0.06185232475399971,
-0.037466954439878464,
0.061455439776182175,
0.11158566176891327,
-0.13995997607707977,
0.08196486532688141,
0.12961198389530182,
-0.14435990154743195,
0.06285948306322098,
-0.02556760609149933,
-0.08011547476053238,
-0.070131815969944,
0.1177353486418724,
0.05349140241742134,
0.04659278318285942,
-0.037142280489206314,
-0.09867648780345917,
-0.03666682541370392,
0.006498665548861027,
0.1317836344242096,
-0.12185344845056534,
0.05453179404139519,
0.14339330792427063,
0.20159105956554413,
-0.14853008091449738,
-0.13788755238056183,
-0.025532197207212448,
-0.08876562863588333,
0.08236663043498993,
-0.06974244117736816,
0.05108579993247986,
0.09789419919252396,
0.07163294404745102,
0.11727852374315262,
0.012328983284533024,
-0.0978066474199295,
0.03636748716235161,
0.017478130757808685,
-0.006409324239939451,
-0.10907334089279175,
-0.036800410598516464,
-0.10757613927125931,
-0.024048088118433952,
0.04751642420887947,
0.11458483338356018,
-0.05585525929927826,
-0.0016759616555646062,
-0.0003433738020248711,
-0.019641801714897156,
-0.14762495458126068,
0.26642680168151855,
0.03923606500029564,
0.01381737645715475,
-0.15099510550498962,
0.03309786319732666,
-0.06931749731302261,
-0.13027819991111755,
-0.022043494507670403,
-0.0637526661157608,
-0.002728796796873212,
-0.07016436010599136,
-0.031308021396398544,
0.04588449373841286,
0.020780405029654503,
-0.1517307311296463,
-0.0912177562713623,
-0.19084897637367249,
0.08417759090662003,
0.1448773294687271,
0.10161660611629486,
0.07449552416801453,
-0.06093577668070793,
-0.07530814409255981,
-0.02077873796224594,
0.07072605192661285,
0.06356850266456604,
0.027338314801454544,
-0.14205238223075867,
0.007582326885312796,
0.02144448459148407,
0.08072693645954132,
-0.053425777703523636,
-0.04910828173160553,
0.02471816912293434,
0.08405749499797821,
-0.10936721414327621,
0.03457631915807724,
-0.04714250564575195,
0.014804134145379066,
0.09514201432466507,
-0.10083826631307602,
-0.03705626353621483,
0.07533598691225052,
-0.11585666239261627,
0.054490264505147934,
0.03221771493554115,
0.12012705206871033,
-0.056709278374910355,
0.04084324836730957,
0.05083286762237549,
-0.07087594270706177,
0.077536940574646,
0.15510669350624084,
-0.12192395329475403,
0.1288471221923828,
-0.22926867008209229,
-0.12766499817371368,
0.1415887027978897,
0.07328320294618607,
-0.05080016329884529,
-0.13330411911010742,
0.044636908918619156,
0.19659803807735443,
0.04044714570045471,
0.010168259963393211,
0.0034475738648325205,
-0.06429116427898407,
-0.13290372490882874,
-0.13982924818992615,
-0.05998824164271355,
0.00411823857575655,
-0.08578160405158997,
0.15144623816013336,
0.16029243171215057,
0.19536584615707397,
-0.06287181377410889,
0.02330023981630802,
-0.1154298484325409,
0.0547151193022728,
-0.08346932381391525,
-0.07711471617221832,
-0.2006462812423706,
0.003186148824170232,
0.05160088837146759,
-0.0095014413818717,
0.13584508001804352,
-0.03355170041322708,
0.008272700011730194,
0.01525372639298439,
-0.043611302971839905,
0.03162379562854767,
0.01576867140829563,
0.30963799357414246,
0.0767141729593277,
0.0015542958863079548,
-0.05041562020778656,
-0.08394476771354675,
0.023323051631450653,
0.0693516731262207,
-0.06448591500520706,
0.17958500981330872,
0.05645203962922096,
0.13639014959335327,
0.12553544342517853,
-0.06592261791229248,
0.056066371500492096,
0.0669911727309227,
-0.07535403221845627,
0.08304797112941742,
0.05502631887793541,
0.19803982973098755,
0.15177002549171448,
-0.013010815717279911,
0.06469441205263138,
-0.06374455243349075,
-0.04604322463274002,
-0.2083512395620346,
-0.024734381586313248,
-0.1359640657901764,
-0.13784219324588776,
0.06724865734577179,
-0.03828432783484459,
0.05790639668703079,
0.018926357850432396,
0.07539472728967667,
-0.036599088460206985,
0.0337800458073616,
-0.06329178810119629,
-0.07820602506399155,
0.14845944941043854,
-0.10140953958034515,
-0.04834717884659767,
-0.07161445915699005,
0.002265939489006996,
0.15221267938613892,
-0.009158473461866379,
0.016421250998973846,
-0.04609101265668869,
-0.06437654793262482,
0.03311318904161453,
-0.12319731712341309,
-0.06758517026901245,
0.02331715077161789,
-0.029667196795344353,
0.0875883474946022,
0.16936565935611725,
0.09859257936477661,
0.004176798276603222,
0.07590043544769287,
0.07629905641078949,
-0.06215614452958107,
-0.12688881158828735,
-0.14825043082237244,
0.12631283700466156,
0.003985580522567034,
0.04520439729094505,
0.01058286800980568,
-0.02301042713224888,
0.029529297724366188,
0.15737810730934143,
0.1807902604341507,
0.03967437893152237,
0.03796399012207985,
-0.034964315593242645,
-0.028949590399861336,
-0.039141446352005005,
-0.10305795818567276,
0.10377749055624008,
0.2101806104183197,
-0.0020476505160331726,
-0.07376806437969208,
-0.09313692897558212,
-0.024988146498799324,
0.03393952175974846,
0.07140029966831207,
-0.024429315701127052,
-0.1490793377161026,
0.04174044355750084,
0.14163102209568024,
-0.06890490651130676,
-0.08456474542617798,
-0.08795929700136185,
-0.03991256654262543,
-0.08557449281215668,
-0.0033821980468928814,
0.019242651760578156,
0.15741662681102753,
-0.07964276522397995,
-0.11572844535112381,
-0.025062978267669678,
0.08020085841417313,
-0.06899022310972214,
-0.14112746715545654,
-0.025065531954169273,
0.02853449247777462,
-0.023610109463334084,
-0.017631517723202705,
0.0742470920085907,
0.23554997146129608,
-0.009011617861688137,
0.130863219499588,
0.012951424345374107,
0.17548273503780365,
0.013786984607577324,
-0.14574496448040009,
0.10005558282136917,
0.13751529157161713,
0.0686715692281723,
0.10594238340854645,
0.06634887307882309,
-0.09791431576013565,
0.032448820769786835,
-0.16842590272426605,
-0.12703372538089752,
-0.1644483059644699,
0.05233524739742279,
-0.07475802302360535,
0.008947791531682014,
0.028660954907536507,
-0.03932686522603035,
-0.03715687617659569,
-0.016625696793198586,
0.03982244431972504,
0.05364441126585007,
-0.07422703504562378,
-0.014568564482033253,
-0.20034414529800415,
0.03906852751970291,
-0.12437605857849121,
-0.06660428643226624,
-0.10685594379901886,
-0.06013495847582817,
-0.04353094846010208,
-0.06935182213783264,
-0.0103401318192482,
0.003029456827789545,
0.11085749417543411,
0.007665317505598068,
0.022780900821089745,
0.001745578832924366,
0.057904478162527084,
0.08904106914997101,
-0.10892107337713242,
-0.06331723183393478
] |
null | null | transformers |
# Zeldabot | {"tags": ["conversational"]} | text-generation | Ryanar/DialoGPT-medium-Zelda | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Zeldabot | [
"# Zeldabot"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Zeldabot"
] | [
51,
4
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Zeldabot"
] | [
-0.0017051651375368237,
-0.06268022954463959,
-0.007324059493839741,
0.07290280610322952,
0.19243121147155762,
0.008583216927945614,
0.11676736921072006,
0.10416591912508011,
-0.038439784198999405,
0.004740663804113865,
0.12114942073822021,
0.14138354361057281,
-0.0031551062129437923,
0.06961355358362198,
-0.11342804878950119,
-0.2600174844264984,
0.08934766054153442,
0.03310830891132355,
0.026025153696537018,
0.1014825850725174,
0.06678980588912964,
-0.04399877041578293,
0.12165113538503647,
-0.02053866721689701,
-0.1447789967060089,
0.02298092469573021,
0.039817728102207184,
-0.10303694754838943,
0.12603594362735748,
0.04215885326266289,
0.08446121215820312,
-0.019773345440626144,
-0.06548302620649338,
-0.1364290565252304,
0.05012020096182823,
-0.012893473729491234,
-0.04962510988116264,
0.030830545350909233,
0.03789651021361351,
-0.0942821204662323,
0.08310679346323013,
0.06808482110500336,
0.00268100225366652,
0.11082109063863754,
-0.16043590009212494,
-0.003057247493416071,
-0.010125336237251759,
0.06995544582605362,
0.06922245770692825,
0.09849879145622253,
-0.022029731422662735,
0.08062111586332321,
-0.06152380257844925,
0.08577228337526321,
0.27474090456962585,
-0.2912736237049103,
-0.049260471016168594,
0.1314859241247177,
0.1188557893037796,
0.17875729501247406,
-0.043596185743808746,
0.07402078807353973,
0.03127213194966316,
0.025975774973630905,
-0.057421620935201645,
-0.10362558811903,
-0.055748555809259415,
0.04179973900318146,
-0.09877148270606995,
0.005037309601902962,
0.19579088687896729,
-0.07152332365512848,
0.07563605904579163,
-0.026923704892396927,
-0.12776705622673035,
-0.03911943733692169,
-0.02766381949186325,
-0.013306625187397003,
-0.07148028910160065,
0.047485318034887314,
0.06172214075922966,
-0.09945490211248398,
-0.11635376513004303,
-0.04389749467372894,
-0.13172203302383423,
0.15567363798618317,
0.050192613154649734,
0.04091697186231613,
-0.208138108253479,
0.07871691137552261,
0.05708429217338562,
-0.10155265778303146,
-0.026888463646173477,
-0.09470459818840027,
-0.0072769648395478725,
-0.01442671474069357,
-0.02648177370429039,
-0.07759308069944382,
0.10901110619306564,
0.16938385367393494,
-0.007934062741696835,
0.045706167817115784,
-0.011425711214542389,
0.08230133354663849,
0.041677992790937424,
0.04126033931970596,
0.0032639114651829004,
-0.10259263962507248,
0.03434813767671585,
-0.11225531995296478,
0.010284271091222763,
-0.08940748125314713,
-0.19789505004882812,
-0.05766656994819641,
0.007609294727444649,
0.08575314283370972,
0.012716750614345074,
0.10140285640954971,
-0.0348847433924675,
-0.026953326538205147,
-0.09231393784284592,
-0.04654354974627495,
-0.001640490023419261,
0.028202345594763756,
-0.0003285572165623307,
0.0863661915063858,
0.005209718365222216,
0.008669865317642689,
-0.12952083349227905,
0.05735902488231659,
-0.08826841413974762,
0.010910888202488422,
-0.03858756273984909,
-0.04673926532268524,
0.01302362885326147,
-0.025639349594712257,
-0.004876141436398029,
-0.16665959358215332,
-0.139181450009346,
-0.004208375234156847,
0.010010255500674248,
-0.046245623379945755,
-0.05505138263106346,
-0.0650443509221077,
-0.05304376408457756,
0.03895469009876251,
-0.04707232862710953,
-0.031437359750270844,
-0.047315895557403564,
0.10398320853710175,
-0.00875707808881998,
0.08787647634744644,
-0.13129253685474396,
0.05402500182390213,
-0.09798485040664673,
-0.023857183754444122,
-0.04632952809333801,
0.07389247417449951,
-0.02134268917143345,
0.10310108959674835,
0.0027665405068546534,
-0.017828917130827904,
0.008147467859089375,
0.06663615256547928,
-0.030010733753442764,
0.1887884885072708,
-0.10000257194042206,
-0.10050265491008759,
0.25447162985801697,
-0.09318453073501587,
-0.19385986030101776,
0.09333155304193497,
-0.005511455237865448,
0.06688757240772247,
0.10138372331857681,
0.1743270456790924,
-0.0022334905806928873,
-0.0745265930891037,
0.08770805597305298,
0.007100200280547142,
-0.13199448585510254,
-0.012151381000876427,
-0.0009757765219546854,
0.023934053257107735,
-0.03777909651398659,
0.04482566937804222,
-0.003319881623610854,
0.0765201598405838,
-0.05111812427639961,
-0.027166901156306267,
-0.02587544173002243,
-0.01771792769432068,
0.031729940325021744,
-0.023389654234051704,
0.10680700093507767,
-0.04747973755002022,
0.016675254330039024,
0.049216803163290024,
-0.007629770319908857,
-0.02027120627462864,
0.058126300573349,
-0.031590238213539124,
0.14321951568126678,
0.0308407973498106,
0.08305464684963226,
-0.14954873919487,
-0.08903329074382782,
-0.04476648569107056,
0.09862066805362701,
0.09210063517093658,
0.10265867412090302,
0.0439603365957737,
0.003423902438953519,
-0.022080853581428528,
-0.003911963198333979,
0.09604274481534958,
-0.01228643674403429,
-0.04777226969599724,
-0.10380526632070541,
0.06158216670155525,
-0.024532387033104897,
-0.05338183790445328,
-0.04023665189743042,
0.04346311464905739,
0.07561241090297699,
0.0918448194861412,
-0.01217092014849186,
0.03028387576341629,
-0.046365391463041306,
0.0021799032110720873,
-0.1075977236032486,
-0.01625680737197399,
0.08452128618955612,
-0.001566230203025043,
-0.03717208281159401,
0.19045226275920868,
-0.1841536909341812,
0.2145397663116455,
0.18690495193004608,
-0.2319406419992447,
-0.012327823787927628,
-0.05356256291270256,
-0.022106947377324104,
0.00036501570139080286,
0.02225271612405777,
-0.04506814852356911,
0.17289158701896667,
0.0098034692928195,
0.20122604072093964,
-0.05569254606962204,
0.004528794903308153,
-0.007142877671867609,
-0.08513396978378296,
0.020690029487013817,
0.07723676413297653,
0.1619988977909088,
-0.13858287036418915,
0.206996887922287,
0.07715118676424026,
0.012313101440668106,
0.24425311386585236,
0.052898962050676346,
0.022208288311958313,
0.059474535286426544,
-0.005983644165098667,
-0.01976616680622101,
-0.013978102244436741,
-0.15454596281051636,
-0.04771058261394501,
0.06076609715819359,
0.016215192154049873,
0.08187047392129898,
-0.061185065656900406,
-0.02967260405421257,
-0.02139841765165329,
-0.011071685701608658,
0.10291912406682968,
0.10400982946157455,
0.04730319231748581,
0.14845074713230133,
-0.012691440060734749,
-0.04165727645158768,
0.07125267386436462,
0.01798347942531109,
-0.06705334782600403,
0.14624100923538208,
-0.13929495215415955,
-0.32701271772384644,
-0.15751756727695465,
-0.19154267013072968,
-0.11767993867397308,
0.05340202897787094,
0.1180320531129837,
-0.11489088088274002,
-0.020275238901376724,
0.039179425686597824,
0.18704016506671906,
-0.06324146687984467,
-0.015147033147513866,
-0.06851719319820404,
0.037599533796310425,
-0.12561041116714478,
-0.09639108926057816,
-0.04473472386598587,
-0.03416420519351959,
-0.0769163966178894,
0.13375435769557953,
-0.09198442846536636,
0.09408693760633469,
0.17975613474845886,
0.03129581734538078,
0.08597432076931,
-0.03451206907629967,
0.16981106996536255,
-0.09349498152732849,
0.026683639734983444,
0.17768386006355286,
-0.03372891992330551,
0.08020620048046112,
0.1387018859386444,
-0.0003942247712984681,
-0.10840410739183426,
0.041157662868499756,
-0.022977227345108986,
-0.09001630544662476,
-0.1888449788093567,
-0.1284860372543335,
-0.14433899521827698,
0.14187447726726532,
0.04158470034599304,
0.06443003565073013,
0.19419516623020172,
0.0539962463080883,
-0.034008681774139404,
-0.01876819133758545,
0.062010206282138824,
0.12319599837064743,
0.22710832953453064,
-0.0658639594912529,
0.10828177630901337,
-0.0035007772967219353,
-0.1169784739613533,
0.09919410198926926,
0.06072746589779854,
0.10539441555738449,
0.0724242776632309,
0.12259086966514587,
0.013301911763846874,
0.04730686917901039,
0.13821080327033997,
0.0581565722823143,
0.06722734123468399,
-0.003392147831618786,
-0.009466241113841534,
-0.021491291001439095,
-0.03035910241305828,
0.06872929632663727,
0.020698122680187225,
-0.09542036801576614,
-0.0471823588013649,
-0.028188036754727364,
0.09674618393182755,
0.11320330947637558,
-0.012781163677573204,
-0.16398921608924866,
-0.0037291550543159246,
0.0717005580663681,
-0.05010528489947319,
-0.1566770076751709,
0.07900167256593704,
0.05308191105723381,
-0.15570852160453796,
0.03857317939400673,
-0.04502548649907112,
0.12727980315685272,
-0.016982518136501312,
0.09312227368354797,
-0.06368380784988403,
-0.053449228405952454,
0.008543763309717178,
0.09305866807699203,
-0.33749663829803467,
0.11502546072006226,
-0.02511865459382534,
-0.04640395939350128,
-0.07176869362592697,
-0.028650211170315742,
0.015862204134464264,
0.08107000589370728,
0.07815462350845337,
0.005827758461236954,
0.08834405988454819,
-0.06203398108482361,
-0.0024057545233517885,
0.006557386368513107,
0.11391868442296982,
-0.08283154666423798,
-0.014022232964634895,
-0.06920202821493149,
0.003431265940889716,
-0.059179969131946564,
-0.0234373789280653,
0.04705573618412018,
-0.16920827329158783,
0.08185186982154846,
0.06875881552696228,
0.12012960016727448,
0.018938452005386353,
-0.02783360332250595,
-0.03499097749590874,
0.25525081157684326,
-0.10444533824920654,
-0.10367722064256668,
-0.13335688412189484,
-0.039506591856479645,
0.021839557215571404,
-0.0694078579545021,
0.05171304941177368,
-0.07514785975217819,
0.0781787559390068,
-0.14340774714946747,
-0.1699434071779251,
0.08931532502174377,
-0.0716693177819252,
-0.0835583359003067,
-0.04283440485596657,
0.18128520250320435,
-0.009653905406594276,
-0.007005297113209963,
0.008800754323601723,
0.0156352948397398,
-0.10690253973007202,
-0.10670652985572815,
-0.045135773718357086,
-0.011542897671461105,
0.06491810828447342,
-0.0025215265341103077,
-0.027671387419104576,
-0.03698981553316116,
-0.06446067988872528,
-0.019802864640951157,
0.2879193425178528,
0.12900395691394806,
-0.02666035108268261,
0.16721530258655548,
0.05672068148851395,
-0.05300500988960266,
-0.30968359112739563,
-0.11706333607435226,
-0.06487459689378738,
-0.03700779750943184,
-0.06064033508300781,
-0.1526128649711609,
0.05353442206978798,
-0.027412189170718193,
-0.0025466671213507652,
0.1363670825958252,
-0.23401391506195068,
-0.11261056363582611,
0.13841897249221802,
0.03553514927625656,
0.37556010484695435,
-0.1502075493335724,
-0.07122542709112167,
-0.030651314184069633,
-0.11090046912431717,
0.17366673052310944,
0.005266227759420872,
0.12618671357631683,
-0.045526184141635895,
0.15650993585586548,
0.04791312664747238,
-0.028345005586743355,
0.0867660790681839,
0.033850330859422684,
-0.0219844039529562,
-0.1446637511253357,
-0.09892035275697708,
0.016168082132935524,
-0.0191227737814188,
0.02288874238729477,
-0.08183685690164566,
0.030121229588985443,
-0.04078969731926918,
-0.021168150007724762,
-0.08761177957057953,
0.07691866904497147,
0.029482854530215263,
-0.07080383598804474,
-0.05194171518087387,
-0.0434487983584404,
-0.03639364615082741,
0.04009947180747986,
0.2506447732448578,
-0.07450580596923828,
0.08692161738872528,
0.05620824918150902,
0.06508844345808029,
-0.17781895399093628,
-0.043545667082071304,
-0.07061757892370224,
-0.057061683386564255,
0.10525916516780853,
-0.03783643618226051,
0.04845280200242996,
0.13771645724773407,
-0.03810218349099159,
0.08801805227994919,
0.10756975412368774,
-0.005370950326323509,
0.007773626130074263,
0.10895675420761108,
-0.2795773446559906,
-0.01883380487561226,
-0.07968036085367203,
-0.03281605616211891,
0.112424336373806,
0.05884084850549698,
0.20618100464344025,
-0.003642492461949587,
-0.04814677685499191,
0.020639291033148766,
0.030321279540657997,
-0.03717934712767601,
0.04615377262234688,
-0.0033545847982168198,
0.03161124140024185,
-0.1605571210384369,
0.03365959972143173,
0.01849179156124592,
-0.08036527037620544,
0.04181210324168205,
0.226236954331398,
-0.11829724907875061,
-0.1046188622713089,
0.005092978943139315,
0.14544199407100677,
-0.17527949810028076,
-0.012465359643101692,
-0.0646238923072815,
-0.16377784311771393,
0.05548519268631935,
0.24249212443828583,
0.07018932700157166,
0.07749347388744354,
-0.05643097683787346,
-0.011739443987607956,
-0.025463437661528587,
0.0033376773353666067,
-0.001272064633667469,
0.022488275542855263,
-0.10783228278160095,
0.04786612093448639,
0.0015523580368608236,
0.11864382028579712,
-0.0852082371711731,
-0.08551342785358429,
-0.16720400750637054,
0.031516801565885544,
-0.18925949931144714,
-0.06885538250207901,
-0.04640635848045349,
-0.05457593500614166,
-0.01603829674422741,
-0.0530882328748703,
-0.03561015427112579,
-0.022119048982858658,
-0.08951111882925034,
0.007890276610851288,
-0.048285070806741714,
0.022065773606300354,
-0.10043066740036011,
0.02728138491511345,
0.09336844831705093,
-0.049506254494190216,
0.12052498757839203,
0.14021535217761993,
-0.12378627061843872,
0.08625590801239014,
-0.13100874423980713,
-0.07742750644683838,
0.13331536948680878,
0.021624060347676277,
0.07608935981988907,
0.09650453925132751,
-0.0037480208557099104,
0.05049800127744675,
0.06691605597734451,
0.04612582176923752,
0.038708023726940155,
-0.10361870378255844,
0.08024623245000839,
-0.046493932604789734,
-0.13612428307533264,
-0.01981194131076336,
-0.06679826229810715,
0.060045275837183,
0.050631843507289886,
0.09144455194473267,
-0.046729087829589844,
0.1264009177684784,
-0.04768374189734459,
0.030186759307980537,
0.04049278050661087,
-0.16168199479579926,
0.008559636771678925,
-0.09749773144721985,
0.006421999074518681,
-0.008571269921958447,
0.1605437844991684,
0.019945142790675163,
-0.005502407904714346,
0.037986818701028824,
0.1284772753715515,
-0.014967918395996094,
0.04091835767030716,
0.10753513872623444,
0.06485172361135483,
-0.048118580132722855,
-0.07846056669950485,
0.0844000056385994,
0.0784500390291214,
-0.037852466106414795,
0.16336490213871002,
-0.03663382679224014,
-0.05287247151136398,
0.05498308315873146,
0.010535781271755695,
0.036512721329927444,
-0.07103272527456284,
-0.16385039687156677,
-0.0465230792760849,
0.009223327971994877,
-0.02984350360929966,
0.12758339941501617,
0.17847275733947754,
-0.03506609424948692,
0.009129882790148258,
-0.015857473015785217,
-0.05784257873892784,
-0.13019409775733948,
-0.09259790927171707,
-0.0723065510392189,
-0.13998006284236908,
0.007322024554014206,
-0.1132122203707695,
0.03222993388772011,
-0.009538250043988228,
0.06411191076040268,
-0.0554068461060524,
0.13673792779445648,
0.05730956792831421,
-0.10991118103265762,
0.02363266795873642,
-0.02929699793457985,
0.0403413325548172,
-0.017949815839529037,
-0.027000948786735535,
-0.09755231440067291,
0.00906107947230339,
-0.005149655509740114,
0.09153874218463898,
-0.05382930487394333,
0.013218986801803112,
-0.16661518812179565,
-0.0795660987496376,
-0.03150469437241554,
0.05344747379422188,
-0.06285642832517624,
0.14554856717586517,
0.010530511848628521,
-0.04154568538069725,
0.03531545400619507,
0.21686577796936035,
-0.03558675944805145,
-0.11842712759971619,
-0.041508760303258896,
0.14503620564937592,
0.054092828184366226,
0.08660484105348587,
-0.04857993125915527,
0.012027719058096409,
-0.12234146147966385,
0.3628547191619873,
0.23758403956890106,
-0.113970547914505,
-0.0010316227562725544,
-0.017892986536026,
0.04734296351671219,
0.1207411140203476,
0.09312539547681808,
0.06811992824077606,
0.351734459400177,
-0.05956466495990753,
0.003752410179004073,
-0.02138179913163185,
-0.022081594914197922,
-0.08475876599550247,
0.1152636706829071,
0.036240942776203156,
-0.05016376078128815,
-0.060312069952487946,
0.05871155485510826,
-0.27000224590301514,
0.11211301386356354,
-0.06050358712673187,
-0.17076633870601654,
-0.052588243037462234,
0.010238281451165676,
0.14075981080532074,
-0.035588398575782776,
0.08023975789546967,
0.030487583950161934,
-0.10247459262609482,
0.03624308481812477,
0.012794030830264091,
-0.22842127084732056,
-0.0478111207485199,
0.08968883752822876,
-0.2084040492773056,
0.02192377671599388,
-0.05168972164392471,
-0.010685263201594353,
0.06260685622692108,
0.0516921803355217,
-0.016452856361865997,
0.02399170957505703,
-0.008136486634612083,
-0.04589851200580597,
-0.012712475843727589,
0.07231123745441437,
0.007835127413272858,
-0.0776398703455925,
0.05517669394612312,
-0.16146564483642578,
0.01053821761161089,
0.01978037692606449,
-0.009134159423410892,
-0.007609064690768719,
0.04341210424900055,
-0.03723232075572014,
0.041424330323934555,
0.04028449207544327,
-0.02011338248848915,
0.023006362840533257,
-0.06843256205320358,
-0.0009209752315655351,
-0.003485884051769972,
-0.05593422055244446,
-0.056344348937273026,
-0.18765072524547577,
-0.13190405070781708,
0.022702891379594803,
-0.029579047113656998,
-0.1542743742465973,
-0.021877428516745567,
-0.12973274290561676,
0.09784332662820816,
-0.18280726671218872,
0.08853758871555328,
0.09333726018667221,
-0.0032202580478042364,
0.014017263427376747,
-0.0003317735972814262,
0.05553226172924042,
0.11906469613313675,
-0.09680069983005524,
-0.0980832651257515
] |
null | null | null | Wkwkwkwk
| {} | null | Ryannandi/Test | [
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#region-us
| Wkwkwkwk
| [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] | [
0.024608636274933815,
-0.026205500587821007,
-0.009666500613093376,
-0.10395516455173492,
0.08638657629489899,
0.059816278517246246,
0.01882290467619896,
0.020661840215325356,
0.23975107073783875,
-0.005599027033895254,
0.1219947561621666,
0.0015615287702530622,
-0.037353623658418655,
0.03733762726187706,
-0.0035912662278860807,
-0.17583473026752472,
0.03876631706953049,
-0.018274923786520958,
0.01843859627842903,
0.026470553129911423,
-0.07776834815740585,
-0.07564429938793182,
0.015296397730708122,
-0.10247814655303955,
-0.083692267537117,
0.11002834886312485,
0.031466204673051834,
-0.019670886918902397,
0.10779199749231339,
-0.04243955761194229,
0.18699054419994354,
-0.011512263678014278,
-0.11213519424200058,
-0.2536850869655609,
0.021806683391332626,
-0.01765260472893715,
-0.08747660368680954,
0.01506110467016697,
0.0665089413523674,
-0.09014441072940826,
-0.0588928684592247,
0.0795099288225174,
-0.01132340170443058,
0.04246443510055542,
-0.27593839168548584,
-0.12684126198291779,
-0.05297930911183357,
-0.1421966552734375,
0.08651168644428253,
0.04035491496324539,
0.008764253929257393,
0.15506891906261444,
-0.20897391438484192,
0.004104613792151213,
0.08255259692668915,
-0.2538507878780365,
0.05591634660959244,
0.17671173810958862,
0.03623908758163452,
0.18037272989749908,
0.0060391901060938835,
0.11029672622680664,
0.0716743916273117,
-0.024263937026262283,
-0.17590197920799255,
-0.08127854019403458,
-0.04696211963891983,
0.16642488539218903,
-0.06727185100317001,
-0.14248386025428772,
0.34701237082481384,
0.00015008423360995948,
0.009657775051891804,
0.16921205818653107,
-0.059524230659008026,
-0.09972117841243744,
0.07259953022003174,
0.016484731808304787,
0.018492350354790688,
0.1471305936574936,
0.16307872533798218,
-0.0458691343665123,
-0.13837823271751404,
-0.018630273640155792,
-0.22798998653888702,
0.17510560154914856,
-0.03248048573732376,
0.13137903809547424,
-0.27447956800460815,
0.01684025302529335,
-0.2570667266845703,
0.0032130838371813297,
0.04178816080093384,
-0.06004921346902847,
-0.0226522795855999,
-0.013265985064208508,
-0.08018817007541656,
0.004899587947875261,
0.06192673370242119,
0.1266920566558838,
-0.06128726154565811,
0.06128238886594772,
-0.09319206327199936,
0.141696035861969,
0.07166698575019836,
0.07868369668722153,
0.13037432730197906,
0.041205424815416336,
-0.07187089323997498,
-0.21872246265411377,
-0.0026476888451725245,
-0.06275863200426102,
-0.09502086788415909,
-0.0020165652967989445,
-0.11606067419052124,
0.17244569957256317,
-0.030802514404058456,
-0.09825427830219269,
-0.11208184063434601,
0.09148659557104111,
-0.032992321997880936,
-0.03437839448451996,
-0.03552987426519394,
-0.020977836102247238,
0.019381176680326462,
0.04704452306032181,
-0.1548958420753479,
-0.005131472367793322,
0.07039852440357208,
0.11502562463283539,
-0.1346137970685959,
-0.003783059772104025,
-0.07908964157104492,
0.03039063885807991,
0.07654735445976257,
-0.16510222852230072,
0.03158547356724739,
-0.1124754324555397,
-0.07531405985355377,
0.002912673633545637,
-0.015710093080997467,
-0.016202643513679504,
0.166526660323143,
-0.0020451415330171585,
0.0714716836810112,
-0.026345307007431984,
-0.05890209600329399,
-0.11243434250354767,
-0.08489254862070084,
0.05390460044145584,
0.03670717030763626,
0.03266148269176483,
-0.2193479984998703,
0.014805203303694725,
-0.12762966752052307,
0.1360815018415451,
-0.10566820204257965,
-0.04705966264009476,
-0.022842247039079666,
0.20562705397605896,
0.037286072969436646,
0.08762791007757187,
-0.22171171009540558,
0.039756543934345245,
-0.05404696613550186,
0.18480908870697021,
-0.1502426266670227,
-0.0799463614821434,
0.20813211798667908,
-0.07964949309825897,
-0.10115210711956024,
0.021235812455415726,
0.020391687750816345,
0.026287272572517395,
0.0766737088561058,
0.4564172327518463,
-0.09766800701618195,
-0.09146861732006073,
0.10178250074386597,
0.17055274546146393,
-0.12427149713039398,
-0.1827561855316162,
0.06446871906518936,
-0.16666454076766968,
-0.1973118633031845,
0.0018917324487119913,
0.09222044050693512,
0.038269978016614914,
-0.07875611633062363,
-0.020746968686580658,
0.06325206160545349,
-0.0007678253459744155,
0.09095914661884308,
0.03755716234445572,
0.09034032374620438,
-0.08716782182455063,
0.11115926504135132,
-0.05017651244997978,
0.004037132486701012,
0.1343354731798172,
0.027325427159667015,
-0.03223329409956932,
0.08694463223218918,
-0.0485352948307991,
0.05295134335756302,
-0.1662379503250122,
-0.15068690478801727,
0.03398871049284935,
0.06283251196146011,
0.03186952322721481,
0.1280253529548645,
0.08141885697841644,
-0.10732853412628174,
0.022690722718834877,
-0.004228927195072174,
0.058398615568876266,
0.03891623765230179,
0.006107209715992212,
0.008764320984482765,
0.0961301177740097,
-0.10607069730758667,
-0.13589619100093842,
-0.07336436957120895,
-0.014715781435370445,
0.14371353387832642,
-0.0302802175283432,
0.07690227776765823,
-0.004240254405885935,
0.00013200697139836848,
0.06930823624134064,
0.08137880265712738,
0.016412746161222458,
0.08971183747053146,
-0.05237193778157234,
-0.05160155147314072,
0.10863113403320312,
-0.13533565402030945,
0.17837053537368774,
0.14053137600421906,
-0.20532016456127167,
0.029453208670020103,
-0.06838275492191315,
0.03670361638069153,
-0.008162540383636951,
0.0975119024515152,
-0.08272241055965424,
-0.02106042578816414,
0.013134466484189034,
0.0052274600602686405,
-0.013007243163883686,
0.017682146281003952,
-0.07295988500118256,
-0.07787393033504486,
-0.10233919322490692,
0.08436838537454605,
0.11562882363796234,
-0.10282530635595322,
0.14214380085468292,
0.4384984076023102,
0.11495281755924225,
0.21582984924316406,
-0.09581480920314789,
-0.0412987545132637,
0.007486371789127588,
0.0001535322517156601,
-0.04476691037416458,
0.08031861484050751,
-0.15973517298698425,
-0.038901735097169876,
0.027348900213837624,
0.07128690183162689,
0.11475157737731934,
-0.14959022402763367,
-0.09639324247837067,
-0.00793045200407505,
0.0022841424215584993,
-0.1249532699584961,
0.023905446752905846,
-0.03974650055170059,
0.04015624523162842,
0.07232289016246796,
-0.021535737439990044,
0.13939237594604492,
-0.04166141897439957,
-0.0639561116695404,
0.07585346698760986,
-0.2017085999250412,
-0.23179671168327332,
-0.12309670448303223,
-0.14680525660514832,
0.04366797208786011,
0.05154111236333847,
0.01726446859538555,
-0.17635835707187653,
-0.015074856579303741,
0.07706750929355621,
0.07820965349674225,
-0.20886357128620148,
-0.022814949974417686,
-0.004290030337870121,
0.0895976573228836,
-0.10227091610431671,
-0.0017130117630586028,
-0.04419664293527603,
-0.10150232166051865,
0.0017003051470965147,
0.07279510796070099,
-0.137485533952713,
0.13807645440101624,
0.21589438617229462,
0.07225540280342102,
0.07359948754310608,
-0.019093448296189308,
0.09936179965734482,
-0.10856141895055771,
-0.16549113392829895,
0.08348225057125092,
-0.06234746053814888,
0.047262318432331085,
0.17534415423870087,
0.03307317942380905,
-0.13904969394207,
-0.015682822093367577,
-0.0402069091796875,
-0.15603256225585938,
-0.238995760679245,
-0.09178274869918823,
-0.1182505264878273,
0.16442428529262543,
0.0009358620154671371,
0.06651917099952698,
0.08258313685655594,
-0.022042419761419296,
0.16447891294956207,
-0.07379321753978729,
-0.07578866183757782,
-0.006978808436542749,
0.12375060468912125,
-0.056660156697034836,
-0.03080669604241848,
-0.10566964000463486,
-0.008295975625514984,
0.1151021271944046,
0.15304014086723328,
0.12214863300323486,
0.2957419455051422,
0.08268889784812927,
0.026645636186003685,
0.08958091586828232,
0.17622539401054382,
0.09495089203119278,
0.07838419824838638,
-0.045413073152303696,
-0.014814783819019794,
0.014317171648144722,
-0.04022889584302902,
0.010141594335436821,
0.14683100581169128,
-0.2679629921913147,
-0.006678564939647913,
-0.2710230350494385,
0.0965198427438736,
-0.10913380235433578,
0.11837165057659149,
-0.01015760749578476,
0.10194015502929688,
0.11082887649536133,
0.03233652561903,
-0.03858073800802231,
0.16613617539405823,
0.08450309932231903,
-0.11277695000171661,
0.001758623169735074,
0.03737903758883476,
0.09715615212917328,
-0.02818971499800682,
0.12721189856529236,
-0.11048974841833115,
-0.1464834064245224,
0.013753619976341724,
0.07152791321277618,
-0.15373679995536804,
0.3138748109340668,
0.012069208547472954,
-0.13481520116329193,
-0.01481647603213787,
-0.09957809001207352,
-0.006440147757530212,
0.1254177987575531,
0.09333524852991104,
0.07935678958892822,
-0.2185502052307129,
-0.13339371979236603,
0.05872276425361633,
-0.00575496768578887,
0.22408108413219452,
-0.034034017473459244,
-0.11356475204229355,
-0.027013886719942093,
0.04241163283586502,
-0.06043251231312752,
0.08524788916110992,
0.023536119610071182,
-0.08113526552915573,
-0.032957352697849274,
0.05323701351881027,
0.012368366122245789,
0.00524376705288887,
0.09360801428556442,
0.020107939839363098,
-0.0009265501867048442,
0.01785753294825554,
0.047885000705718994,
-0.0675911232829094,
-0.1984109878540039,
0.09357594698667526,
-0.05215044692158699,
0.0015536568826064467,
-0.08013670891523361,
-0.15122665464878082,
-0.08837161958217621,
-0.16009655594825745,
0.12540200352668762,
-0.034406669437885284,
0.12700119614601135,
-0.06619787961244583,
0.17341409623622894,
-0.07871770113706589,
0.04481020197272301,
-0.047349292784929276,
0.050332702696323395,
-0.007268077693879604,
-0.07756082713603973,
0.16585899889469147,
-0.15564003586769104,
0.01809087023139,
0.19572502374649048,
-0.018915493041276932,
0.07177707552909851,
0.021322092041373253,
-0.0636206790804863,
0.23147478699684143,
0.3014698624610901,
0.008138049393892288,
0.1665448248386383,
0.3018903136253357,
-0.07466315478086472,
-0.2642788887023926,
-0.05505012720823288,
-0.2841376066207886,
-0.05371501296758652,
0.10716094076633453,
-0.22523896396160126,
0.06986407935619354,
0.14383509755134583,
-0.06471995264291763,
0.30228954553604126,
-0.21825523674488068,
0.012589273042976856,
0.15434536337852478,
-0.08868814259767532,
0.5515313148498535,
-0.1133413165807724,
-0.17677772045135498,
-0.008122089318931103,
-0.08741296827793121,
0.10602109134197235,
-0.0340677872300148,
0.06877441704273224,
0.013465235009789467,
0.04797380417585373,
0.048932258039712906,
-0.03111894056200981,
0.22701001167297363,
0.008710170164704323,
0.09015397727489471,
-0.07378865778446198,
-0.18624304234981537,
0.11639340221881866,
-0.04359482601284981,
-0.08891059458255768,
0.0849778801202774,
-0.05942516401410103,
-0.11078983545303345,
0.04663389176130295,
-0.07950539886951447,
-0.024862350896000862,
0.08423490077257156,
-0.04678233340382576,
-0.042606171220541,
-0.008054176345467567,
-0.1618063747882843,
-0.0002289071271661669,
0.31360217928886414,
-0.07096036523580551,
0.16695955395698547,
0.03677211329340935,
0.00038613268407061696,
-0.11027684062719345,
0.030288029462099075,
-0.05203165486454964,
-0.021576624363660812,
0.09578979015350342,
-0.11096979677677155,
0.03204701095819473,
0.14160704612731934,
-0.04864364117383957,
0.05846960097551346,
0.09256096184253693,
-0.0849417969584465,
0.007583672646433115,
0.17753590643405914,
-0.17537221312522888,
-0.1273445188999176,
-0.006135711446404457,
-0.09862716495990753,
0.14055661857128143,
0.04394126310944557,
0.05191568285226822,
0.16669964790344238,
0.03967129811644554,
-0.029474308714270592,
-0.02817419543862343,
-0.1153380498290062,
-0.0201893113553524,
0.040153320878744125,
0.00045633706031367183,
-0.08791285753250122,
0.2262638509273529,
0.06409153342247009,
-0.1328488290309906,
-0.051157206296920776,
0.2161225974559784,
-0.06805316358804703,
-0.04911920800805092,
-0.223562553524971,
0.10752306133508682,
-0.07112517952919006,
-0.0965060144662857,
0.05453834682703018,
-0.02270081453025341,
0.005106312222778797,
0.181985542178154,
0.03941008821129799,
0.11070270836353302,
0.03738937899470329,
-0.02448922023177147,
0.15798696875572205,
-0.142850860953331,
-0.14191335439682007,
-0.025354057550430298,
-0.08757315576076508,
-0.13844476640224457,
-0.026804137974977493,
0.1617041826248169,
-0.09177309274673462,
-0.14772607386112213,
-0.2621181011199951,
0.10968475043773651,
-0.16432365775108337,
-0.10192688554525375,
-0.03469514101743698,
-0.08968492597341537,
0.0696166530251503,
0.030301768332719803,
-0.03093348816037178,
-0.06706760823726654,
-0.18593791127204895,
0.0816768929362297,
0.06349513679742813,
0.045533183962106705,
-0.017847947776317596,
0.0067379772663116455,
0.1720137596130371,
0.025955144315958023,
0.10040043294429779,
0.16762186586856842,
0.011397695168852806,
0.2246655523777008,
-0.1671202927827835,
-0.11496317386627197,
0.1336962729692459,
-0.026543032377958298,
0.06762003898620605,
0.16792191565036774,
-0.0772583931684494,
0.015526676550507545,
-0.028136352077126503,
0.07066910713911057,
-0.11003983020782471,
-0.105624258518219,
0.007937257178127766,
0.02567129209637642,
-0.2755882740020752,
-0.005599735304713249,
-0.19717298448085785,
0.14788752794265747,
0.02579621411859989,
0.03297143429517746,
0.10257530212402344,
0.10404334217309952,
0.08312062919139862,
-0.0017710148822516203,
0.03226327523589134,
-0.1176818460226059,
0.02753005363047123,
-0.059239376336336136,
-0.020663779228925705,
0.017624232918024063,
0.36952024698257446,
-0.03603357449173927,
-0.046802736818790436,
0.003710439894348383,
0.1307835876941681,
-0.02139742486178875,
0.017395347356796265,
0.13209912180900574,
0.12607666850090027,
-0.08595693111419678,
-0.1504845917224884,
0.04888554662466049,
-0.04565655067563057,
-0.02836887165904045,
0.1464131623506546,
0.05905961990356445,
0.1050296202301979,
0.0908031314611435,
-0.014463032595813274,
-0.00318976235575974,
0.012856799177825451,
-0.15486004948616028,
0.06223496049642563,
-0.010558074340224266,
0.012565906159579754,
0.017934376373887062,
0.15238402783870697,
-0.005540105979889631,
0.07739730179309845,
-0.09889880567789078,
0.004208535887300968,
-0.13498884439468384,
-0.07913459837436676,
0.03617347031831741,
-0.13393273949623108,
0.04141177982091904,
-0.01871878281235695,
0.029611799865961075,
0.30386561155319214,
0.02558239921927452,
-0.020639164373278618,
0.12512871623039246,
-0.1214587539434433,
-0.12050267308950424,
-0.001594188273884356,
-0.029960084706544876,
0.0791488066315651,
-0.02633434161543846,
-0.0997740775346756,
-0.1001306027173996,
-0.15166029334068298,
-0.09759195148944855,
0.05182836204767227,
-0.04993441700935364,
-0.059362251311540604,
-0.17634081840515137,
-0.05707859992980957,
-0.05147340148687363,
0.14025864005088806,
-0.12263951450586319,
0.15159130096435547,
-0.014490418136119843,
0.004084470681846142,
0.04405883327126503,
0.1950942426919937,
-0.03644494712352753,
0.08714226633310318,
0.0154351145029068,
0.1522706001996994,
-0.05119588226079941,
0.14720745384693146,
-0.10931728035211563,
-0.04014137014746666,
-0.06710435450077057,
0.21513493359088898,
0.25630924105644226,
-0.06136954948306084,
-0.008937356993556023,
-0.012760217301547527,
0.058654606342315674,
0.1073930487036705,
0.16049085557460785,
0.002326392102986574,
0.2802925705909729,
-0.03133585304021835,
0.04815128445625305,
0.02901598811149597,
0.013607407920062542,
-0.06336209923028946,
0.03397751972079277,
0.07539387792348862,
-0.035039983689785004,
-0.1412304788827896,
0.15837742388248444,
-0.21980468928813934,
0.18157227337360382,
0.11640069633722305,
-0.19996967911720276,
-0.013728445395827293,
-0.04882071167230606,
0.1689416468143463,
-0.0856364443898201,
0.1637246012687683,
-0.0903693437576294,
-0.2108195722103119,
-0.2056000679731369,
0.03867346793413162,
-0.34623071551322937,
-0.254462867975235,
0.10422009229660034,
0.1488201916217804,
0.04015883058309555,
-0.018507536500692368,
-0.019967829808592796,
-0.018367022275924683,
0.04877542704343796,
-0.0067357709631323814,
0.06014643982052803,
0.031397558748722076,
-0.02988368645310402,
-0.24127542972564697,
-0.029804671183228493,
0.023964406922459602,
-0.07093082368373871,
0.07464958727359772,
-0.06874357163906097,
-0.022495782002806664,
0.08059766888618469,
-0.03066304884850979,
0.03298592567443848,
-0.035373736172914505,
-0.16326889395713806,
0.027529051527380943,
0.03900543600320816,
0.036012712866067886,
0.00634160777553916,
0.0008072225609794259,
-0.03455270454287529,
0.0644603744149208,
-0.16716794669628143,
-0.16015739738941193,
0.14140215516090393,
-0.06745140254497528,
0.2779497504234314,
-0.05812826007604599,
-0.0809100940823555,
0.04766704887151718,
-0.03426874056458473,
0.1807648241519928,
-0.07756473124027252,
0.047254521399736404,
0.12766779959201813,
0.011127962730824947,
0.03121316432952881,
-0.3092964291572571,
0.11082969605922699,
-0.000795336440205574,
-0.006093299947679043,
-0.07581598311662674
] |
null | null | transformers |
# Rick DialoGPT model
| {"tags": ["conversational"]} | text-generation | Ryukie/DialoGPT-small-Rick | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Rick DialoGPT model
| [
"# Rick DialoGPT model"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Rick DialoGPT model"
] | [
51,
7
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Rick DialoGPT model"
] | [
-0.026307471096515656,
0.09394720941781998,
-0.0053693922236561775,
0.013341153040528297,
0.13122393190860748,
-0.0001271809305762872,
0.15153969824314117,
0.13532139360904694,
-0.012149355374276638,
-0.04893339425325394,
0.1376495659351349,
0.20727398991584778,
-0.009282008744776249,
0.06632285565137863,
-0.07689982652664185,
-0.3263074457645416,
0.0541488416492939,
0.06074703484773636,
-0.028742507100105286,
0.11803784966468811,
0.1016596332192421,
-0.034433167427778244,
0.07546462118625641,
0.007671080995351076,
-0.14394399523735046,
0.009073670022189617,
0.018343722447752953,
-0.11193355917930603,
0.11017581075429916,
0.07047469913959503,
0.032935649156570435,
0.04516134411096573,
-0.045610230416059494,
-0.12734128534793854,
0.04444316774606705,
-0.0007074338500387967,
-0.044366732239723206,
0.05960826575756073,
0.017957383766770363,
-0.08949330449104309,
0.11716856807470322,
0.12799543142318726,
-0.013094340451061726,
0.03880066052079201,
-0.15538695454597473,
-0.0017994411755353212,
-0.012053873389959335,
0.0691082701086998,
0.06047862395644188,
0.10783222317695618,
-0.03909517079591751,
0.11795765906572342,
-0.06376834213733673,
0.1156197339296341,
0.1099555715918541,
-0.29319506883621216,
-0.015066631138324738,
0.1453801691532135,
0.04404941946268082,
0.04553242400288582,
-0.03810311481356621,
0.0971064567565918,
0.018297934904694557,
-0.009197480045258999,
-0.04527812451124191,
-0.07937059551477432,
-0.0785442367196083,
0.022468965500593185,
-0.08652039617300034,
-0.01278676837682724,
0.25185099244117737,
-0.034276217222213745,
0.07775937020778656,
-0.07710754126310349,
-0.0882781445980072,
-0.015914008021354675,
-0.035491663962602615,
-0.03474625200033188,
-0.10360077768564224,
0.07954097539186478,
-0.03638206794857979,
-0.09738437831401825,
-0.11328651010990143,
-0.030514540150761604,
-0.16600777208805084,
0.18022817373275757,
0.028647636994719505,
0.033171504735946655,
-0.22669106721878052,
0.09721154719591141,
-0.014040542766451836,
-0.09817730635404587,
0.021137695759534836,
-0.0875508189201355,
0.01331646554172039,
0.01670416072010994,
-0.025239771232008934,
-0.0016303120646625757,
0.08401314169168472,
0.11467883735895157,
0.018850738182663918,
0.01854829490184784,
-0.013761704787611961,
0.05011213198304176,
0.04207006096839905,
0.0688997432589531,
-0.02060912922024727,
-0.02804442308843136,
0.026006873697042465,
-0.0941164419054985,
-0.013511229306459427,
-0.06520073860883713,
-0.1999543458223343,
-0.008704649284482002,
0.05683811381459236,
0.06101927533745766,
0.040125854313373566,
0.1364995390176773,
0.006989735644310713,
-0.04964334890246391,
0.04230933263897896,
-0.01836281083524227,
-0.01648184284567833,
0.011581381782889366,
0.0030591548420488834,
0.1451946645975113,
0.013021725229918957,
0.04964533448219299,
-0.11686590313911438,
0.006823766510933638,
-0.04556811600923538,
-0.021192600950598717,
-0.035848744213581085,
-0.0534006804227829,
-0.011447814293205738,
-0.023781660944223404,
0.01566874049603939,
-0.138215109705925,
-0.16868597269058228,
-0.010750913992524147,
-0.0054442137479782104,
-0.04368012771010399,
-0.11985860019922256,
-0.10691071301698685,
-0.03188803046941757,
0.04130145534873009,
-0.06226013973355293,
-0.0016179383965209126,
-0.04824138432741165,
0.09329771250486374,
-0.03543361276388168,
0.07572447508573532,
-0.09963777661323547,
0.08339086174964905,
-0.07048105448484421,
-0.04222751781344414,
-0.07689815759658813,
0.13199064135551453,
0.012147852219641209,
0.05355143919587135,
-0.033320385962724686,
-0.020182406529784203,
-0.10112729668617249,
0.07901765406131744,
-0.043776735663414,
0.23530089855194092,
-0.09424618631601334,
-0.10172633081674576,
0.2707865536212921,
-0.056338634341955185,
-0.1417124718427658,
0.10922730714082718,
-0.01708115078508854,
0.11796516180038452,
0.12533797323703766,
0.18160980939865112,
0.06915249675512314,
0.007078608963638544,
0.07667984813451767,
0.11271748691797256,
-0.07647018134593964,
-0.019419031217694283,
0.013309413567185402,
-0.019119203090667725,
-0.07986441254615784,
0.023063944652676582,
0.0750250369310379,
0.05241791531443596,
-0.05419831722974777,
-0.01576150581240654,
0.004038041457533836,
0.0046436842530965805,
0.05485425144433975,
-0.027098819613456726,
0.12380187958478928,
-0.026550456881523132,
-0.07160753011703491,
-0.02819196693599224,
0.02900422178208828,
-0.05955371633172035,
0.03213973343372345,
-0.08394158631563187,
0.03701850399374962,
-0.01614903286099434,
0.0714651495218277,
-0.16583749651908875,
-0.09424386918544769,
-0.05107835307717323,
0.18867318332195282,
0.06760090589523315,
0.12099006772041321,
0.05622575059533119,
-0.06824762374162674,
-0.002587922615930438,
0.017642609775066376,
0.19762735068798065,
-0.01621684804558754,
-0.07617255300283432,
-0.09709285944700241,
0.10087143629789352,
-0.07233986258506775,
0.06148630753159523,
-0.04969713091850281,
0.01726907305419445,
0.020254770293831825,
0.10257863998413086,
-0.03401493653655052,
0.04154540225863457,
0.01299088355153799,
-0.03159681335091591,
-0.06118636205792427,
-0.004885436967015266,
0.09689072519540787,
0.0024414120707660913,
-0.10660914331674576,
0.24094536900520325,
-0.1953991800546646,
0.1240597665309906,
0.17735153436660767,
-0.19901056587696075,
0.0003240618098061532,
-0.1216038390994072,
-0.02617034688591957,
0.011324791237711906,
0.03988710418343544,
-0.03907903656363487,
0.24041828513145447,
-0.009465505369007587,
0.16597111523151398,
-0.035453081130981445,
-0.04237295687198639,
-0.04102778434753418,
-0.049226678907871246,
0.010536805726587772,
0.11326293647289276,
0.10443516075611115,
-0.17280073463916779,
0.17910873889923096,
0.056295476853847504,
0.046068836003541946,
0.16674117743968964,
0.017407290637493134,
0.02044367603957653,
0.06752368807792664,
0.0022293536458164454,
-0.033244188874959946,
-0.07607504725456238,
-0.21106134355068207,
-0.02076302096247673,
0.07983379065990448,
0.049628254026174545,
0.10880406200885773,
-0.10326145589351654,
-0.03147902712225914,
-0.011183910071849823,
-0.022843707352876663,
0.033011820167303085,
0.13939867913722992,
0.012945251539349556,
0.12727203965187073,
-0.023822369053959846,
-0.06920848041772842,
0.07053513079881668,
0.015494134277105331,
-0.08573633432388306,
0.19461224973201752,
-0.10708330571651459,
-0.3392099142074585,
-0.10560698807239532,
-0.18618375062942505,
-0.058684613555669785,
0.046016938984394073,
0.11348334699869156,
-0.13952569663524628,
-0.02209191396832466,
0.004345404449850321,
0.06573960930109024,
-0.11287117004394531,
0.009793287143111229,
-0.03862380236387253,
-0.016277508810162544,
-0.13408011198043823,
-0.10326490551233292,
-0.05437853932380676,
-0.04450221359729767,
-0.05730228126049042,
0.12242799997329712,
-0.1566833108663559,
0.02020316757261753,
0.23720775544643402,
0.059179581701755524,
0.07124503701925278,
-0.03769661858677864,
0.1781967580318451,
-0.1026381328701973,
0.011783134192228317,
0.21348711848258972,
-0.03863108903169632,
0.06649045646190643,
0.11449605971574783,
-0.015088263899087906,
-0.06868837028741837,
0.037556882947683334,
-0.010503022000193596,
-0.07316595315933228,
-0.21276232600212097,
-0.1191236600279808,
-0.10972695052623749,
0.05501949414610863,
0.04922853037714958,
0.05032423511147499,
0.16298291087150574,
0.07668996602296829,
-0.04904390871524811,
0.001584999030455947,
0.059760358184576035,
0.08361341804265976,
0.2510400712490082,
-0.06171604245901108,
0.14055393636226654,
-0.025399193167686462,
-0.17000524699687958,
0.06215828284621239,
0.07098336517810822,
0.09605202078819275,
0.058832839131355286,
0.10240022093057632,
0.008178656920790672,
0.008458747528493404,
0.12824994325637817,
0.07315782457590103,
0.008122536353766918,
-0.03451789915561676,
-0.04043253883719444,
-0.03696576505899429,
-0.016908906400203705,
0.031117239966988564,
0.045386575162410736,
-0.16926494240760803,
-0.019026830792427063,
0.010437270626425743,
0.05824199318885803,
0.019692573696374893,
0.08370969444513321,
-0.18680240213871002,
-0.014447773806750774,
0.06624781340360641,
-0.00982279609888792,
-0.11607903242111206,
0.08325130492448807,
-0.00006138256867416203,
-0.11141069233417511,
0.035668812692165375,
-0.028373252600431442,
0.13211531937122345,
-0.09077433496713638,
0.07334637641906738,
-0.12124873697757721,
-0.037298139184713364,
-0.010899096727371216,
0.12348722666501999,
-0.29471105337142944,
0.19157877564430237,
-0.010260001756250858,
-0.0427047498524189,
-0.10745837539434433,
-0.014660571701824665,
0.02802271954715252,
0.10659189522266388,
0.10804633796215057,
-0.019849471747875214,
-0.023438751697540283,
0.060438092797994614,
-0.07315054535865784,
0.03891922906041145,
0.097873255610466,
-0.06623189896345139,
-0.01136084459722042,
-0.05090735852718353,
0.0011880729580298066,
0.009862029924988747,
-0.11200348287820816,
0.025104360654950142,
-0.19207465648651123,
0.08702947199344635,
0.07904490828514099,
0.0971948653459549,
0.037171486765146255,
-0.027824964374303818,
-0.0743110403418541,
0.2623932957649231,
0.012094305828213692,
-0.10155316442251205,
-0.10871628671884537,
0.007302213925868273,
0.04922667145729065,
-0.07690352946519852,
-0.01569143310189247,
-0.0692969411611557,
0.04146941378712654,
-0.06448405981063843,
-0.18443478643894196,
0.11724456399679184,
-0.09758520871400833,
-0.030543897300958633,
-0.03396443650126457,
0.20980419218540192,
-0.0315423421561718,
0.01623861864209175,
0.04493803158402443,
-0.008569825440645218,
-0.1175440177321434,
-0.10942834615707397,
-0.002397046657279134,
-0.004322136752307415,
0.0162108913064003,
0.025200236588716507,
-0.03236880153417587,
-0.007920696400105953,
-0.06642638146877289,
-0.013425208628177643,
0.3191275894641876,
0.12687638401985168,
-0.04024256765842438,
0.1487187147140503,
0.10082816332578659,
-0.06358374655246735,
-0.29419586062431335,
-0.1110854372382164,
-0.07446591556072235,
-0.056805163621902466,
-0.09995641559362411,
-0.181660994887352,
0.08668994903564453,
-0.05448218435049057,
-0.013539828360080719,
0.09316461533308029,
-0.2523694932460785,
-0.10253608971834183,
0.2015150934457779,
-0.03351643308997154,
0.4285542964935303,
-0.1115080714225769,
-0.0785619467496872,
-0.048229627311229706,
-0.14018340408802032,
0.191275492310524,
0.0058190408162772655,
0.10601352155208588,
-0.0002009188465308398,
0.19864386320114136,
0.0566282719373703,
-0.0012837350368499756,
0.0701557844877243,
0.020472673699259758,
-0.05497356504201889,
-0.09246667474508286,
-0.09015149623155594,
-0.03168601170182228,
0.010109071619808674,
0.031798794865608215,
-0.07328334450721741,
0.04371890053153038,
-0.13138733804225922,
-0.05411645025014877,
-0.08432826399803162,
0.037648119032382965,
0.027296165004372597,
-0.06869357079267502,
-0.002916166791692376,
-0.04816993325948715,
0.0016994841862469912,
0.007918842136859894,
0.20620933175086975,
-0.10779520869255066,
0.14090755581855774,
0.033961765468120575,
0.15093906223773956,
-0.09600444883108139,
-0.047604095190763474,
-0.06139858439564705,
-0.05390927195549011,
0.07211017608642578,
-0.12229318171739578,
0.03165985643863678,
0.10379531234502792,
-0.029136769473552704,
0.08749980479478836,
0.1109134703874588,
-0.011824870482087135,
0.005104703363031149,
0.09050401300191879,
-0.24321404099464417,
-0.06898721307516098,
-0.0817435085773468,
0.05277859792113304,
0.058722641319036484,
0.1027708351612091,
0.20902559161186218,
0.006313250865787268,
-0.028814585879445076,
0.021288873627781868,
0.02745843306183815,
-0.01872197911143303,
0.061469677835702896,
0.009840353392064571,
0.029417859390378,
-0.14658574759960175,
0.04447735473513603,
-0.01224545668810606,
-0.09172675013542175,
0.025912616401910782,
0.14410899579524994,
-0.11102084815502167,
-0.122079037129879,
-0.0382813923060894,
0.13729313015937805,
-0.14815719425678253,
-0.0117354029789567,
-0.04719110578298569,
-0.12447933107614517,
0.06743268668651581,
0.10455726832151413,
0.04603260010480881,
0.04043993353843689,
-0.09203767031431198,
-0.026612624526023865,
-0.05527783930301666,
-0.0005033746710978448,
0.03146493062376976,
-0.018926745280623436,
-0.0531751811504364,
0.04221292957663536,
-0.036750372499227524,
0.11927161365747452,
-0.08585978299379349,
-0.10035143792629242,
-0.1677810102701187,
0.03577928990125656,
-0.06987136602401733,
-0.08834805339574814,
-0.087708480656147,
-0.03590716794133186,
0.006336551625281572,
-0.0390448272228241,
-0.028196200728416443,
-0.03421054035425186,
-0.11329267919063568,
0.03182552382349968,
-0.04582337662577629,
0.002282749628648162,
-0.06941496580839157,
0.02934737130999565,
0.0527348555624485,
-0.03012561798095703,
0.14879247546195984,
0.14134863018989563,
-0.1120486781001091,
0.09441060572862625,
-0.14837662875652313,
-0.06795312464237213,
0.09710580110549927,
0.020345306023955345,
0.05206388980150223,
0.05022000893950462,
0.009869642555713654,
0.052856672555208206,
0.06019807979464531,
0.042121678590774536,
0.018030282109975815,
-0.07670415937900543,
0.067986860871315,
-0.06001577153801918,
-0.10277701914310455,
-0.05126209184527397,
-0.0008493204950354993,
0.01807907409965992,
0.07281779497861862,
0.10074610263109207,
-0.05552934482693672,
0.09382583200931549,
-0.05530373752117157,
0.04429418221116066,
0.025981837883591652,
-0.1773741990327835,
0.035779327154159546,
-0.08758020401000977,
0.048719268292188644,
0.010204999707639217,
0.17491371929645538,
0.022077683359384537,
-0.021517012268304825,
0.021784497424960136,
0.07043316960334778,
0.04651212692260742,
-0.014028544537723064,
0.19185031950473785,
0.10516584664583206,
-0.038725994527339935,
-0.0838618278503418,
0.09701808542013168,
0.04364638775587082,
0.04230160266160965,
0.14381137490272522,
-0.05909604951739311,
-0.03927518054842949,
0.08041166514158249,
-0.004114777781069279,
0.010739325545728207,
-0.10045387595891953,
-0.13616950809955597,
-0.0242715273052454,
0.03822272643446922,
-0.03613738715648651,
0.10484866797924042,
0.15927128493785858,
-0.003432835452258587,
0.018933836370706558,
-0.014741482213139534,
-0.059460677206516266,
-0.1985488384962082,
-0.1984495222568512,
-0.08465894311666489,
-0.13742054998874664,
0.0053448243997991085,
-0.13740120828151703,
0.04168961942195892,
0.0266414787620306,
0.09802933782339096,
-0.04659690335392952,
0.053891927003860474,
0.0365489162504673,
-0.11079959571361542,
0.05490271374583244,
-0.04384707659482956,
0.0921567976474762,
-0.03172128275036812,
0.013953625224530697,
-0.061154644936323166,
0.03533269092440605,
0.015680816024541855,
0.04335138946771622,
-0.02730260044336319,
0.020079512149095535,
-0.12365023791790009,
-0.08511123806238174,
-0.0687124952673912,
0.06635203957557678,
0.006271391175687313,
0.1730542927980423,
0.01871834136545658,
-0.030511673539876938,
0.029879208654165268,
0.23771421611309052,
-0.07247906178236008,
-0.0998891219496727,
-0.07002349942922592,
0.2104433923959732,
-0.008601457811892033,
0.08981921523809433,
-0.036056190729141235,
0.010331481695175171,
-0.08574481308460236,
0.34977883100509644,
0.29154229164123535,
-0.09423467516899109,
0.010944767855107784,
-0.0018926061457023025,
0.041570257395505905,
0.1271265298128128,
0.09409935772418976,
0.10600660741329193,
0.2914230525493622,
-0.06560413539409637,
-0.034995418041944504,
-0.00870916061103344,
-0.024741534143686295,
-0.056739576160907745,
0.05402141064405441,
0.055680010467767715,
-0.06467366218566895,
-0.017051953822374344,
0.11837512999773026,
-0.2507799565792084,
0.060523539781570435,
-0.1586569845676422,
-0.15863612294197083,
-0.07149337232112885,
-0.0005927369347773492,
0.09427593648433685,
0.013758930377662182,
0.09756780415773392,
-0.01269946526736021,
-0.06760623306035995,
0.04297604411840439,
0.020308442413806915,
-0.20682843029499054,
0.010318391025066376,
0.0697605311870575,
-0.047236718237400055,
-0.05558694526553154,
-0.016078975051641464,
0.07278840243816376,
0.08733747154474258,
0.02829739637672901,
-0.02315342053771019,
0.04390132799744606,
-0.011941233649849892,
-0.07413468509912491,
0.03934718668460846,
0.023901786655187607,
0.004186868667602539,
-0.08726628869771957,
0.07557696849107742,
-0.16110792756080627,
0.033335130661726,
-0.0028987054247409105,
-0.0468905046582222,
-0.015854276716709137,
0.027530863881111145,
-0.062161169946193695,
0.08502498269081116,
0.08475040644407272,
-0.017108825966715813,
-0.014891952276229858,
-0.02289215475320816,
-0.010616268031299114,
-0.021719465032219887,
-0.08192946016788483,
-0.09613069146871567,
-0.15460480749607086,
-0.12901735305786133,
0.08913417160511017,
-0.0020764595828950405,
-0.20097172260284424,
0.030461549758911133,
-0.12281855940818787,
0.0440378300845623,
-0.12149405479431152,
0.0972503200173378,
0.08258824050426483,
0.023108746856451035,
-0.004436559043824673,
0.00674563180655241,
0.03467189148068428,
0.07904570549726486,
-0.13723783195018768,
-0.08609636127948761
] |
null | null | transformers |
# DialoGPT chat bot model using discord messages as data | {"tags": ["conversational"]} | text-generation | S34NtheGuy/DialoGPT-medium-Glass_Of_Water | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# DialoGPT chat bot model using discord messages as data | [
"# DialoGPT chat bot model using discord messages as data"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# DialoGPT chat bot model using discord messages as data"
] | [
51,
13
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# DialoGPT chat bot model using discord messages as data"
] | [
-0.02911895327270031,
0.003995177801698446,
-0.0046684942208230495,
-0.012396533973515034,
0.11202579736709595,
-0.020793797448277473,
0.19453662633895874,
0.0904533639550209,
0.11296340823173523,
-0.04555835947394371,
0.05478145554661751,
0.14494860172271729,
0.02601793222129345,
0.2345304638147354,
-0.07001583278179169,
-0.2198491394519806,
0.06904508173465729,
0.0020564962178468704,
0.07120723277330399,
0.12327651679515839,
0.0893358588218689,
-0.027994820848107338,
0.09009858965873718,
0.016869334504008293,
-0.19000788033008575,
-0.005467485636472702,
0.01664290763437748,
-0.08414720743894577,
0.0951712355017662,
0.05692970007658005,
0.027007173746824265,
0.022588100284337997,
-0.054826367646455765,
-0.07715894281864166,
0.04409089311957359,
-0.015045171603560448,
-0.03291803225874901,
0.042206596583127975,
-0.06107683107256889,
-0.10459832847118378,
0.18928758800029755,
0.11292500793933868,
0.02833588793873787,
0.07569966465234756,
-0.1312641203403473,
0.011802727356553078,
-0.04876156896352768,
0.02216631919145584,
0.14013226330280304,
0.12497154623270035,
-0.07691969722509384,
0.16699260473251343,
-0.07476947456598282,
0.10180847346782684,
0.05837265029549599,
-0.3295818567276001,
-0.010758930817246437,
0.12642543017864227,
0.030477117747068405,
0.09977107495069504,
-0.018700912594795227,
0.04719923064112663,
0.011075117625296116,
0.027997151017189026,
-0.0939258560538292,
-0.03323792293667793,
-0.14734824001789093,
-0.03544525429606438,
-0.09491617977619171,
-0.024447239935398102,
0.269292950630188,
0.005437672603875399,
0.05558373034000397,
-0.11964231729507446,
-0.07344776391983032,
-0.07583962380886078,
-0.07977471500635147,
-0.021205732598900795,
-0.11365412920713425,
0.06104221194982529,
-0.011485567316412926,
-0.08311361074447632,
-0.11278795450925827,
-0.043469205498695374,
-0.18289406597614288,
0.0821399837732315,
0.0335225835442543,
0.07892793416976929,
-0.2801816165447235,
0.08420786261558533,
0.04518672078847885,
-0.0835626944899559,
0.019823530688881874,
-0.08192607015371323,
-0.04286707937717438,
-0.006278423126786947,
-0.03693044185638428,
-0.0884459912776947,
0.10716810822486877,
0.1580084264278412,
-0.05083105340600014,
0.04353693500161171,
-0.07717008143663406,
0.0446944534778595,
0.10556173324584961,
0.03715191408991814,
-0.00912248995155096,
0.006465920712798834,
0.05436272174119949,
-0.11213027685880661,
0.01778799295425415,
-0.03295382484793663,
-0.1692671924829483,
0.01611638441681862,
0.02191336080431938,
0.08422008901834488,
0.042106952518224716,
0.13374918699264526,
-0.045255158096551895,
-0.03810238465666771,
0.12856028974056244,
-0.003518170677125454,
-0.0027266035322099924,
0.04263519123196602,
-0.07405722886323929,
0.06224050372838974,
0.03224530816078186,
0.06507550925016403,
-0.08838954567909241,
-0.07766801118850708,
-0.03798297420144081,
0.007619286421686411,
-0.030212605372071266,
-0.029250554740428925,
0.02620057761669159,
0.053402043879032135,
-0.0241620484739542,
-0.1678251177072525,
-0.13773390650749207,
0.00567244365811348,
-0.03463026136159897,
-0.06034352630376816,
-0.09172911196947098,
-0.11689257621765137,
0.022298447787761688,
0.01672283187508583,
-0.07708073407411575,
-0.054256051778793335,
-0.059729307889938354,
0.04205011948943138,
-0.07092487066984177,
0.14112763106822968,
-0.11467497050762177,
0.04234171658754349,
-0.10254870355129242,
-0.04991884157061577,
-0.1932521015405655,
0.15436524152755737,
-0.053409744054079056,
0.12730517983436584,
-0.048516929149627686,
0.03334582969546318,
-0.08455422520637512,
0.00886745285242796,
-0.024523701518774033,
0.2663744390010834,
-0.1233612671494484,
-0.08157475292682648,
0.3245835304260254,
-0.0866551399230957,
-0.13101184368133545,
0.17166416347026825,
-0.00581461563706398,
0.06078298017382622,
0.17557644844055176,
0.18741080164909363,
-0.07020799070596695,
-0.009106408804655075,
-0.006908034905791283,
0.08536610007286072,
-0.1426268219947815,
0.015105406753718853,
-0.0069095678627491,
-0.009230674244463444,
-0.013791845180094242,
0.007894156500697136,
0.23989658057689667,
0.135564386844635,
-0.0794452428817749,
-0.030347473919391632,
0.012585587799549103,
-0.03232159465551376,
0.0846608355641365,
-0.027062173932790756,
0.09626356512308121,
-0.024425217881798744,
-0.0443769246339798,
-0.041578181087970734,
0.060682933777570724,
0.0039953915402293205,
0.011674324050545692,
-0.18290074169635773,
0.03165009990334511,
0.05584390088915825,
0.08656518161296844,
-0.12307853996753693,
-0.176169753074646,
-0.024073531851172447,
0.16490675508975983,
0.07964789122343063,
0.0825846791267395,
0.07539273798465729,
-0.08541291952133179,
0.017652546986937523,
0.04020996764302254,
0.17037948966026306,
-0.015576590783894062,
-0.10870205610990524,
-0.10396617650985718,
0.06813302636146545,
-0.0645066425204277,
0.226262167096138,
-0.04232753813266754,
0.019181568175554276,
0.07608156651258469,
0.15684202313423157,
0.011756951920688152,
0.015583735890686512,
0.06726869940757751,
-0.030799711123108864,
-0.01088039018213749,
-0.023927779868245125,
0.03538373112678528,
-0.01441163569688797,
-0.11656852066516876,
0.2350115180015564,
-0.10605599731206894,
0.06379605829715729,
0.18888333439826965,
-0.08889171481132507,
-0.012959163635969162,
-0.043133217841386795,
-0.046505268663167953,
0.004694156814366579,
0.062076911330223083,
-0.03991788998246193,
0.22515693306922913,
0.01117341686040163,
0.13349102437496185,
-0.007687731646001339,
-0.018896888941526413,
-0.014094335027039051,
-0.07590913772583008,
0.00629182206466794,
0.0627899020910263,
0.028147637844085693,
-0.15793262422084808,
0.10818427056074142,
0.013195234350860119,
0.0805935263633728,
0.2113124579191208,
0.0355629026889801,
0.0680663213133812,
0.020914744585752487,
0.0031901278998702765,
-0.06738251447677612,
-0.06983086466789246,
-0.321178674697876,
-0.030278725549578667,
0.049556665122509,
0.05528012290596962,
0.1350831389427185,
-0.042795684188604355,
-0.019094472751021385,
-0.03944716975092888,
-0.022730743512511253,
0.05761689692735672,
0.15371432900428772,
0.04195815697312355,
0.15377937257289886,
-0.006703630555421114,
-0.07485831528902054,
0.04169466346502304,
0.003349226899445057,
-0.11153942346572876,
0.1410476267337799,
-0.17577725648880005,
-0.3476906418800354,
-0.03124120645225048,
-0.130113884806633,
-0.06320375204086304,
0.039843443781137466,
0.08997023105621338,
-0.19903790950775146,
-0.0006238986970856786,
0.017847519367933273,
0.09509206563234329,
0.02413397654891014,
0.016077861189842224,
0.07153959572315216,
-0.06139683723449707,
-0.10136960446834564,
-0.11373108625411987,
-0.05264159291982651,
-0.06683648377656937,
-0.11798607558012009,
0.12459578365087509,
-0.16452273726463318,
0.017987580969929695,
0.21677619218826294,
0.04113182798027992,
0.06206201761960983,
-0.023023243993520737,
0.25566840171813965,
-0.10939715057611465,
0.0286073237657547,
0.17293484508991241,
0.01686146669089794,
0.02669895999133587,
0.10997503250837326,
-0.019446026533842087,
-0.12817075848579407,
0.06026293337345123,
-0.01656145602464676,
-0.07096251100301743,
-0.20454192161560059,
-0.21024994552135468,
-0.08096782863140106,
0.09014924615621567,
-0.018005413934588432,
0.06904570758342743,
0.1414383202791214,
0.02970438078045845,
-0.026131540536880493,
-0.0490809828042984,
0.09899041801691055,
0.051192380487918854,
0.17059782147407532,
-0.09738970547914505,
0.12161380052566528,
-0.0215446837246418,
-0.09003637731075287,
0.07071530818939209,
0.008625643327832222,
0.07919701933860779,
0.05664544552564621,
0.056681711226701736,
0.03477178141474724,
0.05184609070420265,
0.13479608297348022,
0.004874000791460276,
0.01784246601164341,
-0.08477703481912613,
0.005763623397797346,
-0.022313527762889862,
-0.09581872075796127,
0.007024643011391163,
0.05425998196005821,
-0.16158020496368408,
-0.028891153633594513,
-0.0017159533454105258,
0.10731203109025955,
0.04837334528565407,
0.09739865362644196,
-0.16217289865016937,
-0.09366218745708466,
0.044000640511512756,
-0.03462972491979599,
-0.07541224360466003,
0.07639168947935104,
0.06741161644458771,
-0.1383616030216217,
0.054750069975852966,
-0.011487936601042747,
0.09841711074113846,
-0.11301963031291962,
0.04922184348106384,
-0.1383095234632492,
-0.040402695536613464,
-0.005755018442869186,
0.07621902227401733,
-0.2630176544189453,
0.13395322859287262,
-0.03546271100640297,
-0.03840550407767296,
-0.1174018606543541,
-0.017455484718084335,
-0.0038811310660094023,
0.10694295167922974,
0.04950512945652008,
0.01964215375483036,
0.04950443282723427,
0.014236033894121647,
-0.10705028474330902,
0.03218303993344307,
0.010299173183739185,
-0.023959863930940628,
-0.05948719382286072,
0.02074790745973587,
-0.024005096405744553,
-0.018881697207689285,
-0.12932980060577393,
-0.03812210261821747,
-0.15508605539798737,
0.041080720722675323,
0.18053974211215973,
0.11493601649999619,
0.037637773901224136,
-0.018357248976826668,
-0.006955144926905632,
0.26158514618873596,
0.09794305264949799,
-0.13049659132957458,
-0.06943461298942566,
0.025171175599098206,
0.03627297282218933,
-0.06180146709084511,
-0.002944500418379903,
-0.03472563624382019,
0.05261370167136192,
-0.06326436251401901,
-0.17609041929244995,
0.10805560648441315,
-0.11141735315322876,
-0.029353909194469452,
-0.007894911803305149,
0.16547085344791412,
0.11611169576644897,
-0.0017626271583139896,
0.06191926822066307,
-0.04753774031996727,
-0.07226573675870895,
-0.09732077270746231,
-0.010933701880276203,
0.10964737087488174,
-0.03223278373479843,
0.11675872653722763,
0.05453915521502495,
-0.22489312291145325,
-0.07166748493909836,
-0.017499007284641266,
0.28386232256889343,
0.10852590948343277,
-0.03674788773059845,
0.17725762724876404,
0.11806492507457733,
-0.005635837558656931,
-0.21437020599842072,
-0.0862417072057724,
-0.056013092398643494,
-0.04754536226391792,
-0.08656063675880432,
-0.15293635427951813,
0.05119415372610092,
-0.03897181153297424,
-0.02094240114092827,
0.07458508759737015,
-0.3477265536785126,
-0.09044193476438522,
0.1824522167444229,
-0.06594786792993546,
0.38817891478538513,
-0.03777773678302765,
-0.0718904659152031,
-0.026183942332863808,
-0.15700702369213104,
0.1885446012020111,
-0.0038575793150812387,
0.05461110919713974,
0.008353380486369133,
0.2613135874271393,
0.04980030655860901,
0.015743695199489594,
0.051713887602090836,
0.08028706163167953,
-0.04876833036541939,
-0.1043628379702568,
-0.04317786917090416,
-0.0030902179423719645,
0.026795990765094757,
0.10658130794763565,
-0.04968138039112091,
0.0011742463102564216,
-0.14351791143417358,
-0.022059278562664986,
-0.1123029813170433,
0.034258145838975906,
0.048715740442276,
-0.01737889274954796,
-0.02300572209060192,
-0.04134640842676163,
-0.0317239984869957,
0.04755343124270439,
0.12839727103710175,
-0.07772476226091385,
0.17595504224300385,
0.0972408801317215,
0.1043165922164917,
-0.20370347797870636,
0.0401637889444828,
-0.014201825484633446,
-0.04263058304786682,
0.06336291879415512,
-0.07573235780000687,
0.0012032645754516125,
0.08475054055452347,
-0.06958045810461044,
0.11022278666496277,
0.05037990212440491,
-0.05388018116354942,
0.06973491609096527,
0.10324688255786896,
-0.22402584552764893,
-0.13858133554458618,
-0.0032234573736786842,
0.10255797952413559,
0.09593109786510468,
0.1335451602935791,
0.22795435786247253,
0.00013332384696695954,
-0.05698217451572418,
-0.014078579843044281,
0.06873737275600433,
-0.07360608130693436,
0.0630730614066124,
-0.08198481798171997,
0.012562907300889492,
-0.18125148117542267,
0.030582746490836143,
0.02556220255792141,
-0.07756127417087555,
0.08051129430532455,
0.1610197126865387,
-0.1543751358985901,
-0.12179239839315414,
-0.1505657136440277,
0.09805983304977417,
-0.04721866175532341,
0.0007801863248459995,
-0.005245056003332138,
-0.10755717009305954,
0.03880735859274864,
0.07905635237693787,
0.02388819307088852,
0.09664785116910934,
-0.050732966512441635,
-0.026460805907845497,
-0.028535734862089157,
-0.026952167972922325,
0.06222646310925484,
-0.060951024293899536,
-0.029420386999845505,
0.05857784301042557,
-0.01321637723594904,
0.11563537269830704,
-0.08114197105169296,
-0.1311916708946228,
-0.17651109397411346,
0.04375500977039337,
-0.12273527681827545,
-0.11150697618722916,
-0.12684613466262817,
-0.034054260700941086,
-0.026793459430336952,
-0.05424526333808899,
-0.020681774243712425,
-0.035995762795209885,
-0.09855540841817856,
0.05007615685462952,
-0.021738771349191666,
0.019113000482320786,
-0.11898110806941986,
0.047154951840639114,
0.023072753101587296,
0.00952209997922182,
0.22900183498859406,
0.23967400193214417,
-0.10501338541507721,
0.07970281690359116,
-0.13300393521785736,
-0.06916272640228271,
0.12155511975288391,
0.015302781015634537,
0.11593975126743317,
0.06520570814609528,
-0.012209351174533367,
0.07093607634305954,
0.06603305041790009,
0.07040399312973022,
0.09451892226934433,
-0.09665215760469437,
0.03132275491952896,
0.005647694226354361,
-0.054901495575904846,
-0.047769151628017426,
-0.008121064864099026,
0.06017419323325157,
0.06088278442621231,
0.08712935447692871,
-0.06329122185707092,
0.047178879380226135,
-0.06907396763563156,
0.0009565642685629427,
0.020402390509843826,
-0.10111594200134277,
0.05090409144759178,
-0.05048583075404167,
0.04585586115717888,
-0.0038993079215288162,
0.11841075867414474,
0.05489125847816467,
-0.028396034613251686,
0.03150009363889694,
0.07694239914417267,
-0.04186859354376793,
-0.011592810042202473,
-0.006134168244898319,
0.054798588156700134,
-0.043219733983278275,
-0.05355124920606613,
0.04504793509840965,
0.03471345081925392,
-0.05936227738857269,
0.11973360180854797,
-0.09357963502407074,
0.002641031751409173,
0.041799262166023254,
0.028395522385835648,
0.00043721101246774197,
-0.1502877026796341,
-0.14501814544200897,
-0.2070295363664627,
0.04020696133375168,
-0.0972619280219078,
0.06980330497026443,
0.0823310986161232,
0.020472751930356026,
-0.0072508519515395164,
-0.015675708651542664,
-0.054576434195041656,
-0.1250603049993515,
-0.17620421946048737,
-0.05234035849571228,
-0.19658192992210388,
0.009567178785800934,
-0.10002416372299194,
0.051153987646102905,
-0.018896259367465973,
0.08556535094976425,
-0.05606051906943321,
0.11778818815946579,
-0.009087399579584599,
-0.07489084452390671,
0.03840646520256996,
-0.04640626534819603,
0.03561873733997345,
-0.02484917640686035,
0.022134236991405487,
-0.045144930481910706,
0.08696575462818146,
0.020124923437833786,
0.06914480775594711,
-0.037223152816295624,
0.05012250691652298,
-0.1036195158958435,
-0.07003312557935715,
-0.0553266704082489,
0.06647614389657974,
-0.02439258247613907,
0.09582629799842834,
0.08311638981103897,
0.0007597408257424831,
0.01473532896488905,
0.25121182203292847,
-0.05404425039887428,
-0.10183586925268173,
-0.13247138261795044,
0.2002287209033966,
-0.03368125855922699,
0.0583178848028183,
-0.055855244398117065,
-0.00426106620579958,
-0.15232570469379425,
0.2420482039451599,
0.29523324966430664,
-0.08089511096477509,
0.013313311152160168,
-0.08143246918916702,
0.03359610587358475,
0.013623652048408985,
0.10647078603506088,
0.14291693270206451,
0.3189051151275635,
-0.012403322383761406,
0.07417552173137665,
0.00009350610343972221,
-0.060927875339984894,
-0.05881281569600105,
-0.05669590085744858,
-0.02383190393447876,
-0.006449878215789795,
-0.0066023580729961395,
0.1395004689693451,
-0.26645973324775696,
0.016834815964102745,
-0.18420034646987915,
-0.20412065088748932,
-0.08061134070158005,
0.018009180203080177,
0.12304088473320007,
0.044550321996212006,
0.1389375925064087,
0.00005718287866329774,
-0.06459779292345047,
0.1276867389678955,
-0.006128685083240271,
-0.15249358117580414,
-0.10721327364444733,
0.11934854090213776,
-0.12627308070659637,
-0.04108234494924545,
-0.041037049144506454,
0.09749634563922882,
0.08271387219429016,
0.03395570069551468,
-0.07119182497262955,
0.01351496484130621,
-0.03403165191411972,
-0.021414151415228844,
-0.005451269913464785,
0.06813238561153412,
0.02072959579527378,
0.025263439863920212,
0.05354813113808632,
-0.18906964361667633,
0.01142108254134655,
-0.08744467794895172,
0.07205761224031448,
-0.032799381762742996,
0.07564347237348557,
-0.03142198547720909,
0.06004147604107857,
0.0745621845126152,
-0.055871736258268356,
0.03904830664396286,
0.04052802547812462,
-0.020588697865605354,
-0.04159161075949669,
-0.061948951333761215,
-0.1403578817844391,
-0.22841989994049072,
-0.11440495401620865,
0.09922145307064056,
0.029091045260429382,
-0.0986558049917221,
0.048632487654685974,
-0.15799014270305634,
0.05539434030652046,
-0.07917378842830658,
0.11045927554368973,
0.08010590076446533,
0.027341295033693314,
0.0007489612326025963,
0.05805785581469536,
0.06042911112308502,
0.09924639761447906,
-0.11275497078895569,
-0.08823683857917786
] |
null | null | transformers |
# DialoGPT chat bot model using discord messages as data | {"tags": ["conversational"]} | text-generation | S34NtheGuy/DialoGPT-medium-Mona | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# DialoGPT chat bot model using discord messages as data | [
"# DialoGPT chat bot model using discord messages as data"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# DialoGPT chat bot model using discord messages as data"
] | [
51,
13
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# DialoGPT chat bot model using discord messages as data"
] | [
-0.02911895327270031,
0.003995177801698446,
-0.0046684942208230495,
-0.012396533973515034,
0.11202579736709595,
-0.020793797448277473,
0.19453662633895874,
0.0904533639550209,
0.11296340823173523,
-0.04555835947394371,
0.05478145554661751,
0.14494860172271729,
0.02601793222129345,
0.2345304638147354,
-0.07001583278179169,
-0.2198491394519806,
0.06904508173465729,
0.0020564962178468704,
0.07120723277330399,
0.12327651679515839,
0.0893358588218689,
-0.027994820848107338,
0.09009858965873718,
0.016869334504008293,
-0.19000788033008575,
-0.005467485636472702,
0.01664290763437748,
-0.08414720743894577,
0.0951712355017662,
0.05692970007658005,
0.027007173746824265,
0.022588100284337997,
-0.054826367646455765,
-0.07715894281864166,
0.04409089311957359,
-0.015045171603560448,
-0.03291803225874901,
0.042206596583127975,
-0.06107683107256889,
-0.10459832847118378,
0.18928758800029755,
0.11292500793933868,
0.02833588793873787,
0.07569966465234756,
-0.1312641203403473,
0.011802727356553078,
-0.04876156896352768,
0.02216631919145584,
0.14013226330280304,
0.12497154623270035,
-0.07691969722509384,
0.16699260473251343,
-0.07476947456598282,
0.10180847346782684,
0.05837265029549599,
-0.3295818567276001,
-0.010758930817246437,
0.12642543017864227,
0.030477117747068405,
0.09977107495069504,
-0.018700912594795227,
0.04719923064112663,
0.011075117625296116,
0.027997151017189026,
-0.0939258560538292,
-0.03323792293667793,
-0.14734824001789093,
-0.03544525429606438,
-0.09491617977619171,
-0.024447239935398102,
0.269292950630188,
0.005437672603875399,
0.05558373034000397,
-0.11964231729507446,
-0.07344776391983032,
-0.07583962380886078,
-0.07977471500635147,
-0.021205732598900795,
-0.11365412920713425,
0.06104221194982529,
-0.011485567316412926,
-0.08311361074447632,
-0.11278795450925827,
-0.043469205498695374,
-0.18289406597614288,
0.0821399837732315,
0.0335225835442543,
0.07892793416976929,
-0.2801816165447235,
0.08420786261558533,
0.04518672078847885,
-0.0835626944899559,
0.019823530688881874,
-0.08192607015371323,
-0.04286707937717438,
-0.006278423126786947,
-0.03693044185638428,
-0.0884459912776947,
0.10716810822486877,
0.1580084264278412,
-0.05083105340600014,
0.04353693500161171,
-0.07717008143663406,
0.0446944534778595,
0.10556173324584961,
0.03715191408991814,
-0.00912248995155096,
0.006465920712798834,
0.05436272174119949,
-0.11213027685880661,
0.01778799295425415,
-0.03295382484793663,
-0.1692671924829483,
0.01611638441681862,
0.02191336080431938,
0.08422008901834488,
0.042106952518224716,
0.13374918699264526,
-0.045255158096551895,
-0.03810238465666771,
0.12856028974056244,
-0.003518170677125454,
-0.0027266035322099924,
0.04263519123196602,
-0.07405722886323929,
0.06224050372838974,
0.03224530816078186,
0.06507550925016403,
-0.08838954567909241,
-0.07766801118850708,
-0.03798297420144081,
0.007619286421686411,
-0.030212605372071266,
-0.029250554740428925,
0.02620057761669159,
0.053402043879032135,
-0.0241620484739542,
-0.1678251177072525,
-0.13773390650749207,
0.00567244365811348,
-0.03463026136159897,
-0.06034352630376816,
-0.09172911196947098,
-0.11689257621765137,
0.022298447787761688,
0.01672283187508583,
-0.07708073407411575,
-0.054256051778793335,
-0.059729307889938354,
0.04205011948943138,
-0.07092487066984177,
0.14112763106822968,
-0.11467497050762177,
0.04234171658754349,
-0.10254870355129242,
-0.04991884157061577,
-0.1932521015405655,
0.15436524152755737,
-0.053409744054079056,
0.12730517983436584,
-0.048516929149627686,
0.03334582969546318,
-0.08455422520637512,
0.00886745285242796,
-0.024523701518774033,
0.2663744390010834,
-0.1233612671494484,
-0.08157475292682648,
0.3245835304260254,
-0.0866551399230957,
-0.13101184368133545,
0.17166416347026825,
-0.00581461563706398,
0.06078298017382622,
0.17557644844055176,
0.18741080164909363,
-0.07020799070596695,
-0.009106408804655075,
-0.006908034905791283,
0.08536610007286072,
-0.1426268219947815,
0.015105406753718853,
-0.0069095678627491,
-0.009230674244463444,
-0.013791845180094242,
0.007894156500697136,
0.23989658057689667,
0.135564386844635,
-0.0794452428817749,
-0.030347473919391632,
0.012585587799549103,
-0.03232159465551376,
0.0846608355641365,
-0.027062173932790756,
0.09626356512308121,
-0.024425217881798744,
-0.0443769246339798,
-0.041578181087970734,
0.060682933777570724,
0.0039953915402293205,
0.011674324050545692,
-0.18290074169635773,
0.03165009990334511,
0.05584390088915825,
0.08656518161296844,
-0.12307853996753693,
-0.176169753074646,
-0.024073531851172447,
0.16490675508975983,
0.07964789122343063,
0.0825846791267395,
0.07539273798465729,
-0.08541291952133179,
0.017652546986937523,
0.04020996764302254,
0.17037948966026306,
-0.015576590783894062,
-0.10870205610990524,
-0.10396617650985718,
0.06813302636146545,
-0.0645066425204277,
0.226262167096138,
-0.04232753813266754,
0.019181568175554276,
0.07608156651258469,
0.15684202313423157,
0.011756951920688152,
0.015583735890686512,
0.06726869940757751,
-0.030799711123108864,
-0.01088039018213749,
-0.023927779868245125,
0.03538373112678528,
-0.01441163569688797,
-0.11656852066516876,
0.2350115180015564,
-0.10605599731206894,
0.06379605829715729,
0.18888333439826965,
-0.08889171481132507,
-0.012959163635969162,
-0.043133217841386795,
-0.046505268663167953,
0.004694156814366579,
0.062076911330223083,
-0.03991788998246193,
0.22515693306922913,
0.01117341686040163,
0.13349102437496185,
-0.007687731646001339,
-0.018896888941526413,
-0.014094335027039051,
-0.07590913772583008,
0.00629182206466794,
0.0627899020910263,
0.028147637844085693,
-0.15793262422084808,
0.10818427056074142,
0.013195234350860119,
0.0805935263633728,
0.2113124579191208,
0.0355629026889801,
0.0680663213133812,
0.020914744585752487,
0.0031901278998702765,
-0.06738251447677612,
-0.06983086466789246,
-0.321178674697876,
-0.030278725549578667,
0.049556665122509,
0.05528012290596962,
0.1350831389427185,
-0.042795684188604355,
-0.019094472751021385,
-0.03944716975092888,
-0.022730743512511253,
0.05761689692735672,
0.15371432900428772,
0.04195815697312355,
0.15377937257289886,
-0.006703630555421114,
-0.07485831528902054,
0.04169466346502304,
0.003349226899445057,
-0.11153942346572876,
0.1410476267337799,
-0.17577725648880005,
-0.3476906418800354,
-0.03124120645225048,
-0.130113884806633,
-0.06320375204086304,
0.039843443781137466,
0.08997023105621338,
-0.19903790950775146,
-0.0006238986970856786,
0.017847519367933273,
0.09509206563234329,
0.02413397654891014,
0.016077861189842224,
0.07153959572315216,
-0.06139683723449707,
-0.10136960446834564,
-0.11373108625411987,
-0.05264159291982651,
-0.06683648377656937,
-0.11798607558012009,
0.12459578365087509,
-0.16452273726463318,
0.017987580969929695,
0.21677619218826294,
0.04113182798027992,
0.06206201761960983,
-0.023023243993520737,
0.25566840171813965,
-0.10939715057611465,
0.0286073237657547,
0.17293484508991241,
0.01686146669089794,
0.02669895999133587,
0.10997503250837326,
-0.019446026533842087,
-0.12817075848579407,
0.06026293337345123,
-0.01656145602464676,
-0.07096251100301743,
-0.20454192161560059,
-0.21024994552135468,
-0.08096782863140106,
0.09014924615621567,
-0.018005413934588432,
0.06904570758342743,
0.1414383202791214,
0.02970438078045845,
-0.026131540536880493,
-0.0490809828042984,
0.09899041801691055,
0.051192380487918854,
0.17059782147407532,
-0.09738970547914505,
0.12161380052566528,
-0.0215446837246418,
-0.09003637731075287,
0.07071530818939209,
0.008625643327832222,
0.07919701933860779,
0.05664544552564621,
0.056681711226701736,
0.03477178141474724,
0.05184609070420265,
0.13479608297348022,
0.004874000791460276,
0.01784246601164341,
-0.08477703481912613,
0.005763623397797346,
-0.022313527762889862,
-0.09581872075796127,
0.007024643011391163,
0.05425998196005821,
-0.16158020496368408,
-0.028891153633594513,
-0.0017159533454105258,
0.10731203109025955,
0.04837334528565407,
0.09739865362644196,
-0.16217289865016937,
-0.09366218745708466,
0.044000640511512756,
-0.03462972491979599,
-0.07541224360466003,
0.07639168947935104,
0.06741161644458771,
-0.1383616030216217,
0.054750069975852966,
-0.011487936601042747,
0.09841711074113846,
-0.11301963031291962,
0.04922184348106384,
-0.1383095234632492,
-0.040402695536613464,
-0.005755018442869186,
0.07621902227401733,
-0.2630176544189453,
0.13395322859287262,
-0.03546271100640297,
-0.03840550407767296,
-0.1174018606543541,
-0.017455484718084335,
-0.0038811310660094023,
0.10694295167922974,
0.04950512945652008,
0.01964215375483036,
0.04950443282723427,
0.014236033894121647,
-0.10705028474330902,
0.03218303993344307,
0.010299173183739185,
-0.023959863930940628,
-0.05948719382286072,
0.02074790745973587,
-0.024005096405744553,
-0.018881697207689285,
-0.12932980060577393,
-0.03812210261821747,
-0.15508605539798737,
0.041080720722675323,
0.18053974211215973,
0.11493601649999619,
0.037637773901224136,
-0.018357248976826668,
-0.006955144926905632,
0.26158514618873596,
0.09794305264949799,
-0.13049659132957458,
-0.06943461298942566,
0.025171175599098206,
0.03627297282218933,
-0.06180146709084511,
-0.002944500418379903,
-0.03472563624382019,
0.05261370167136192,
-0.06326436251401901,
-0.17609041929244995,
0.10805560648441315,
-0.11141735315322876,
-0.029353909194469452,
-0.007894911803305149,
0.16547085344791412,
0.11611169576644897,
-0.0017626271583139896,
0.06191926822066307,
-0.04753774031996727,
-0.07226573675870895,
-0.09732077270746231,
-0.010933701880276203,
0.10964737087488174,
-0.03223278373479843,
0.11675872653722763,
0.05453915521502495,
-0.22489312291145325,
-0.07166748493909836,
-0.017499007284641266,
0.28386232256889343,
0.10852590948343277,
-0.03674788773059845,
0.17725762724876404,
0.11806492507457733,
-0.005635837558656931,
-0.21437020599842072,
-0.0862417072057724,
-0.056013092398643494,
-0.04754536226391792,
-0.08656063675880432,
-0.15293635427951813,
0.05119415372610092,
-0.03897181153297424,
-0.02094240114092827,
0.07458508759737015,
-0.3477265536785126,
-0.09044193476438522,
0.1824522167444229,
-0.06594786792993546,
0.38817891478538513,
-0.03777773678302765,
-0.0718904659152031,
-0.026183942332863808,
-0.15700702369213104,
0.1885446012020111,
-0.0038575793150812387,
0.05461110919713974,
0.008353380486369133,
0.2613135874271393,
0.04980030655860901,
0.015743695199489594,
0.051713887602090836,
0.08028706163167953,
-0.04876833036541939,
-0.1043628379702568,
-0.04317786917090416,
-0.0030902179423719645,
0.026795990765094757,
0.10658130794763565,
-0.04968138039112091,
0.0011742463102564216,
-0.14351791143417358,
-0.022059278562664986,
-0.1123029813170433,
0.034258145838975906,
0.048715740442276,
-0.01737889274954796,
-0.02300572209060192,
-0.04134640842676163,
-0.0317239984869957,
0.04755343124270439,
0.12839727103710175,
-0.07772476226091385,
0.17595504224300385,
0.0972408801317215,
0.1043165922164917,
-0.20370347797870636,
0.0401637889444828,
-0.014201825484633446,
-0.04263058304786682,
0.06336291879415512,
-0.07573235780000687,
0.0012032645754516125,
0.08475054055452347,
-0.06958045810461044,
0.11022278666496277,
0.05037990212440491,
-0.05388018116354942,
0.06973491609096527,
0.10324688255786896,
-0.22402584552764893,
-0.13858133554458618,
-0.0032234573736786842,
0.10255797952413559,
0.09593109786510468,
0.1335451602935791,
0.22795435786247253,
0.00013332384696695954,
-0.05698217451572418,
-0.014078579843044281,
0.06873737275600433,
-0.07360608130693436,
0.0630730614066124,
-0.08198481798171997,
0.012562907300889492,
-0.18125148117542267,
0.030582746490836143,
0.02556220255792141,
-0.07756127417087555,
0.08051129430532455,
0.1610197126865387,
-0.1543751358985901,
-0.12179239839315414,
-0.1505657136440277,
0.09805983304977417,
-0.04721866175532341,
0.0007801863248459995,
-0.005245056003332138,
-0.10755717009305954,
0.03880735859274864,
0.07905635237693787,
0.02388819307088852,
0.09664785116910934,
-0.050732966512441635,
-0.026460805907845497,
-0.028535734862089157,
-0.026952167972922325,
0.06222646310925484,
-0.060951024293899536,
-0.029420386999845505,
0.05857784301042557,
-0.01321637723594904,
0.11563537269830704,
-0.08114197105169296,
-0.1311916708946228,
-0.17651109397411346,
0.04375500977039337,
-0.12273527681827545,
-0.11150697618722916,
-0.12684613466262817,
-0.034054260700941086,
-0.026793459430336952,
-0.05424526333808899,
-0.020681774243712425,
-0.035995762795209885,
-0.09855540841817856,
0.05007615685462952,
-0.021738771349191666,
0.019113000482320786,
-0.11898110806941986,
0.047154951840639114,
0.023072753101587296,
0.00952209997922182,
0.22900183498859406,
0.23967400193214417,
-0.10501338541507721,
0.07970281690359116,
-0.13300393521785736,
-0.06916272640228271,
0.12155511975288391,
0.015302781015634537,
0.11593975126743317,
0.06520570814609528,
-0.012209351174533367,
0.07093607634305954,
0.06603305041790009,
0.07040399312973022,
0.09451892226934433,
-0.09665215760469437,
0.03132275491952896,
0.005647694226354361,
-0.054901495575904846,
-0.047769151628017426,
-0.008121064864099026,
0.06017419323325157,
0.06088278442621231,
0.08712935447692871,
-0.06329122185707092,
0.047178879380226135,
-0.06907396763563156,
0.0009565642685629427,
0.020402390509843826,
-0.10111594200134277,
0.05090409144759178,
-0.05048583075404167,
0.04585586115717888,
-0.0038993079215288162,
0.11841075867414474,
0.05489125847816467,
-0.028396034613251686,
0.03150009363889694,
0.07694239914417267,
-0.04186859354376793,
-0.011592810042202473,
-0.006134168244898319,
0.054798588156700134,
-0.043219733983278275,
-0.05355124920606613,
0.04504793509840965,
0.03471345081925392,
-0.05936227738857269,
0.11973360180854797,
-0.09357963502407074,
0.002641031751409173,
0.041799262166023254,
0.028395522385835648,
0.00043721101246774197,
-0.1502877026796341,
-0.14501814544200897,
-0.2070295363664627,
0.04020696133375168,
-0.0972619280219078,
0.06980330497026443,
0.0823310986161232,
0.020472751930356026,
-0.0072508519515395164,
-0.015675708651542664,
-0.054576434195041656,
-0.1250603049993515,
-0.17620421946048737,
-0.05234035849571228,
-0.19658192992210388,
0.009567178785800934,
-0.10002416372299194,
0.051153987646102905,
-0.018896259367465973,
0.08556535094976425,
-0.05606051906943321,
0.11778818815946579,
-0.009087399579584599,
-0.07489084452390671,
0.03840646520256996,
-0.04640626534819603,
0.03561873733997345,
-0.02484917640686035,
0.022134236991405487,
-0.045144930481910706,
0.08696575462818146,
0.020124923437833786,
0.06914480775594711,
-0.037223152816295624,
0.05012250691652298,
-0.1036195158958435,
-0.07003312557935715,
-0.0553266704082489,
0.06647614389657974,
-0.02439258247613907,
0.09582629799842834,
0.08311638981103897,
0.0007597408257424831,
0.01473532896488905,
0.25121182203292847,
-0.05404425039887428,
-0.10183586925268173,
-0.13247138261795044,
0.2002287209033966,
-0.03368125855922699,
0.0583178848028183,
-0.055855244398117065,
-0.00426106620579958,
-0.15232570469379425,
0.2420482039451599,
0.29523324966430664,
-0.08089511096477509,
0.013313311152160168,
-0.08143246918916702,
0.03359610587358475,
0.013623652048408985,
0.10647078603506088,
0.14291693270206451,
0.3189051151275635,
-0.012403322383761406,
0.07417552173137665,
0.00009350610343972221,
-0.060927875339984894,
-0.05881281569600105,
-0.05669590085744858,
-0.02383190393447876,
-0.006449878215789795,
-0.0066023580729961395,
0.1395004689693451,
-0.26645973324775696,
0.016834815964102745,
-0.18420034646987915,
-0.20412065088748932,
-0.08061134070158005,
0.018009180203080177,
0.12304088473320007,
0.044550321996212006,
0.1389375925064087,
0.00005718287866329774,
-0.06459779292345047,
0.1276867389678955,
-0.006128685083240271,
-0.15249358117580414,
-0.10721327364444733,
0.11934854090213776,
-0.12627308070659637,
-0.04108234494924545,
-0.041037049144506454,
0.09749634563922882,
0.08271387219429016,
0.03395570069551468,
-0.07119182497262955,
0.01351496484130621,
-0.03403165191411972,
-0.021414151415228844,
-0.005451269913464785,
0.06813238561153412,
0.02072959579527378,
0.025263439863920212,
0.05354813113808632,
-0.18906964361667633,
0.01142108254134655,
-0.08744467794895172,
0.07205761224031448,
-0.032799381762742996,
0.07564347237348557,
-0.03142198547720909,
0.06004147604107857,
0.0745621845126152,
-0.055871736258268356,
0.03904830664396286,
0.04052802547812462,
-0.020588697865605354,
-0.04159161075949669,
-0.061948951333761215,
-0.1403578817844391,
-0.22841989994049072,
-0.11440495401620865,
0.09922145307064056,
0.029091045260429382,
-0.0986558049917221,
0.048632487654685974,
-0.15799014270305634,
0.05539434030652046,
-0.07917378842830658,
0.11045927554368973,
0.08010590076446533,
0.027341295033693314,
0.0007489612326025963,
0.05805785581469536,
0.06042911112308502,
0.09924639761447906,
-0.11275497078895569,
-0.08823683857917786
] |
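The row above records S34NtheGuy/DialoGPT-medium-Mona, a GPT-2-based conversational checkpoint, but its card text carries no usage snippet. The following is a minimal sketch of how such a model is typically loaded through the transformers library; the model ID comes from the row itself, while the generation settings (max_length, EOS-token padding) are illustrative assumptions rather than values documented by the author.

    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Model ID as recorded in the dataset row above.
    model_id = "S34NtheGuy/DialoGPT-medium-Mona"

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    # DialoGPT-style models expect each turn to end with the EOS token.
    input_ids = tokenizer.encode("Hello, how are you?" + tokenizer.eos_token,
                                 return_tensors="pt")

    # max_length and pad_token_id are illustrative defaults, not settings
    # documented for this checkpoint.
    reply_ids = model.generate(input_ids,
                               max_length=200,
                               pad_token_id=tokenizer.eos_token_id)

    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(reply_ids[:, input_ids.shape[-1]:][0],
                           skip_special_tokens=True))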
null | null | transformers |
# DialoGPT chat bot model using discord messages as data | {"tags": ["conversational"]} | text-generation | S34NtheGuy/DialoGPT-small-Harry282 | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# DialoGPT chat bot model using discord messages as data | [
"# DialoGPT chat bot model using discord messages as data"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# DialoGPT chat bot model using discord messages as data"
] | [
51,
13
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# DialoGPT chat bot model using discord messages as data"
] | [
-0.02911895327270031,
0.003995177801698446,
-0.0046684942208230495,
-0.012396533973515034,
0.11202579736709595,
-0.020793797448277473,
0.19453662633895874,
0.0904533639550209,
0.11296340823173523,
-0.04555835947394371,
0.05478145554661751,
0.14494860172271729,
0.02601793222129345,
0.2345304638147354,
-0.07001583278179169,
-0.2198491394519806,
0.06904508173465729,
0.0020564962178468704,
0.07120723277330399,
0.12327651679515839,
0.0893358588218689,
-0.027994820848107338,
0.09009858965873718,
0.016869334504008293,
-0.19000788033008575,
-0.005467485636472702,
0.01664290763437748,
-0.08414720743894577,
0.0951712355017662,
0.05692970007658005,
0.027007173746824265,
0.022588100284337997,
-0.054826367646455765,
-0.07715894281864166,
0.04409089311957359,
-0.015045171603560448,
-0.03291803225874901,
0.042206596583127975,
-0.06107683107256889,
-0.10459832847118378,
0.18928758800029755,
0.11292500793933868,
0.02833588793873787,
0.07569966465234756,
-0.1312641203403473,
0.011802727356553078,
-0.04876156896352768,
0.02216631919145584,
0.14013226330280304,
0.12497154623270035,
-0.07691969722509384,
0.16699260473251343,
-0.07476947456598282,
0.10180847346782684,
0.05837265029549599,
-0.3295818567276001,
-0.010758930817246437,
0.12642543017864227,
0.030477117747068405,
0.09977107495069504,
-0.018700912594795227,
0.04719923064112663,
0.011075117625296116,
0.027997151017189026,
-0.0939258560538292,
-0.03323792293667793,
-0.14734824001789093,
-0.03544525429606438,
-0.09491617977619171,
-0.024447239935398102,
0.269292950630188,
0.005437672603875399,
0.05558373034000397,
-0.11964231729507446,
-0.07344776391983032,
-0.07583962380886078,
-0.07977471500635147,
-0.021205732598900795,
-0.11365412920713425,
0.06104221194982529,
-0.011485567316412926,
-0.08311361074447632,
-0.11278795450925827,
-0.043469205498695374,
-0.18289406597614288,
0.0821399837732315,
0.0335225835442543,
0.07892793416976929,
-0.2801816165447235,
0.08420786261558533,
0.04518672078847885,
-0.0835626944899559,
0.019823530688881874,
-0.08192607015371323,
-0.04286707937717438,
-0.006278423126786947,
-0.03693044185638428,
-0.0884459912776947,
0.10716810822486877,
0.1580084264278412,
-0.05083105340600014,
0.04353693500161171,
-0.07717008143663406,
0.0446944534778595,
0.10556173324584961,
0.03715191408991814,
-0.00912248995155096,
0.006465920712798834,
0.05436272174119949,
-0.11213027685880661,
0.01778799295425415,
-0.03295382484793663,
-0.1692671924829483,
0.01611638441681862,
0.02191336080431938,
0.08422008901834488,
0.042106952518224716,
0.13374918699264526,
-0.045255158096551895,
-0.03810238465666771,
0.12856028974056244,
-0.003518170677125454,
-0.0027266035322099924,
0.04263519123196602,
-0.07405722886323929,
0.06224050372838974,
0.03224530816078186,
0.06507550925016403,
-0.08838954567909241,
-0.07766801118850708,
-0.03798297420144081,
0.007619286421686411,
-0.030212605372071266,
-0.029250554740428925,
0.02620057761669159,
0.053402043879032135,
-0.0241620484739542,
-0.1678251177072525,
-0.13773390650749207,
0.00567244365811348,
-0.03463026136159897,
-0.06034352630376816,
-0.09172911196947098,
-0.11689257621765137,
0.022298447787761688,
0.01672283187508583,
-0.07708073407411575,
-0.054256051778793335,
-0.059729307889938354,
0.04205011948943138,
-0.07092487066984177,
0.14112763106822968,
-0.11467497050762177,
0.04234171658754349,
-0.10254870355129242,
-0.04991884157061577,
-0.1932521015405655,
0.15436524152755737,
-0.053409744054079056,
0.12730517983436584,
-0.048516929149627686,
0.03334582969546318,
-0.08455422520637512,
0.00886745285242796,
-0.024523701518774033,
0.2663744390010834,
-0.1233612671494484,
-0.08157475292682648,
0.3245835304260254,
-0.0866551399230957,
-0.13101184368133545,
0.17166416347026825,
-0.00581461563706398,
0.06078298017382622,
0.17557644844055176,
0.18741080164909363,
-0.07020799070596695,
-0.009106408804655075,
-0.006908034905791283,
0.08536610007286072,
-0.1426268219947815,
0.015105406753718853,
-0.0069095678627491,
-0.009230674244463444,
-0.013791845180094242,
0.007894156500697136,
0.23989658057689667,
0.135564386844635,
-0.0794452428817749,
-0.030347473919391632,
0.012585587799549103,
-0.03232159465551376,
0.0846608355641365,
-0.027062173932790756,
0.09626356512308121,
-0.024425217881798744,
-0.0443769246339798,
-0.041578181087970734,
0.060682933777570724,
0.0039953915402293205,
0.011674324050545692,
-0.18290074169635773,
0.03165009990334511,
0.05584390088915825,
0.08656518161296844,
-0.12307853996753693,
-0.176169753074646,
-0.024073531851172447,
0.16490675508975983,
0.07964789122343063,
0.0825846791267395,
0.07539273798465729,
-0.08541291952133179,
0.017652546986937523,
0.04020996764302254,
0.17037948966026306,
-0.015576590783894062,
-0.10870205610990524,
-0.10396617650985718,
0.06813302636146545,
-0.0645066425204277,
0.226262167096138,
-0.04232753813266754,
0.019181568175554276,
0.07608156651258469,
0.15684202313423157,
0.011756951920688152,
0.015583735890686512,
0.06726869940757751,
-0.030799711123108864,
-0.01088039018213749,
-0.023927779868245125,
0.03538373112678528,
-0.01441163569688797,
-0.11656852066516876,
0.2350115180015564,
-0.10605599731206894,
0.06379605829715729,
0.18888333439826965,
-0.08889171481132507,
-0.012959163635969162,
-0.043133217841386795,
-0.046505268663167953,
0.004694156814366579,
0.062076911330223083,
-0.03991788998246193,
0.22515693306922913,
0.01117341686040163,
0.13349102437496185,
-0.007687731646001339,
-0.018896888941526413,
-0.014094335027039051,
-0.07590913772583008,
0.00629182206466794,
0.0627899020910263,
0.028147637844085693,
-0.15793262422084808,
0.10818427056074142,
0.013195234350860119,
0.0805935263633728,
0.2113124579191208,
0.0355629026889801,
0.0680663213133812,
0.020914744585752487,
0.0031901278998702765,
-0.06738251447677612,
-0.06983086466789246,
-0.321178674697876,
-0.030278725549578667,
0.049556665122509,
0.05528012290596962,
0.1350831389427185,
-0.042795684188604355,
-0.019094472751021385,
-0.03944716975092888,
-0.022730743512511253,
0.05761689692735672,
0.15371432900428772,
0.04195815697312355,
0.15377937257289886,
-0.006703630555421114,
-0.07485831528902054,
0.04169466346502304,
0.003349226899445057,
-0.11153942346572876,
0.1410476267337799,
-0.17577725648880005,
-0.3476906418800354,
-0.03124120645225048,
-0.130113884806633,
-0.06320375204086304,
0.039843443781137466,
0.08997023105621338,
-0.19903790950775146,
-0.0006238986970856786,
0.017847519367933273,
0.09509206563234329,
0.02413397654891014,
0.016077861189842224,
0.07153959572315216,
-0.06139683723449707,
-0.10136960446834564,
-0.11373108625411987,
-0.05264159291982651,
-0.06683648377656937,
-0.11798607558012009,
0.12459578365087509,
-0.16452273726463318,
0.017987580969929695,
0.21677619218826294,
0.04113182798027992,
0.06206201761960983,
-0.023023243993520737,
0.25566840171813965,
-0.10939715057611465,
0.0286073237657547,
0.17293484508991241,
0.01686146669089794,
0.02669895999133587,
0.10997503250837326,
-0.019446026533842087,
-0.12817075848579407,
0.06026293337345123,
-0.01656145602464676,
-0.07096251100301743,
-0.20454192161560059,
-0.21024994552135468,
-0.08096782863140106,
0.09014924615621567,
-0.018005413934588432,
0.06904570758342743,
0.1414383202791214,
0.02970438078045845,
-0.026131540536880493,
-0.0490809828042984,
0.09899041801691055,
0.051192380487918854,
0.17059782147407532,
-0.09738970547914505,
0.12161380052566528,
-0.0215446837246418,
-0.09003637731075287,
0.07071530818939209,
0.008625643327832222,
0.07919701933860779,
0.05664544552564621,
0.056681711226701736,
0.03477178141474724,
0.05184609070420265,
0.13479608297348022,
0.004874000791460276,
0.01784246601164341,
-0.08477703481912613,
0.005763623397797346,
-0.022313527762889862,
-0.09581872075796127,
0.007024643011391163,
0.05425998196005821,
-0.16158020496368408,
-0.028891153633594513,
-0.0017159533454105258,
0.10731203109025955,
0.04837334528565407,
0.09739865362644196,
-0.16217289865016937,
-0.09366218745708466,
0.044000640511512756,
-0.03462972491979599,
-0.07541224360466003,
0.07639168947935104,
0.06741161644458771,
-0.1383616030216217,
0.054750069975852966,
-0.011487936601042747,
0.09841711074113846,
-0.11301963031291962,
0.04922184348106384,
-0.1383095234632492,
-0.040402695536613464,
-0.005755018442869186,
0.07621902227401733,
-0.2630176544189453,
0.13395322859287262,
-0.03546271100640297,
-0.03840550407767296,
-0.1174018606543541,
-0.017455484718084335,
-0.0038811310660094023,
0.10694295167922974,
0.04950512945652008,
0.01964215375483036,
0.04950443282723427,
0.014236033894121647,
-0.10705028474330902,
0.03218303993344307,
0.010299173183739185,
-0.023959863930940628,
-0.05948719382286072,
0.02074790745973587,
-0.024005096405744553,
-0.018881697207689285,
-0.12932980060577393,
-0.03812210261821747,
-0.15508605539798737,
0.041080720722675323,
0.18053974211215973,
0.11493601649999619,
0.037637773901224136,
-0.018357248976826668,
-0.006955144926905632,
0.26158514618873596,
0.09794305264949799,
-0.13049659132957458,
-0.06943461298942566,
0.025171175599098206,
0.03627297282218933,
-0.06180146709084511,
-0.002944500418379903,
-0.03472563624382019,
0.05261370167136192,
-0.06326436251401901,
-0.17609041929244995,
0.10805560648441315,
-0.11141735315322876,
-0.029353909194469452,
-0.007894911803305149,
0.16547085344791412,
0.11611169576644897,
-0.0017626271583139896,
0.06191926822066307,
-0.04753774031996727,
-0.07226573675870895,
-0.09732077270746231,
-0.010933701880276203,
0.10964737087488174,
-0.03223278373479843,
0.11675872653722763,
0.05453915521502495,
-0.22489312291145325,
-0.07166748493909836,
-0.017499007284641266,
0.28386232256889343,
0.10852590948343277,
-0.03674788773059845,
0.17725762724876404,
0.11806492507457733,
-0.005635837558656931,
-0.21437020599842072,
-0.0862417072057724,
-0.056013092398643494,
-0.04754536226391792,
-0.08656063675880432,
-0.15293635427951813,
0.05119415372610092,
-0.03897181153297424,
-0.02094240114092827,
0.07458508759737015,
-0.3477265536785126,
-0.09044193476438522,
0.1824522167444229,
-0.06594786792993546,
0.38817891478538513,
-0.03777773678302765,
-0.0718904659152031,
-0.026183942332863808,
-0.15700702369213104,
0.1885446012020111,
-0.0038575793150812387,
0.05461110919713974,
0.008353380486369133,
0.2613135874271393,
0.04980030655860901,
0.015743695199489594,
0.051713887602090836,
0.08028706163167953,
-0.04876833036541939,
-0.1043628379702568,
-0.04317786917090416,
-0.0030902179423719645,
0.026795990765094757,
0.10658130794763565,
-0.04968138039112091,
0.0011742463102564216,
-0.14351791143417358,
-0.022059278562664986,
-0.1123029813170433,
0.034258145838975906,
0.048715740442276,
-0.01737889274954796,
-0.02300572209060192,
-0.04134640842676163,
-0.0317239984869957,
0.04755343124270439,
0.12839727103710175,
-0.07772476226091385,
0.17595504224300385,
0.0972408801317215,
0.1043165922164917,
-0.20370347797870636,
0.0401637889444828,
-0.014201825484633446,
-0.04263058304786682,
0.06336291879415512,
-0.07573235780000687,
0.0012032645754516125,
0.08475054055452347,
-0.06958045810461044,
0.11022278666496277,
0.05037990212440491,
-0.05388018116354942,
0.06973491609096527,
0.10324688255786896,
-0.22402584552764893,
-0.13858133554458618,
-0.0032234573736786842,
0.10255797952413559,
0.09593109786510468,
0.1335451602935791,
0.22795435786247253,
0.00013332384696695954,
-0.05698217451572418,
-0.014078579843044281,
0.06873737275600433,
-0.07360608130693436,
0.0630730614066124,
-0.08198481798171997,
0.012562907300889492,
-0.18125148117542267,
0.030582746490836143,
0.02556220255792141,
-0.07756127417087555,
0.08051129430532455,
0.1610197126865387,
-0.1543751358985901,
-0.12179239839315414,
-0.1505657136440277,
0.09805983304977417,
-0.04721866175532341,
0.0007801863248459995,
-0.005245056003332138,
-0.10755717009305954,
0.03880735859274864,
0.07905635237693787,
0.02388819307088852,
0.09664785116910934,
-0.050732966512441635,
-0.026460805907845497,
-0.028535734862089157,
-0.026952167972922325,
0.06222646310925484,
-0.060951024293899536,
-0.029420386999845505,
0.05857784301042557,
-0.01321637723594904,
0.11563537269830704,
-0.08114197105169296,
-0.1311916708946228,
-0.17651109397411346,
0.04375500977039337,
-0.12273527681827545,
-0.11150697618722916,
-0.12684613466262817,
-0.034054260700941086,
-0.026793459430336952,
-0.05424526333808899,
-0.020681774243712425,
-0.035995762795209885,
-0.09855540841817856,
0.05007615685462952,
-0.021738771349191666,
0.019113000482320786,
-0.11898110806941986,
0.047154951840639114,
0.023072753101587296,
0.00952209997922182,
0.22900183498859406,
0.23967400193214417,
-0.10501338541507721,
0.07970281690359116,
-0.13300393521785736,
-0.06916272640228271,
0.12155511975288391,
0.015302781015634537,
0.11593975126743317,
0.06520570814609528,
-0.012209351174533367,
0.07093607634305954,
0.06603305041790009,
0.07040399312973022,
0.09451892226934433,
-0.09665215760469437,
0.03132275491952896,
0.005647694226354361,
-0.054901495575904846,
-0.047769151628017426,
-0.008121064864099026,
0.06017419323325157,
0.06088278442621231,
0.08712935447692871,
-0.06329122185707092,
0.047178879380226135,
-0.06907396763563156,
0.0009565642685629427,
0.020402390509843826,
-0.10111594200134277,
0.05090409144759178,
-0.05048583075404167,
0.04585586115717888,
-0.0038993079215288162,
0.11841075867414474,
0.05489125847816467,
-0.028396034613251686,
0.03150009363889694,
0.07694239914417267,
-0.04186859354376793,
-0.011592810042202473,
-0.006134168244898319,
0.054798588156700134,
-0.043219733983278275,
-0.05355124920606613,
0.04504793509840965,
0.03471345081925392,
-0.05936227738857269,
0.11973360180854797,
-0.09357963502407074,
0.002641031751409173,
0.041799262166023254,
0.028395522385835648,
0.00043721101246774197,
-0.1502877026796341,
-0.14501814544200897,
-0.2070295363664627,
0.04020696133375168,
-0.0972619280219078,
0.06980330497026443,
0.0823310986161232,
0.020472751930356026,
-0.0072508519515395164,
-0.015675708651542664,
-0.054576434195041656,
-0.1250603049993515,
-0.17620421946048737,
-0.05234035849571228,
-0.19658192992210388,
0.009567178785800934,
-0.10002416372299194,
0.051153987646102905,
-0.018896259367465973,
0.08556535094976425,
-0.05606051906943321,
0.11778818815946579,
-0.009087399579584599,
-0.07489084452390671,
0.03840646520256996,
-0.04640626534819603,
0.03561873733997345,
-0.02484917640686035,
0.022134236991405487,
-0.045144930481910706,
0.08696575462818146,
0.020124923437833786,
0.06914480775594711,
-0.037223152816295624,
0.05012250691652298,
-0.1036195158958435,
-0.07003312557935715,
-0.0553266704082489,
0.06647614389657974,
-0.02439258247613907,
0.09582629799842834,
0.08311638981103897,
0.0007597408257424831,
0.01473532896488905,
0.25121182203292847,
-0.05404425039887428,
-0.10183586925268173,
-0.13247138261795044,
0.2002287209033966,
-0.03368125855922699,
0.0583178848028183,
-0.055855244398117065,
-0.00426106620579958,
-0.15232570469379425,
0.2420482039451599,
0.29523324966430664,
-0.08089511096477509,
0.013313311152160168,
-0.08143246918916702,
0.03359610587358475,
0.013623652048408985,
0.10647078603506088,
0.14291693270206451,
0.3189051151275635,
-0.012403322383761406,
0.07417552173137665,
0.00009350610343972221,
-0.060927875339984894,
-0.05881281569600105,
-0.05669590085744858,
-0.02383190393447876,
-0.006449878215789795,
-0.0066023580729961395,
0.1395004689693451,
-0.26645973324775696,
0.016834815964102745,
-0.18420034646987915,
-0.20412065088748932,
-0.08061134070158005,
0.018009180203080177,
0.12304088473320007,
0.044550321996212006,
0.1389375925064087,
0.00005718287866329774,
-0.06459779292345047,
0.1276867389678955,
-0.006128685083240271,
-0.15249358117580414,
-0.10721327364444733,
0.11934854090213776,
-0.12627308070659637,
-0.04108234494924545,
-0.041037049144506454,
0.09749634563922882,
0.08271387219429016,
0.03395570069551468,
-0.07119182497262955,
0.01351496484130621,
-0.03403165191411972,
-0.021414151415228844,
-0.005451269913464785,
0.06813238561153412,
0.02072959579527378,
0.025263439863920212,
0.05354813113808632,
-0.18906964361667633,
0.01142108254134655,
-0.08744467794895172,
0.07205761224031448,
-0.032799381762742996,
0.07564347237348557,
-0.03142198547720909,
0.06004147604107857,
0.0745621845126152,
-0.055871736258268356,
0.03904830664396286,
0.04052802547812462,
-0.020588697865605354,
-0.04159161075949669,
-0.061948951333761215,
-0.1403578817844391,
-0.22841989994049072,
-0.11440495401620865,
0.09922145307064056,
0.029091045260429382,
-0.0986558049917221,
0.048632487654685974,
-0.15799014270305634,
0.05539434030652046,
-0.07917378842830658,
0.11045927554368973,
0.08010590076446533,
0.027341295033693314,
0.0007489612326025963,
0.05805785581469536,
0.06042911112308502,
0.09924639761447906,
-0.11275497078895569,
-0.08823683857917786
] |
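S34NtheGuy/DialoGPT-small-Harry282, documented in the row above, shares the same one-line card text. For multi-turn chat, DialoGPT-style models are normally fed the accumulated conversation history; the loop below sketches that pattern under the assumption that this checkpoint follows standard DialoGPT conventions (EOS-separated turns), which the card itself does not state. The three-turn limit is arbitrary.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "S34NtheGuy/DialoGPT-small-Harry282"  # ID from the dataset row
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    chat_history_ids = None
    for _ in range(3):  # number of user turns is purely illustrative
        user_ids = tokenizer.encode(input("You: ") + tokenizer.eos_token,
                                    return_tensors="pt")
        # Append the new user turn to the running history.
        bot_input_ids = (torch.cat([chat_history_ids, user_ids], dim=-1)
                         if chat_history_ids is not None else user_ids)
        chat_history_ids = model.generate(bot_input_ids,
                                          max_length=500,
                                          pad_token_id=tokenizer.eos_token_id)
        # Print only the tokens generated after the prompt.
        reply = tokenizer.decode(
            chat_history_ids[:, bot_input_ids.shape[-1]:][0],
            skip_special_tokens=True)
        print("Bot:", reply)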
null | null | transformers |
# DialoGPT chat bot model using discord messages as data | {"tags": ["conversational"]} | text-generation | S34NtheGuy/DialoGPT-small-MJOLNIR_Soul | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# DialoGPT chat bot model using discord messages as data | [
"# DialoGPT chat bot model using discord messages as data"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# DialoGPT chat bot model using discord messages as data"
] | [
51,
13
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# DialoGPT chat bot model using discord messages as data"
] | [
-0.02911895327270031,
0.003995177801698446,
-0.0046684942208230495,
-0.012396533973515034,
0.11202579736709595,
-0.020793797448277473,
0.19453662633895874,
0.0904533639550209,
0.11296340823173523,
-0.04555835947394371,
0.05478145554661751,
0.14494860172271729,
0.02601793222129345,
0.2345304638147354,
-0.07001583278179169,
-0.2198491394519806,
0.06904508173465729,
0.0020564962178468704,
0.07120723277330399,
0.12327651679515839,
0.0893358588218689,
-0.027994820848107338,
0.09009858965873718,
0.016869334504008293,
-0.19000788033008575,
-0.005467485636472702,
0.01664290763437748,
-0.08414720743894577,
0.0951712355017662,
0.05692970007658005,
0.027007173746824265,
0.022588100284337997,
-0.054826367646455765,
-0.07715894281864166,
0.04409089311957359,
-0.015045171603560448,
-0.03291803225874901,
0.042206596583127975,
-0.06107683107256889,
-0.10459832847118378,
0.18928758800029755,
0.11292500793933868,
0.02833588793873787,
0.07569966465234756,
-0.1312641203403473,
0.011802727356553078,
-0.04876156896352768,
0.02216631919145584,
0.14013226330280304,
0.12497154623270035,
-0.07691969722509384,
0.16699260473251343,
-0.07476947456598282,
0.10180847346782684,
0.05837265029549599,
-0.3295818567276001,
-0.010758930817246437,
0.12642543017864227,
0.030477117747068405,
0.09977107495069504,
-0.018700912594795227,
0.04719923064112663,
0.011075117625296116,
0.027997151017189026,
-0.0939258560538292,
-0.03323792293667793,
-0.14734824001789093,
-0.03544525429606438,
-0.09491617977619171,
-0.024447239935398102,
0.269292950630188,
0.005437672603875399,
0.05558373034000397,
-0.11964231729507446,
-0.07344776391983032,
-0.07583962380886078,
-0.07977471500635147,
-0.021205732598900795,
-0.11365412920713425,
0.06104221194982529,
-0.011485567316412926,
-0.08311361074447632,
-0.11278795450925827,
-0.043469205498695374,
-0.18289406597614288,
0.0821399837732315,
0.0335225835442543,
0.07892793416976929,
-0.2801816165447235,
0.08420786261558533,
0.04518672078847885,
-0.0835626944899559,
0.019823530688881874,
-0.08192607015371323,
-0.04286707937717438,
-0.006278423126786947,
-0.03693044185638428,
-0.0884459912776947,
0.10716810822486877,
0.1580084264278412,
-0.05083105340600014,
0.04353693500161171,
-0.07717008143663406,
0.0446944534778595,
0.10556173324584961,
0.03715191408991814,
-0.00912248995155096,
0.006465920712798834,
0.05436272174119949,
-0.11213027685880661,
0.01778799295425415,
-0.03295382484793663,
-0.1692671924829483,
0.01611638441681862,
0.02191336080431938,
0.08422008901834488,
0.042106952518224716,
0.13374918699264526,
-0.045255158096551895,
-0.03810238465666771,
0.12856028974056244,
-0.003518170677125454,
-0.0027266035322099924,
0.04263519123196602,
-0.07405722886323929,
0.06224050372838974,
0.03224530816078186,
0.06507550925016403,
-0.08838954567909241,
-0.07766801118850708,
-0.03798297420144081,
0.007619286421686411,
-0.030212605372071266,
-0.029250554740428925,
0.02620057761669159,
0.053402043879032135,
-0.0241620484739542,
-0.1678251177072525,
-0.13773390650749207,
0.00567244365811348,
-0.03463026136159897,
-0.06034352630376816,
-0.09172911196947098,
-0.11689257621765137,
0.022298447787761688,
0.01672283187508583,
-0.07708073407411575,
-0.054256051778793335,
-0.059729307889938354,
0.04205011948943138,
-0.07092487066984177,
0.14112763106822968,
-0.11467497050762177,
0.04234171658754349,
-0.10254870355129242,
-0.04991884157061577,
-0.1932521015405655,
0.15436524152755737,
-0.053409744054079056,
0.12730517983436584,
-0.048516929149627686,
0.03334582969546318,
-0.08455422520637512,
0.00886745285242796,
-0.024523701518774033,
0.2663744390010834,
-0.1233612671494484,
-0.08157475292682648,
0.3245835304260254,
-0.0866551399230957,
-0.13101184368133545,
0.17166416347026825,
-0.00581461563706398,
0.06078298017382622,
0.17557644844055176,
0.18741080164909363,
-0.07020799070596695,
-0.009106408804655075,
-0.006908034905791283,
0.08536610007286072,
-0.1426268219947815,
0.015105406753718853,
-0.0069095678627491,
-0.009230674244463444,
-0.013791845180094242,
0.007894156500697136,
0.23989658057689667,
0.135564386844635,
-0.0794452428817749,
-0.030347473919391632,
0.012585587799549103,
-0.03232159465551376,
0.0846608355641365,
-0.027062173932790756,
0.09626356512308121,
-0.024425217881798744,
-0.0443769246339798,
-0.041578181087970734,
0.060682933777570724,
0.0039953915402293205,
0.011674324050545692,
-0.18290074169635773,
0.03165009990334511,
0.05584390088915825,
0.08656518161296844,
-0.12307853996753693,
-0.176169753074646,
-0.024073531851172447,
0.16490675508975983,
0.07964789122343063,
0.0825846791267395,
0.07539273798465729,
-0.08541291952133179,
0.017652546986937523,
0.04020996764302254,
0.17037948966026306,
-0.015576590783894062,
-0.10870205610990524,
-0.10396617650985718,
0.06813302636146545,
-0.0645066425204277,
0.226262167096138,
-0.04232753813266754,
0.019181568175554276,
0.07608156651258469,
0.15684202313423157,
0.011756951920688152,
0.015583735890686512,
0.06726869940757751,
-0.030799711123108864,
-0.01088039018213749,
-0.023927779868245125,
0.03538373112678528,
-0.01441163569688797,
-0.11656852066516876,
0.2350115180015564,
-0.10605599731206894,
0.06379605829715729,
0.18888333439826965,
-0.08889171481132507,
-0.012959163635969162,
-0.043133217841386795,
-0.046505268663167953,
0.004694156814366579,
0.062076911330223083,
-0.03991788998246193,
0.22515693306922913,
0.01117341686040163,
0.13349102437496185,
-0.007687731646001339,
-0.018896888941526413,
-0.014094335027039051,
-0.07590913772583008,
0.00629182206466794,
0.0627899020910263,
0.028147637844085693,
-0.15793262422084808,
0.10818427056074142,
0.013195234350860119,
0.0805935263633728,
0.2113124579191208,
0.0355629026889801,
0.0680663213133812,
0.020914744585752487,
0.0031901278998702765,
-0.06738251447677612,
-0.06983086466789246,
-0.321178674697876,
-0.030278725549578667,
0.049556665122509,
0.05528012290596962,
0.1350831389427185,
-0.042795684188604355,
-0.019094472751021385,
-0.03944716975092888,
-0.022730743512511253,
0.05761689692735672,
0.15371432900428772,
0.04195815697312355,
0.15377937257289886,
-0.006703630555421114,
-0.07485831528902054,
0.04169466346502304,
0.003349226899445057,
-0.11153942346572876,
0.1410476267337799,
-0.17577725648880005,
-0.3476906418800354,
-0.03124120645225048,
-0.130113884806633,
-0.06320375204086304,
0.039843443781137466,
0.08997023105621338,
-0.19903790950775146,
-0.0006238986970856786,
0.017847519367933273,
0.09509206563234329,
0.02413397654891014,
0.016077861189842224,
0.07153959572315216,
-0.06139683723449707,
-0.10136960446834564,
-0.11373108625411987,
-0.05264159291982651,
-0.06683648377656937,
-0.11798607558012009,
0.12459578365087509,
-0.16452273726463318,
0.017987580969929695,
0.21677619218826294,
0.04113182798027992,
0.06206201761960983,
-0.023023243993520737,
0.25566840171813965,
-0.10939715057611465,
0.0286073237657547,
0.17293484508991241,
0.01686146669089794,
0.02669895999133587,
0.10997503250837326,
-0.019446026533842087,
-0.12817075848579407,
0.06026293337345123,
-0.01656145602464676,
-0.07096251100301743,
-0.20454192161560059,
-0.21024994552135468,
-0.08096782863140106,
0.09014924615621567,
-0.018005413934588432,
0.06904570758342743,
0.1414383202791214,
0.02970438078045845,
-0.026131540536880493,
-0.0490809828042984,
0.09899041801691055,
0.051192380487918854,
0.17059782147407532,
-0.09738970547914505,
0.12161380052566528,
-0.0215446837246418,
-0.09003637731075287,
0.07071530818939209,
0.008625643327832222,
0.07919701933860779,
0.05664544552564621,
0.056681711226701736,
0.03477178141474724,
0.05184609070420265,
0.13479608297348022,
0.004874000791460276,
0.01784246601164341,
-0.08477703481912613,
0.005763623397797346,
-0.022313527762889862,
-0.09581872075796127,
0.007024643011391163,
0.05425998196005821,
-0.16158020496368408,
-0.028891153633594513,
-0.0017159533454105258,
0.10731203109025955,
0.04837334528565407,
0.09739865362644196,
-0.16217289865016937,
-0.09366218745708466,
0.044000640511512756,
-0.03462972491979599,
-0.07541224360466003,
0.07639168947935104,
0.06741161644458771,
-0.1383616030216217,
0.054750069975852966,
-0.011487936601042747,
0.09841711074113846,
-0.11301963031291962,
0.04922184348106384,
-0.1383095234632492,
-0.040402695536613464,
-0.005755018442869186,
0.07621902227401733,
-0.2630176544189453,
0.13395322859287262,
-0.03546271100640297,
-0.03840550407767296,
-0.1174018606543541,
-0.017455484718084335,
-0.0038811310660094023,
0.10694295167922974,
0.04950512945652008,
0.01964215375483036,
0.04950443282723427,
0.014236033894121647,
-0.10705028474330902,
0.03218303993344307,
0.010299173183739185,
-0.023959863930940628,
-0.05948719382286072,
0.02074790745973587,
-0.024005096405744553,
-0.018881697207689285,
-0.12932980060577393,
-0.03812210261821747,
-0.15508605539798737,
0.041080720722675323,
0.18053974211215973,
0.11493601649999619,
0.037637773901224136,
-0.018357248976826668,
-0.006955144926905632,
0.26158514618873596,
0.09794305264949799,
-0.13049659132957458,
-0.06943461298942566,
0.025171175599098206,
0.03627297282218933,
-0.06180146709084511,
-0.002944500418379903,
-0.03472563624382019,
0.05261370167136192,
-0.06326436251401901,
-0.17609041929244995,
0.10805560648441315,
-0.11141735315322876,
-0.029353909194469452,
-0.007894911803305149,
0.16547085344791412,
0.11611169576644897,
-0.0017626271583139896,
0.06191926822066307,
-0.04753774031996727,
-0.07226573675870895,
-0.09732077270746231,
-0.010933701880276203,
0.10964737087488174,
-0.03223278373479843,
0.11675872653722763,
0.05453915521502495,
-0.22489312291145325,
-0.07166748493909836,
-0.017499007284641266,
0.28386232256889343,
0.10852590948343277,
-0.03674788773059845,
0.17725762724876404,
0.11806492507457733,
-0.005635837558656931,
-0.21437020599842072,
-0.0862417072057724,
-0.056013092398643494,
-0.04754536226391792,
-0.08656063675880432,
-0.15293635427951813,
0.05119415372610092,
-0.03897181153297424,
-0.02094240114092827,
0.07458508759737015,
-0.3477265536785126,
-0.09044193476438522,
0.1824522167444229,
-0.06594786792993546,
0.38817891478538513,
-0.03777773678302765,
-0.0718904659152031,
-0.026183942332863808,
-0.15700702369213104,
0.1885446012020111,
-0.0038575793150812387,
0.05461110919713974,
0.008353380486369133,
0.2613135874271393,
0.04980030655860901,
0.015743695199489594,
0.051713887602090836,
0.08028706163167953,
-0.04876833036541939,
-0.1043628379702568,
-0.04317786917090416,
-0.0030902179423719645,
0.026795990765094757,
0.10658130794763565,
-0.04968138039112091,
0.0011742463102564216,
-0.14351791143417358,
-0.022059278562664986,
-0.1123029813170433,
0.034258145838975906,
0.048715740442276,
-0.01737889274954796,
-0.02300572209060192,
-0.04134640842676163,
-0.0317239984869957,
0.04755343124270439,
0.12839727103710175,
-0.07772476226091385,
0.17595504224300385,
0.0972408801317215,
0.1043165922164917,
-0.20370347797870636,
0.0401637889444828,
-0.014201825484633446,
-0.04263058304786682,
0.06336291879415512,
-0.07573235780000687,
0.0012032645754516125,
0.08475054055452347,
-0.06958045810461044,
0.11022278666496277,
0.05037990212440491,
-0.05388018116354942,
0.06973491609096527,
0.10324688255786896,
-0.22402584552764893,
-0.13858133554458618,
-0.0032234573736786842,
0.10255797952413559,
0.09593109786510468,
0.1335451602935791,
0.22795435786247253,
0.00013332384696695954,
-0.05698217451572418,
-0.014078579843044281,
0.06873737275600433,
-0.07360608130693436,
0.0630730614066124,
-0.08198481798171997,
0.012562907300889492,
-0.18125148117542267,
0.030582746490836143,
0.02556220255792141,
-0.07756127417087555,
0.08051129430532455,
0.1610197126865387,
-0.1543751358985901,
-0.12179239839315414,
-0.1505657136440277,
0.09805983304977417,
-0.04721866175532341,
0.0007801863248459995,
-0.005245056003332138,
-0.10755717009305954,
0.03880735859274864,
0.07905635237693787,
0.02388819307088852,
0.09664785116910934,
-0.050732966512441635,
-0.026460805907845497,
-0.028535734862089157,
-0.026952167972922325,
0.06222646310925484,
-0.060951024293899536,
-0.029420386999845505,
0.05857784301042557,
-0.01321637723594904,
0.11563537269830704,
-0.08114197105169296,
-0.1311916708946228,
-0.17651109397411346,
0.04375500977039337,
-0.12273527681827545,
-0.11150697618722916,
-0.12684613466262817,
-0.034054260700941086,
-0.026793459430336952,
-0.05424526333808899,
-0.020681774243712425,
-0.035995762795209885,
-0.09855540841817856,
0.05007615685462952,
-0.021738771349191666,
0.019113000482320786,
-0.11898110806941986,
0.047154951840639114,
0.023072753101587296,
0.00952209997922182,
0.22900183498859406,
0.23967400193214417,
-0.10501338541507721,
0.07970281690359116,
-0.13300393521785736,
-0.06916272640228271,
0.12155511975288391,
0.015302781015634537,
0.11593975126743317,
0.06520570814609528,
-0.012209351174533367,
0.07093607634305954,
0.06603305041790009,
0.07040399312973022,
0.09451892226934433,
-0.09665215760469437,
0.03132275491952896,
0.005647694226354361,
-0.054901495575904846,
-0.047769151628017426,
-0.008121064864099026,
0.06017419323325157,
0.06088278442621231,
0.08712935447692871,
-0.06329122185707092,
0.047178879380226135,
-0.06907396763563156,
0.0009565642685629427,
0.020402390509843826,
-0.10111594200134277,
0.05090409144759178,
-0.05048583075404167,
0.04585586115717888,
-0.0038993079215288162,
0.11841075867414474,
0.05489125847816467,
-0.028396034613251686,
0.03150009363889694,
0.07694239914417267,
-0.04186859354376793,
-0.011592810042202473,
-0.006134168244898319,
0.054798588156700134,
-0.043219733983278275,
-0.05355124920606613,
0.04504793509840965,
0.03471345081925392,
-0.05936227738857269,
0.11973360180854797,
-0.09357963502407074,
0.002641031751409173,
0.041799262166023254,
0.028395522385835648,
0.00043721101246774197,
-0.1502877026796341,
-0.14501814544200897,
-0.2070295363664627,
0.04020696133375168,
-0.0972619280219078,
0.06980330497026443,
0.0823310986161232,
0.020472751930356026,
-0.0072508519515395164,
-0.015675708651542664,
-0.054576434195041656,
-0.1250603049993515,
-0.17620421946048737,
-0.05234035849571228,
-0.19658192992210388,
0.009567178785800934,
-0.10002416372299194,
0.051153987646102905,
-0.018896259367465973,
0.08556535094976425,
-0.05606051906943321,
0.11778818815946579,
-0.009087399579584599,
-0.07489084452390671,
0.03840646520256996,
-0.04640626534819603,
0.03561873733997345,
-0.02484917640686035,
0.022134236991405487,
-0.045144930481910706,
0.08696575462818146,
0.020124923437833786,
0.06914480775594711,
-0.037223152816295624,
0.05012250691652298,
-0.1036195158958435,
-0.07003312557935715,
-0.0553266704082489,
0.06647614389657974,
-0.02439258247613907,
0.09582629799842834,
0.08311638981103897,
0.0007597408257424831,
0.01473532896488905,
0.25121182203292847,
-0.05404425039887428,
-0.10183586925268173,
-0.13247138261795044,
0.2002287209033966,
-0.03368125855922699,
0.0583178848028183,
-0.055855244398117065,
-0.00426106620579958,
-0.15232570469379425,
0.2420482039451599,
0.29523324966430664,
-0.08089511096477509,
0.013313311152160168,
-0.08143246918916702,
0.03359610587358475,
0.013623652048408985,
0.10647078603506088,
0.14291693270206451,
0.3189051151275635,
-0.012403322383761406,
0.07417552173137665,
0.00009350610343972221,
-0.060927875339984894,
-0.05881281569600105,
-0.05669590085744858,
-0.02383190393447876,
-0.006449878215789795,
-0.0066023580729961395,
0.1395004689693451,
-0.26645973324775696,
0.016834815964102745,
-0.18420034646987915,
-0.20412065088748932,
-0.08061134070158005,
0.018009180203080177,
0.12304088473320007,
0.044550321996212006,
0.1389375925064087,
0.00005718287866329774,
-0.06459779292345047,
0.1276867389678955,
-0.006128685083240271,
-0.15249358117580414,
-0.10721327364444733,
0.11934854090213776,
-0.12627308070659637,
-0.04108234494924545,
-0.041037049144506454,
0.09749634563922882,
0.08271387219429016,
0.03395570069551468,
-0.07119182497262955,
0.01351496484130621,
-0.03403165191411972,
-0.021414151415228844,
-0.005451269913464785,
0.06813238561153412,
0.02072959579527378,
0.025263439863920212,
0.05354813113808632,
-0.18906964361667633,
0.01142108254134655,
-0.08744467794895172,
0.07205761224031448,
-0.032799381762742996,
0.07564347237348557,
-0.03142198547720909,
0.06004147604107857,
0.0745621845126152,
-0.055871736258268356,
0.03904830664396286,
0.04052802547812462,
-0.020588697865605354,
-0.04159161075949669,
-0.061948951333761215,
-0.1403578817844391,
-0.22841989994049072,
-0.11440495401620865,
0.09922145307064056,
0.029091045260429382,
-0.0986558049917221,
0.048632487654685974,
-0.15799014270305634,
0.05539434030652046,
-0.07917378842830658,
0.11045927554368973,
0.08010590076446533,
0.027341295033693314,
0.0007489612326025963,
0.05805785581469536,
0.06042911112308502,
0.09924639761447906,
-0.11275497078895569,
-0.08823683857917786
] |
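The same loading pattern applies to S34NtheGuy/DialoGPT-small-MJOLNIR_Soul from the row above. As a lighter-weight alternative to manual tokenization, the high-level pipeline API can drive the model directly; "text-generation" matches the pipeline_tag recorded in the row, while the prompt string and length limit below are made-up examples.

    from transformers import pipeline

    # "text-generation" is the pipeline_tag stored for this row.
    chat = pipeline("text-generation",
                    model="S34NtheGuy/DialoGPT-small-MJOLNIR_Soul")

    prompt = "What are you up to today?"  # illustrative prompt
    result = chat(prompt + chat.tokenizer.eos_token,
                  max_length=100,
                  pad_token_id=chat.tokenizer.eos_token_id)

    # Note: the pipeline returns the prompt plus the reply in one string.
    print(result[0]["generated_text"])

Unlike the sketches above, the pipeline output includes the prompt itself; trimming it off is left to the caller.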
null | null | transformers |
# DialoGPT chat bot model using discord messages as data | {"tags": ["conversational"]} | text-generation | S34NtheGuy/DialoGPT-small-cursedryno | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# DialoGPT chat bot model using discord messages as data | [
"# DialoGPT chat bot model using discord messages as data"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# DialoGPT chat bot model using discord messages as data"
] | [
51,
13
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# DialoGPT chat bot model using discord messages as data"
] | [
… (768-dimensional embedding vector omitted) ] |
null | null | transformers |
# DialoGPT chat bot model using discord messages as data | {"tags": ["conversational"]} | text-generation | S34NtheGuy/DialoGPT-small-pikamew362 | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# DialoGPT chat bot model using discord messages as data | [
"# DialoGPT chat bot model using discord messages as data"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# DialoGPT chat bot model using discord messages as data"
] | [
51,
13
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# DialoGPT chat bot model using discord messages as data"
] | [
… (duplicate 768-dimensional embedding vector removed; values identical to the previous row, since the embedded passage text is the same) ] |
null | null | transformers |
# DialoGPT chat bot model using discord messages as data | {"tags": ["conversational"]} | text-generation | S34NtheGuy/DialoGPT-small-wetterlettuce | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# DialoGPT chat bot model using discord messages as data | [
"# DialoGPT chat bot model using discord messages as data"
] | [
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# DialoGPT chat bot model using discord messages as data"
] | [
51,
13
] | [
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# DialoGPT chat bot model using discord messages as data"
] | [
… (duplicate 768-dimensional embedding vector removed; values identical to the previous row, since the embedded passage text is the same) ] |
null | null | transformers | # Model Card for Password-Model
# Model Details
## Model Description
The Password Model is intended to be used with [Credential Digger](https://github.com/SAP/credential-digger) in order to automatically filter false positive password discoveries.
- **Developed by:** SAP OSS
- **Shared by [Optional]:** Hugging Face
- **Model type:** Text Classification
- **Language(s) (NLP):** en
- **License:** Apache-2.0
- **Related Models:**
- **Parent Model:** RoBERTa
- **Resources for more information:**
- [GitHub Repo](https://github.com/SAP/credential-digger)
- [Associated Paper](https://www.scitepress.org/Papers/2021/102381/102381.pdf)
# Uses
## Direct Use
The model is directly integrated into [Credential Digger](https://github.com/SAP/credential-digger) and can be used to filter the false positive password discoveries of a scan.
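A minimal sketch of this post-filtering role (assumptions: the candidate strings, and the `FALSE_POSITIVE` label name, which should be checked against the checkpoint's `id2label` mapping; this is not Credential Digger's actual API):
```python
from transformers import pipeline

# Hypothetical candidate discoveries produced by a scan; in practice
# Credential Digger supplies these strings.
discoveries = ['pwd = "hunter2"', 'password = load_password_from_vault()']

# The text-classification pipeline loads both tokenizer and model from the Hub.
classifier = pipeline("text-classification", model="SAPOSS/password-model")

# Keep only the candidates the model does not label as false positives.
kept = [d for d in discoveries if classifier(d)[0]["label"] != "FALSE_POSITIVE"]
print(kept)
```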
## Out-of-Scope Use
The model should not be used to intentionally create hostile or alienating environments for people.
# Training Details
## Training Data
The model is [CodeBERT-base-mlm](https://huggingface.co/microsoft/codebert-base-mlm) fine-tuned on a dataset for leak detection.
## Training Procedure
### Preprocessing
More information needed
### Speeds, Sizes, Times
More information needed
# Evaluation
More information needed
## Testing Data, Factors & Metrics
### Testing Data
More information needed
### Factors
More information needed
### Metrics
More information needed
## Results
More information needed
# Model Examination
More information needed
# Environmental Impact
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Technical Specifications [optional]
## Model Architecture and Objective
More information needed
## Compute Infrastructure
More information needed
### Hardware
More information needed
### Software
More information needed
# Citation
**BibTeX:**
```
TBD
```
# Model Card Authors [optional]
SAP OSS in collaboration with Ezi Ozoani and the Hugging Face team.
# Model Card Contact
More information needed
# How to Get Started with the Model
The model is directly integrated into Credential Digger and can be used to filter the false positive discoveries of a scan.
<details>
<summary> Click to expand </summary>
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load the tokenizer and the fine-tuned sequence-classification model from the Hub
tokenizer = AutoTokenizer.from_pretrained("SAPOSS/password-model")
model = AutoModelForSequenceClassification.from_pretrained("SAPOSS/password-model")
```
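Continuing the snippet above, a single candidate can then be scored directly; this is a sketch that assumes PyTorch weights are available for the checkpoint, and the id-to-label mapping is checkpoint-specific (inspect `model.config.id2label`):
```python
import torch

# Hypothetical candidate line flagged by a scan.
candidate = 'password = "s3cr3t_t0ken"'

inputs = tokenizer(candidate, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = int(logits.argmax(dim=-1))
print(predicted_id, model.config.id2label[predicted_id])
```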
</details>
| {"language": ["en"]} | text-classification | SAPOSS/password-model | [
"transformers",
"tf",
"roberta",
"text-classification",
"en",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [
"1910.09700"
] | [
"en"
] | TAGS
#transformers #tf #roberta #text-classification #en #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
| # Model Card for Password-Model
# Model Details
## Model Description
The Password Model is intended to be used with Credential Digger in order to automatically filter false positive password discoveries.
- Developed by: SAP OSS
- Shared by [Optional]: Hugging Face
- Model type: Text Classification
- Language(s) (NLP): en
- License: Apache-2.0
- Related Models:
- Parent Model: RoBERTa
- Resources for more information:
- GitHub Repo
- Associated Paper
# Uses
## Direct Use
The model is directly integrated into Credential Digger and can be used to filter the false positive password discoveries of a scan.
## Out-of-Scope Use
The model should not be used to intentionally create hostile or alienating environments for people.
# Training Details
## Training Data
CodeBERT-base-mlm fine-tuned on a dataset for leak detection.
## Training Procedure
### Preprocessing
More information needed
### Speeds, Sizes, Times
More information needed
# Evaluation
More information needed
## Testing Data, Factors & Metrics
### Testing Data
More information needed
### Factors
More information needed
### Metrics
More information needed
## Results
More information needed
# Model Examination
More information needed
# Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type: More information needed
- Hours used: More information needed
- Cloud Provider: More information needed
- Compute Region: More information needed
- Carbon Emitted: More information needed
# Technical Specifications [optional]
## Model Architecture and Objective
More information needed
## Compute Infrastructure
More information needed
### Hardware
More information needed
### Software
More information needed
BibTeX:
# Model Card Authors [optional]
SAP OSS in collaboration with Ezi Ozoani and the Hugging Face team.
# Model Card Contact
More information needed
# How to Get Started with the Model
The model is directly integrated into Credential Digger and can be used to filter the false positive discoveries of a scan
<details>
<summary> Click to expand </summary>
</details>
| [
"# Model Card for Password-Model",
"# Model Details",
"## Model Description\n \n \nThe Password Model is intended to be used with Credential Digger in order to automatically filter false positive password discoveries.\n \n- Developed by: SAP OSS\n- Shared by [Optional]: Hugging Face\n- Model type: Text Classification\n- Language(s) (NLP): en\n- License: Apache-2.0\n- Related Models: \n - Parent Model: RoBERTa\n- Resources for more information:\n - GitHub Repo \n - Associated Paper",
"# Uses",
"## Direct Use\nThe model is directly integrated into Credential Digger and can be used to filter the false positive password discoveries of a scan.",
"## Out-of-Scope Use\n \nThe model should not be used to intentionally create hostile or alienating environments for people.",
"# Training Details",
"## Training Data\n \nCodeBERT-base-mlm fine-tuned on a dataset for leak detection.",
"## Training Procedure",
"### Preprocessing\n \nMore information needed",
"### Speeds, Sizes, Times\n \nMore information needed",
"# Evaluation\n \nMore information needed",
"## Testing Data, Factors & Metrics",
"### Testing Data\n \nMore information needed",
"### Factors\n \nMore information needed",
"### Metrics\n \nMore information needed",
"## Results \n \nMore information needed",
"# Model Examination\nMore information needed",
"# Environmental Impact\n \nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n \n- Hardware Type: More information needed\n- Hours used: More information needed\n- Cloud Provider: More information needed\n- Compute Region: More information needed\n- Carbon Emitted: More information needed",
"# Technical Specifications [optional]",
"## Model Architecture and Objective\n \nMore information needed",
"## Compute Infrastructure\nMore information needed",
"### Hardware\n \nMore information needed",
"### Software\n \nMore information needed\n \n \nBibTeX:",
"# Model Card Authors [optional]\n \nSAP OSS in collaboration with Ezi Ozoani and the Hugging Face team.",
"# Model Card Contact\n \nMore information needed",
"# How to Get Started with the Model\n \nThe model is directly integrated into Credential Digger and can be used to filter the false positive discoveries of a scan\n \n<details>\n<summary> Click to expand </summary>\n\n\n</details>"
] | [
"TAGS\n#transformers #tf #roberta #text-classification #en #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Card for Password-Model",
"# Model Details",
"## Model Description\n \n \nThe Password Model is intended to be used with Credential Digger in order to automatically filter false positive password discoveries.\n \n- Developed by: SAP OSS\n- Shared by [Optional]: Hugging Face\n- Model type: Text Classification\n- Language(s) (NLP): en\n- License: Apache-2.0\n- Related Models: \n - Parent Model: RoBERTa\n- Resources for more information:\n - GitHub Repo \n - Associated Paper",
"# Uses",
"## Direct Use\nThe model is directly integrated into Credential Digger and can be used to filter the false positive password discoveries of a scan.",
"## Out-of-Scope Use\n \nThe model should not be used to intentionally create hostile or alienating environments for people.",
"# Training Details",
"## Training Data\n \nCodeBERT-base-mlm fine-tuned on a dataset for leak detection.",
"## Training Procedure",
"### Preprocessing\n \nMore information needed",
"### Speeds, Sizes, Times\n \nMore information needed",
"# Evaluation\n \nMore information needed",
"## Testing Data, Factors & Metrics",
"### Testing Data\n \nMore information needed",
"### Factors\n \nMore information needed",
"### Metrics\n \nMore information needed",
"## Results \n \nMore information needed",
"# Model Examination\nMore information needed",
"# Environmental Impact\n \nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n \n- Hardware Type: More information needed\n- Hours used: More information needed\n- Cloud Provider: More information needed\n- Compute Region: More information needed\n- Carbon Emitted: More information needed",
"# Technical Specifications [optional]",
"## Model Architecture and Objective\n \nMore information needed",
"## Compute Infrastructure\nMore information needed",
"### Hardware\n \nMore information needed",
"### Software\n \nMore information needed\n \n \nBibTeX:",
"# Model Card Authors [optional]\n \nSAP OSS in collaboration with Ezi Ozoani and the Hugging Face team.",
"# Model Card Contact\n \nMore information needed",
"# How to Get Started with the Model\n \nThe model is directly integrated into Credential Digger and can be used to filter the false positive discoveries of a scan\n \n<details>\n<summary> Click to expand </summary>\n\n\n</details>"
] | [
47,
7,
3,
99,
3,
30,
28,
3,
25,
4,
8,
12,
6,
11,
8,
7,
8,
5,
8,
68,
9,
10,
8,
6,
11,
27,
7,
55
] | [
"passage: TAGS\n#transformers #tf #roberta #text-classification #en #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Password-Model# Model Details## Model Description\n \n \nThe Password Model is intended to be used with Credential Digger in order to automatically filter false positive password discoveries.\n \n- Developed by: SAP OSS\n- Shared by [Optional]: Hugging Face\n- Model type: Text Classification\n- Language(s) (NLP): en\n- License: Apache-2.0\n- Related Models: \n - Parent Model: RoBERTa\n- Resources for more information:\n - GitHub Repo \n - Associated Paper# Uses## Direct Use\nThe model is directly integrated into Credential Digger and can be used to filter the false positive password discoveries of a scan.## Out-of-Scope Use\n \nThe model should not be used to intentionally create hostile or alienating environments for people.# Training Details## Training Data\n \nCodeBERT-base-mlm fine-tuned on a dataset for leak detection.## Training Procedure### Preprocessing\n \nMore information needed### Speeds, Sizes, Times\n \nMore information needed# Evaluation\n \nMore information needed## Testing Data, Factors & Metrics### Testing Data\n \nMore information needed### Factors\n \nMore information needed### Metrics\n \nMore information needed## Results \n \nMore information needed# Model Examination\nMore information needed# Environmental Impact\n \nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n \n- Hardware Type: More information needed\n- Hours used: More information needed\n- Cloud Provider: More information needed\n- Compute Region: More information needed\n- Carbon Emitted: More information needed# Technical Specifications [optional]## Model Architecture and Objective\n \nMore information needed## Compute Infrastructure\nMore information needed### Hardware\n \nMore information needed### Software\n \nMore information needed\n \n \nBibTeX:# Model Card Authors [optional]\n \nSAP OSS in collaboration with Ezi Ozoani and the Hugging Face team.# Model Card Contact\n \nMore information needed"
] | [
-0.03890452906489372,
0.21332035958766937,
-0.0035550552420318127,
0.031876083463430405,
0.11966259777545929,
0.02037443034350872,
0.04933521896600723,
0.14299696683883667,
-0.022645318880677223,
0.05682619288563728,
0.01753057725727558,
-0.015051078982651234,
0.1405669003725052,
0.10824998468160629,
0.10231611132621765,
-0.2011265754699707,
0.010682216845452785,
-0.07237878441810608,
0.044210631400346756,
0.12016613036394119,
0.14207059144973755,
-0.05752833932638168,
0.11009626090526581,
-0.048631127923727036,
-0.074237160384655,
0.02958914265036583,
-0.08630198985338211,
-0.05859987065196037,
0.0815567597746849,
0.03031088039278984,
0.04766843467950821,
-0.013685053214430809,
0.07435929775238037,
-0.318320095539093,
0.02349359728395939,
0.09640755504369736,
-0.029217317700386047,
0.06835833191871643,
0.027958283200860023,
-0.03695714846253395,
0.09028656035661697,
-0.164771169424057,
0.13256531953811646,
0.05936766043305397,
-0.0728263407945633,
-0.13629327714443207,
-0.08543277531862259,
0.13856208324432373,
0.06983517855405807,
0.1039496511220932,
-0.0294731967151165,
0.14657679200172424,
-0.06375723332166672,
0.019100047647953033,
0.17515864968299866,
-0.11511462181806564,
-0.03590914234519005,
0.062483277171850204,
0.18026874959468842,
0.08646320551633835,
-0.10703620314598083,
0.025260236114263535,
0.02600288577377796,
0.021179288625717163,
0.079121895134449,
-0.003141574328765273,
-0.01718873530626297,
0.02185959555208683,
-0.08862221240997314,
-0.043227944523096085,
0.06814514845609665,
0.06059860810637474,
-0.08938471227884293,
-0.23480282723903656,
0.008849825710058212,
-0.0461425706744194,
0.021681716665625572,
-0.09130176901817322,
0.0390285924077034,
-0.030863596126437187,
0.01146079320460558,
-0.08829642832279205,
-0.09358804672956467,
-0.02194358967244625,
0.035389382392168045,
0.06210863217711449,
0.01213567890226841,
-0.021804139018058777,
0.03387929126620293,
0.10758136957883835,
-0.0379289835691452,
-0.05381925031542778,
-0.12506003677845,
-0.03407704830169678,
-0.14940041303634644,
-0.05430426821112633,
-0.004751571454107761,
0.027455084025859833,
0.018783649429678917,
0.1810159534215927,
-0.059917598962783813,
0.05853566154837608,
0.03158444166183472,
-0.014332219026982784,
0.08908428251743317,
0.11539982259273529,
-0.02913600392639637,
-0.1354570984840393,
-0.05176878347992897,
-0.008537044748663902,
-0.023781904950737953,
-0.0182188693434,
-0.05768375098705292,
-0.030899910256266594,
0.0805586650967598,
0.13062860071659088,
0.047871872782707214,
0.02007170580327511,
-0.04304211586713791,
-0.010191816836595535,
0.1443461775779724,
-0.11091813445091248,
0.0925033688545227,
-0.04234965145587921,
-0.03571055084466934,
-0.04808733984827995,
0.025385065004229546,
0.008943883702158928,
-0.07408510148525238,
0.01920195110142231,
-0.02315683476626873,
-0.03596948832273483,
-0.08375079184770584,
-0.0110350102186203,
0.07613536715507507,
-0.06373105943202972,
0.006688018329441547,
-0.08489910513162613,
-0.15899859368801117,
-0.055077388882637024,
0.03454239293932915,
-0.051755744963884354,
-0.034462377429008484,
-0.017948215827345848,
-0.023699168115854263,
0.001915155560709536,
-0.01229051873087883,
0.08374376595020294,
-0.0026757409796118736,
0.03335138037800789,
-0.06164858862757683,
0.025467893108725548,
0.08619292825460434,
0.029723025858402252,
-0.06141303479671478,
0.028381602838635445,
-0.15121862292289734,
0.09376095980405807,
-0.1054142490029335,
0.011293105781078339,
-0.1523353010416031,
-0.03224954009056091,
0.036560699343681335,
0.03697733208537102,
0.027407729998230934,
0.1327904909849167,
-0.07754229754209518,
-0.013996013440191746,
0.10225937515497208,
-0.036215975880622864,
-0.06522149592638016,
0.12105462700128555,
-0.035941675305366516,
0.11569107323884964,
0.07463037222623825,
0.0037547280080616474,
0.0640454813838005,
-0.17645877599716187,
-0.0433475635945797,
0.023479849100112915,
-0.060418300330638885,
0.10484011471271515,
0.07426167279481888,
-0.02030154876410961,
0.07503299415111542,
0.02032659947872162,
-0.029267631471157074,
-0.05200657993555069,
-0.02504722960293293,
-0.09678731858730316,
-0.031927868723869324,
-0.051102228462696075,
0.03179740905761719,
-0.005603313911706209,
-0.06707144528627396,
-0.060824740678071976,
-0.187669038772583,
0.020748145878314972,
0.12689527869224548,
0.012541668489575386,
0.011054173111915588,
-0.10740691423416138,
0.004176408983767033,
0.034909214824438095,
-0.0046103354543447495,
-0.14069032669067383,
-0.00431777723133564,
0.056857168674468994,
-0.10648113489151001,
0.05025571957230568,
-0.03462010249495506,
0.06051858514547348,
0.006036085542291403,
-0.0680701807141304,
-0.02634497545659542,
-0.019235558807849884,
0.009460416622459888,
-0.07117906212806702,
-0.11610431969165802,
-0.03277750685811043,
-0.04388526827096939,
0.2112952172756195,
-0.12126796692609787,
0.04313129931688309,
0.0626995861530304,
0.1256198287010193,
0.011433769017457962,
-0.055194344371557236,
0.016663111746311188,
-0.031151680275797844,
-0.002659493824467063,
-0.08302509784698486,
0.004446003586053848,
-0.014492310583591461,
-0.04474949836730957,
-0.02255360037088394,
-0.10376900434494019,
-0.017163317650556564,
0.0731305256485939,
0.15913519263267517,
-0.19249294698238373,
-0.014328139834105968,
0.0017927653389051557,
-0.029868295416235924,
-0.10645541548728943,
-0.0953587219119072,
0.24492740631103516,
0.005670598242431879,
0.0332133024930954,
-0.04359705001115799,
-0.1132102906703949,
0.003581769298762083,
0.037799883633852005,
-0.07269764691591263,
0.07652507722377777,
-0.043432123959064484,
-0.1491207480430603,
0.10917048901319504,
0.043620362877845764,
0.03645037114620209,
0.05336499959230423,
0.016483334824442863,
-0.0802956372499466,
-0.04476246237754822,
0.00004390976027934812,
-0.009993740357458591,
0.13039658963680267,
-0.06985912472009659,
-0.006178709212690592,
0.02542618289589882,
-0.032929565757513046,
0.007178600877523422,
-0.03932173177599907,
0.01638251729309559,
0.06811637431383133,
-0.0011772667057812214,
0.0014908850425854325,
-0.06170616298913956,
0.0010844409698620439,
0.06879712641239166,
0.05046208202838898,
0.050905924290418625,
0.025370854884386063,
-0.002573346719145775,
-0.1285395622253418,
0.16286008059978485,
-0.10391441732645035,
-0.3120482861995697,
-0.131933331489563,
0.09269194304943085,
0.08432027697563171,
-0.03323467820882797,
0.032138895243406296,
-0.089966781437397,
-0.0592474639415741,
-0.10947879403829575,
0.042316555976867676,
0.01192743331193924,
-0.10552362352609634,
-0.003169245319440961,
0.003573175286874175,
0.05003858730196953,
-0.11791860312223434,
0.04740788787603378,
0.07921769469976425,
-0.0699605792760849,
-0.011639238335192204,
0.024076957255601883,
0.1298760622739792,
0.06898602843284607,
-0.050798311829566956,
-0.031318239867687225,
-0.00233887298963964,
0.16655054688453674,
-0.16205641627311707,
0.12482652068138123,
0.1085124984383583,
-0.024501126259565353,
0.05872434377670288,
0.05410230904817581,
-0.017650261521339417,
-0.04075411334633827,
0.021195709705352783,
0.04528024420142174,
-0.04194296523928642,
-0.26840823888778687,
-0.034853219985961914,
0.018966300413012505,
-0.05687236413359642,
0.0819653645157814,
0.057529982179403305,
0.14241230487823486,
0.07835199683904648,
-0.10474297404289246,
-0.04208921268582344,
0.017117194831371307,
0.09444062411785126,
-0.10494501143693924,
-0.03242439404129982,
0.009059564210474491,
-0.03191963583230972,
-0.02525697462260723,
0.09653500467538834,
0.05917905643582344,
0.10923782736063004,
0.02537759579718113,
0.1252867877483368,
-0.018414663150906563,
0.04830755665898323,
-0.05421231687068939,
0.015967367216944695,
-0.007447865325957537,
0.022031443193554878,
-0.022034429013729095,
-0.021896498277783394,
0.005538701545447111,
0.1009833961725235,
0.11292093992233276,
-0.01824556104838848,
-0.0579705536365509,
-0.06165417283773422,
0.06758558750152588,
0.2260158807039261,
-0.0317172035574913,
-0.16393598914146423,
-0.0636121854186058,
0.07321063429117203,
-0.038199905306100845,
-0.1538512110710144,
-0.03800235316157341,
0.07108964025974274,
-0.20632709562778473,
0.04359973594546318,
-0.007285747677087784,
0.09069065004587173,
-0.04450313001871109,
-0.029985982924699783,
-0.029449351131916046,
0.088795967400074,
-0.02153261937201023,
0.0913282111287117,
-0.1806849241256714,
0.022028744220733643,
0.019203096628189087,
0.11460327357053757,
-0.105598583817482,
0.06338614225387573,
0.006486893631517887,
0.04414958506822586,
0.20901601016521454,
0.029283752664923668,
-0.10788016766309738,
-0.10652678459882736,
-0.04691148176789284,
-0.000004021804670628626,
0.0734938383102417,
-0.1080784872174263,
0.045165784657001495,
-0.008727602660655975,
-0.015080689452588558,
0.0033008793834596872,
-0.08453016728162766,
-0.1590123325586319,
-0.14188337326049805,
0.009018356911838055,
-0.11512334644794464,
0.07941379398107529,
-0.0797426775097847,
-0.03586648777127266,
-0.10344143956899643,
0.21370719373226166,
-0.12903174757957458,
-0.04893768951296806,
-0.13819216191768646,
-0.05627409741282463,
0.08926112204790115,
-0.06506019085645676,
0.033040035516023636,
0.03788956627249718,
0.19984373450279236,
-0.04553156718611717,
-0.06086210161447525,
0.05062330141663551,
-0.0680571123957634,
-0.12840092182159424,
-0.07628282159566879,
0.1109805554151535,
0.12499608099460602,
0.09490445256233215,
0.02333088405430317,
-0.022225726395845413,
0.00688424427062273,
-0.06089552491903305,
0.05102960765361786,
0.14963458478450775,
0.022156808525323868,
0.07837347686290741,
-0.03433549031615257,
-0.08661465346813202,
-0.0816747397184372,
-0.07143329083919525,
0.16404756903648376,
0.0966467410326004,
-0.10532646626234055,
0.1480090618133545,
0.18659354746341705,
-0.10828869789838791,
-0.2569974958896637,
0.07548101246356964,
0.011049461551010609,
0.02648516744375229,
0.02355586737394333,
-0.18564116954803467,
0.011168451979756355,
0.07086661458015442,
-0.006186951417475939,
0.06524977087974548,
-0.1760256141424179,
-0.128158301115036,
0.09618888050317764,
0.0030144008342176676,
-0.1407204121351242,
-0.12238411605358124,
-0.03169223293662071,
-0.08040677011013031,
-0.13306714594364166,
0.10366568714380264,
-0.09547532349824905,
0.034956928342580795,
0.010876533575356007,
0.07037950307130814,
0.021806087344884872,
-0.03863736614584923,
0.14693383872509003,
-0.0036571288947016,
0.0577872134745121,
-0.10382907092571259,
-0.05772251635789871,
0.0718463808298111,
-0.06892047077417374,
0.14114607870578766,
-0.028734970837831497,
0.030951492488384247,
-0.10405433177947998,
-0.012488750740885735,
-0.06747865676879883,
0.09385566413402557,
-0.10696378350257874,
-0.08489516377449036,
-0.11784108728170395,
0.10248950123786926,
0.07160741835832596,
-0.07358196377754211,
0.014346561394631863,
-0.025295399129390717,
0.11937718093395233,
0.14319488406181335,
0.15271121263504028,
0.0657803937792778,
-0.09946752339601517,
0.009225203655660152,
-0.04395478591322899,
0.09476017206907272,
-0.21406126022338867,
0.007120715919882059,
0.07531401515007019,
0.01747117005288601,
0.1379600167274475,
-0.06005563959479332,
-0.1941223442554474,
-0.0210978202521801,
-0.0023327546659857035,
-0.03725886344909668,
-0.2053467482328415,
-0.032727621495723724,
0.07013004273176193,
-0.1554391235113144,
-0.010380479507148266,
0.042275115847587585,
-0.04663373902440071,
-0.06776435673236847,
0.0009743095724843442,
0.08926058560609818,
-0.03344695270061493,
0.13159704208374023,
0.03516178950667381,
0.08163995295763016,
-0.0727812722325325,
0.11269672960042953,
0.06714607030153275,
-0.10076238214969635,
0.07863118499517441,
0.0715082660317421,
-0.04407351464033127,
-0.07581649720668793,
0.12052047997713089,
0.08931411057710648,
-0.02585577219724655,
-0.05884199216961861,
0.02745802514255047,
-0.14627958834171295,
0.06731347739696503,
0.05805805325508118,
0.015397613868117332,
-0.021365953609347343,
0.08629488199949265,
0.04077722132205963,
-0.0971883162856102,
0.044437531381845474,
-0.010260545648634434,
0.03042014315724373,
-0.05654619261622429,
0.02630596049129963,
0.03767997771501541,
0.053854696452617645,
-0.016199054196476936,
-0.038707032799720764,
-0.07859703153371811,
-0.002762584714218974,
-0.11940551549196243,
-0.028290968388319016,
-0.07372621446847916,
0.030874911695718765,
-0.002374033909291029,
-0.0318981409072876,
0.04653313383460045,
0.051804572343826294,
-0.038294468075037,
-0.07445507496595383,
-0.019487857818603516,
0.08782695978879929,
-0.11345847696065903,
-0.012361937202513218,
0.06769146770238876,
-0.07611702382564545,
0.10238676518201828,
0.012289578095078468,
-0.010651572607457638,
-0.01714997924864292,
-0.20899996161460876,
0.07321657985448837,
-0.05535781383514404,
0.03116169013082981,
0.029624849557876587,
-0.18515969812870026,
-0.010718024335801601,
-0.04479428008198738,
-0.03263403847813606,
0.03160923719406128,
0.03218195214867592,
-0.06637211889028549,
0.0640377625823021,
-0.06395682692527771,
-0.05234668776392937,
-0.0628780871629715,
0.07648852467536926,
0.12231733649969101,
-0.03193451464176178,
0.12471652030944824,
-0.016166139394044876,
0.07229600101709366,
-0.16753926873207092,
-0.006639906205236912,
0.01712319441139698,
0.006246314384043217,
0.042361292988061905,
-0.009845921769738197,
0.062404800206422806,
0.0005552734946832061,
0.16261951625347137,
-0.04413415491580963,
0.08840996026992798,
0.05604204908013344,
0.0013310856884345412,
-0.04473279044032097,
0.04395923390984535,
-0.013917648233473301,
-0.023090768605470657,
-0.023834217339754105,
-0.0039923894219100475,
-0.06736715883016586,
0.014862500131130219,
-0.07954332232475281,
0.0264121200889349,
0.16280129551887512,
0.11363058537244797,
0.041708797216415405,
0.09267366677522659,
-0.05127490684390068,
-0.03570253774523735,
0.024108994752168655,
-0.04107259586453438,
-0.011052506044507027,
-0.06012016162276268,
0.05411152169108391,
0.12025535106658936,
-0.15952174365520477,
0.11302032321691513,
-0.12589766085147858,
-0.05887030437588692,
-0.014460889622569084,
-0.11080802977085114,
-0.06246388331055641,
0.00869630929082632,
0.001742084976285696,
-0.05145232379436493,
0.027224989607930183,
0.04961862042546272,
-0.012590790167450905,
-0.026918835937976837,
0.1363726407289505,
-0.04613955318927765,
-0.051117751747369766,
0.0911555141210556,
0.038012050092220306,
0.005742667708545923,
-0.04339595139026642,
0.035278767347335815,
0.06631956249475479,
0.10853207856416702,
0.09602728486061096,
0.02067112736403942,
0.009892274625599384,
0.022551678121089935,
-0.04078718274831772,
-0.07198286801576614,
0.02709456905722618,
-0.006197551265358925,
-0.05260705575346947,
0.11382336914539337,
0.03797705098986626,
0.030758121982216835,
-0.015672847628593445,
0.17593331634998322,
-0.055744342505931854,
-0.06564796715974808,
-0.14914153516292572,
0.09768527746200562,
-0.00841153971850872,
0.0024719038046896458,
0.07245644927024841,
-0.12071030586957932,
0.028972823172807693,
0.13875728845596313,
0.16873809695243835,
0.02633284591138363,
0.011301531456410885,
0.009262881241738796,
0.005330209620296955,
-0.008421806618571281,
0.07889614999294281,
0.07940897345542908,
0.07040020823478699,
-0.09657873213291168,
0.18445539474487305,
0.02805156260728836,
-0.05419258028268814,
-0.02887292020022869,
0.12904271483421326,
-0.06048966571688652,
0.028222721070051193,
-0.031768202781677246,
0.09606362134218216,
-0.08108349144458771,
-0.30053165555000305,
0.058203112334012985,
-0.10932629555463791,
-0.14321444928646088,
0.008561864495277405,
0.11298732459545135,
0.023567957803606987,
0.05049588158726692,
0.07776359468698502,
0.008068669587373734,
0.13954122364521027,
0.02638119086623192,
-0.06743886321783066,
-0.0948714092373848,
0.055545151233673096,
-0.0871192216873169,
0.2791692018508911,
0.020469825714826584,
0.05811791494488716,
0.10426519811153412,
-0.03405000641942024,
-0.1447812169790268,
-0.002109681721776724,
0.09005925804376602,
-0.030738089233636856,
0.07384078949689865,
0.1828669309616089,
-0.024110302329063416,
0.1073928028345108,
0.07584892213344574,
-0.0062444512732326984,
0.05020251125097275,
-0.08060663193464279,
0.007957438938319683,
-0.09219589829444885,
0.10723020136356354,
-0.06040973961353302,
0.1500519961118698,
0.11829864233732224,
-0.042364176362752914,
-0.03819213807582855,
-0.08013764023780823,
0.00007352273678407073,
0.0142035698518157,
0.09581119567155838,
-0.010142004117369652,
-0.1316019743680954,
0.021809047088027,
-0.026267975568771362,
0.12747333943843842,
-0.23320063948631287,
-0.09089693427085876,
0.04701391980051994,
-0.03567539528012276,
-0.02748147025704384,
0.10670952498912811,
0.07951890677213669,
0.02795129269361496,
-0.01518607884645462,
-0.15165363252162933,
-0.002643328160047531,
0.150542214512825,
-0.10917068272829056,
0.024225939065217972
] |
null | null | transformers |
# CodeTrans model for api recommendation generation
Pretrained model for api recommendation generation using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used single-task training on the API Recommendation Generation dataset.
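To see the custom SentencePiece vocabulary in action, one can inspect the tokenizer directly. This is a small illustration; the printed values are not documented in this card.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_api_generation")
print(tokenizer.vocab_size)  # size of the model's own SentencePiece vocabulary
print(tokenizer.tokenize("parse the uses licence node of this package"))
```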
## Intended uses & limitations
The model can be used to generate API usage recommendations for Java programming tasks.
### How to use
Here is how to use this model to generate Java API usage recommendations with the Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# Wrap the fine-tuned encoder-decoder checkpoint in a summarization pipeline;
# API recommendation is framed here as sequence-to-sequence generation.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_api_generation"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_api_generation", skip_special_tokens=True),
    device=0  # first GPU; use device=-1 to run on CPU
)

# Input: a pre-tokenized natural-language description of the desired functionality.
tokenized_code = "parse the uses licence node of this package , if any , and returns the license definition if theres"
pipeline([tokenized_code])
```
Run this example in the accompanying [Colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/api%20generation/base_model.ipynb).
## Training data
The datasets for the supervised training tasks can be downloaded at [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the API recommendation generation task, the models achieve the following results on the Java dataset (in BLEU score):

Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 68.71 |
| CodeTrans-ST-Base | 70.45 |
| CodeTrans-TF-Small | 68.90 |
| CodeTrans-TF-Base | 72.11 |
| CodeTrans-TF-Large | 73.26 |
| CodeTrans-MT-Small | 58.43 |
| CodeTrans-MT-Base | 67.97 |
| CodeTrans-MT-Large | 72.29 |
| CodeTrans-MT-TF-Small | 69.29 |
| CodeTrans-MT-TF-Base | 72.89 |
| CodeTrans-MT-TF-Large | **73.39** |
| State of the art | 54.42 |
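The card does not publish its evaluation script, so the snippet below is only a rough illustration of how a corpus-level BLEU score over generated API sequences could be computed with `sacrebleu`; the prediction and reference strings are invented placeholders.

```python
import sacrebleu  # pip install sacrebleu

# Invented placeholder outputs; the real test predictions and references are not shown here.
predictions = ["XmlPullParser.next XmlPullParser.getEventType"]
references = [["XmlPullParser.getEventType XmlPullParser.next XmlPullParser.getName"]]

# `references` is a list of reference streams, each parallel to `predictions`.
score = sacrebleu.corpus_bleu(predictions, references)
print(f"BLEU = {score.score:.2f}")
```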
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "parse the uses licence node of this package , if any , and returns the license definition if theres"}]} | summarization | SEBIS/code_trans_t5_base_api_generation | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us
| CodeTrans model for api recommendation generation
=================================================
Pretrained model for api recommendation generation using the t5 base model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used single-task training on Api Recommendation Generation dataset.
Intended uses & limitations
---------------------------
The model could be used to generate api usage for the java programming tasks.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Evaluation results
------------------
For the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
50,
112
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.08324263244867325,
-0.004070251248776913,
0.00038814687286503613,
0.06138232350349426,
0.10570977628231049,
-0.00214300281368196,
0.06543456017971039,
0.06939619779586792,
0.0037551831919699907,
-0.028418060392141342,
0.09614667296409607,
0.11167153716087341,
0.010882088914513588,
0.14677266776561737,
-0.03842217102646828,
-0.19013401865959167,
0.02206922322511673,
0.061821144074201584,
-0.15624167025089264,
0.13397887349128723,
0.12518222630023956,
-0.06335568428039551,
0.10122019797563553,
-0.0069983652792871,
-0.23451288044452667,
0.05959648638963699,
-0.024048948660492897,
-0.08949357271194458,
0.12055405229330063,
0.0737130418419838,
0.12061606347560883,
0.057921018451452255,
-0.008416212163865566,
-0.19662408530712128,
0.0364326573908329,
-0.02769639901816845,
0.01060265488922596,
0.05913905054330826,
0.020181482657790184,
-0.032058097422122955,
0.18866296112537384,
-0.017948880791664124,
0.00921032764017582,
0.047514207661151886,
-0.11550305783748627,
-0.1188117191195488,
-0.02771143428981304,
-0.023855770006775856,
0.06153389438986778,
0.061456888914108276,
0.005410714540630579,
0.13913284242153168,
-0.1553138792514801,
0.1182309091091156,
0.11989366263151169,
-0.17795000970363617,
-0.026062561199069023,
0.16400906443595886,
0.12122268974781036,
-0.032646238803863525,
-0.049844443798065186,
0.0166056789457798,
0.08851403743028641,
0.00976135116070509,
0.052823033183813095,
-0.13563697040081024,
-0.2031160444021225,
0.08723844587802887,
-0.07578927278518677,
-0.057329509407281876,
0.28963616490364075,
-0.005329827778041363,
-0.0299457386136055,
-0.0573122464120388,
-0.027091560885310173,
0.009223492816090584,
0.007291139103472233,
-0.02238025888800621,
0.012060923501849174,
-0.014844170771539211,
-0.016681835055351257,
-0.00020768323156517,
-0.10158611088991165,
-0.1239895299077034,
-0.0041587213054299355,
0.08799692243337631,
-0.013186436146497726,
0.023433709517121315,
-0.14662966132164001,
0.09300551563501358,
0.06412997096776962,
-0.09124398231506348,
0.012502768076956272,
-0.07143043726682663,
-0.03469303250312805,
-0.008505473844707012,
-0.07172199338674545,
-0.15842033922672272,
0.09216249734163284,
0.03449004143476486,
-0.0680885910987854,
0.04191809892654419,
0.018419645726680756,
0.07725272327661514,
0.047986146062612534,
0.18654771149158478,
-0.024543290957808495,
-0.08087380975484848,
0.031518761068582535,
-0.014952402561903,
-0.05046119913458824,
0.00026329176034778357,
-0.07167156785726547,
-0.051617104560136795,
0.031031154096126556,
0.11635664105415344,
-0.0928015410900116,
0.08223643153905869,
-0.07041355967521667,
-0.023609992116689682,
0.006741930264979601,
-0.13325969874858856,
-0.010239783674478531,
0.005560519639402628,
-0.05945007875561714,
-0.06519802659749985,
0.12180905044078827,
-0.059520088136196136,
-0.0800459012389183,
-0.04433702677488327,
-0.08605936169624329,
-0.004595561884343624,
-0.11189323663711548,
-0.1085159033536911,
0.02192148007452488,
0.055193666368722916,
0.05480029806494713,
-0.12390002608299255,
-0.14204095304012299,
-0.005378330126404762,
0.07892036437988281,
-0.008374981582164764,
0.04476483538746834,
-0.07617569714784622,
-0.043316490948200226,
-0.018842380493879318,
-0.020081497728824615,
0.06799009442329407,
-0.07089004665613174,
0.07739013433456421,
0.07835198938846588,
0.07562430948019028,
-0.0523725263774395,
0.052867740392684937,
-0.13221079111099243,
0.07885269075632095,
-0.1788279265165329,
0.08775825798511505,
-0.07012183219194412,
0.11211884766817093,
-0.10910438746213913,
-0.06564008444547653,
0.03006763383746147,
0.06982520967721939,
0.057183053344488144,
0.12140583992004395,
-0.15898442268371582,
-0.0340394601225853,
0.12842588126659393,
-0.106868676841259,
-0.21419210731983185,
0.06708048284053802,
-0.07621818035840988,
0.21003566682338715,
0.033839885145425797,
0.2056177407503128,
0.13101589679718018,
-0.0367656834423542,
0.05315421521663666,
0.07667063176631927,
-0.03278043493628502,
-0.05908573791384697,
0.07205846160650253,
0.08230853080749512,
-0.127701073884964,
0.060868412256240845,
-0.03216350078582764,
0.102208711206913,
-0.03329626843333244,
-0.04616187885403633,
-0.026363441720604897,
-0.05877934768795967,
0.05886329337954521,
-0.003976356703788042,
0.08789106458425522,
-0.02085822820663452,
-0.018704542890191078,
0.08773019164800644,
0.10466361045837402,
-0.12310285121202469,
-0.00010586480493657291,
-0.09225597977638245,
0.05128577724099159,
-0.11852686107158661,
0.03140004724264145,
-0.19879022240638733,
0.013699658215045929,
0.01344621367752552,
0.014935880899429321,
0.029251432046294212,
0.09785445034503937,
0.009278794750571251,
-0.006828873418271542,
0.01224474422633648,
0.008912199176847935,
0.01090695708990097,
-0.003316863439977169,
-0.03132149204611778,
-0.09683115780353546,
-0.05360179767012596,
-0.057624559849500656,
0.0013579025398939848,
-0.18764513731002808,
-0.00016014142602216452,
0.05528280511498451,
0.06397741287946701,
0.02775164321064949,
0.03864925727248192,
0.047821465879678726,
0.06786084175109863,
-0.05639050528407097,
-0.01470872014760971,
0.0555652491748333,
0.009204832836985588,
-0.10325878858566284,
0.09831970930099487,
-0.07822683453559875,
0.05902823433279991,
0.11524631083011627,
-0.14851944148540497,
-0.05473259091377258,
-0.02173418365418911,
-0.03511161357164383,
-0.026943573728203773,
0.009011228568851948,
-0.029481299221515656,
0.1535685956478119,
-0.015121810138225555,
0.16194337606430054,
-0.12519562244415283,
-0.05137786641716957,
-0.03172317519783974,
-0.009515902027487755,
0.015146544203162193,
0.14278119802474976,
0.05385401099920273,
-0.1853342205286026,
0.06815193593502045,
0.118377685546875,
-0.03821629285812378,
0.2148071676492691,
-0.035816650837659836,
-0.03280601650476456,
-0.036387667059898376,
0.06603355705738068,
-0.04132546856999397,
0.1548980325460434,
-0.21284356713294983,
-0.04126156494021416,
0.026945512741804123,
-0.000281838933005929,
0.10251704603433609,
-0.11762447655200958,
-0.010702055878937244,
0.0293254591524601,
-0.0341787114739418,
-0.09192598611116409,
0.07056131958961487,
0.007885019294917583,
0.032402824610471725,
-0.0024086260236799717,
-0.022405967116355896,
0.026776621118187904,
-0.03033292107284069,
-0.10950601100921631,
0.22230041027069092,
-0.08111561089754105,
-0.2643514573574066,
-0.17071300745010376,
0.08628562837839127,
-0.015054691582918167,
-0.010926216840744019,
0.06477008759975433,
-0.048831112682819366,
-0.041648849844932556,
-0.06025974825024605,
0.10967293381690979,
-0.020720234140753746,
-0.03521813452243805,
-0.023208115249872208,
0.0690203532576561,
0.0057252999395132065,
-0.19853168725967407,
-0.00830833613872528,
0.016485240310430527,
0.06813937425613403,
0.022319084033370018,
-0.12452998012304306,
0.11111390590667725,
0.09006965160369873,
-0.06621673703193665,
0.04468619450926781,
-0.02304990030825138,
0.2224038988351822,
-0.06703716516494751,
-0.06028791889548302,
0.15392008423805237,
-0.07441077381372452,
-0.010606183670461178,
0.04681631922721863,
-0.000027798923838417977,
-0.10740877687931061,
0.040660396218299866,
-0.04255986213684082,
-0.07002836465835571,
-0.2257843017578125,
-0.09102782607078552,
-0.09423563629388809,
0.0986279547214508,
0.017693310976028442,
0.026880599558353424,
-0.07732192426919937,
0.05379262566566467,
0.09233991801738739,
0.12845897674560547,
-0.007820608094334602,
0.06200823560357094,
0.07348491251468658,
-0.011977491900324821,
0.027820026502013206,
-0.10556742548942566,
-0.0508660227060318,
0.029683802276849747,
0.08986565470695496,
0.18562248349189758,
-0.010195715352892876,
0.14880971610546112,
0.08113619685173035,
0.06263759732246399,
0.04926365613937378,
0.14957259595394135,
-0.10608866065740585,
0.004389932844787836,
-0.017821358516812325,
-0.05227142199873924,
-0.12518317997455597,
0.03967566415667534,
-0.04538029432296753,
0.04948651045560837,
-0.11254715919494629,
-0.08325286954641342,
0.06611679494380951,
0.07081519812345505,
0.016276398673653603,
-0.2577133774757385,
-0.1055038720369339,
0.03905998542904854,
-0.07133249193429947,
-0.050707120448350906,
0.04484976828098297,
0.18259352445602417,
-0.10890360921621323,
-0.023267146199941635,
-0.0389319583773613,
0.15090158581733704,
-0.02814292535185814,
0.030184421688318253,
-0.0672115907073021,
-0.04860861226916313,
0.01095745898783207,
0.15446434915065765,
-0.2118285596370697,
0.23256941139698029,
-0.0041998522356152534,
-0.0009978357702493668,
-0.061705995351076126,
0.022666551172733307,
0.012919381260871887,
0.09586676955223083,
0.10113215446472168,
-0.014904026873409748,
-0.05976327508687973,
-0.14741340279579163,
0.049015287309885025,
0.08270814269781113,
0.04081719368696213,
-0.010046271607279778,
0.05897589772939682,
-0.024933159351348877,
0.02503201551735401,
0.0010881100315600634,
-0.001088171498849988,
-0.09750006347894669,
-0.08468786627054214,
0.005541081074625254,
-0.01885855570435524,
0.056186240166425705,
-0.03613563999533653,
0.0027638154570013285,
0.04572644457221031,
0.17418573796749115,
-0.03205316141247749,
-0.04348299652338028,
-0.09856298565864563,
0.021430136635899544,
0.13910363614559174,
-0.06816495954990387,
-0.00387842976488173,
0.001096487627364695,
0.04250676929950714,
-0.004151241388171911,
-0.1264544278383255,
0.06250254064798355,
-0.05781340226531029,
-0.008498750627040863,
-0.02436739020049572,
0.07310183346271515,
-0.03082187846302986,
0.0007343590259552002,
0.06359131634235382,
-0.038873590528964996,
-0.07128257304430008,
-0.13905847072601318,
-0.1145726665854454,
-0.0489964559674263,
0.05279383435845375,
0.028224684298038483,
-0.12704616785049438,
0.026226822286844254,
0.024142801761627197,
-0.024342387914657593,
0.2010313719511032,
0.1138060912489891,
-0.06108497083187103,
0.031193023547530174,
0.11747970432043076,
-0.09020669758319855,
-0.2604810297489166,
0.020748570561408997,
-0.04031044989824295,
0.09379842132329941,
0.03443167731165886,
-0.0948934331536293,
0.08279088884592056,
-0.02097368985414505,
0.02303592674434185,
-0.006243863608688116,
-0.2687211036682129,
-0.10980364680290222,
0.09777793288230896,
0.11711502075195312,
0.10535488277673721,
-0.1252315789461136,
-0.05247560143470764,
-0.08210138976573944,
-0.23017378151416779,
0.16495269536972046,
-0.11555499583482742,
0.09511632472276688,
-0.018534544855356216,
0.05459147319197655,
0.029432857409119606,
-0.058209143579006195,
0.13169172406196594,
-0.0024814645294100046,
0.10167817026376724,
-0.03399314358830452,
-0.11425499618053436,
0.11104530096054077,
-0.045805178582668304,
0.1533593386411667,
-0.1304854452610016,
0.08456820249557495,
-0.21703508496284485,
-0.02483569085597992,
-0.05184558033943176,
0.054875265806913376,
-0.014445072039961815,
-0.06427053362131119,
-0.08045324683189392,
0.03214627131819725,
0.034164171665906906,
0.012580733746290207,
0.12568911910057068,
-0.05697864666581154,
0.03116176277399063,
0.14847253262996674,
0.12784132361412048,
-0.05700111389160156,
0.002820485970005393,
0.07366670668125153,
0.027577554807066917,
0.11544498801231384,
-0.26873457431793213,
0.08345896005630493,
0.11234266310930252,
0.01776812970638275,
0.12266835570335388,
0.07542185485363007,
-0.03277035057544708,
0.028545862063765526,
0.09417746961116791,
-0.12660637497901917,
-0.06452908366918564,
-0.051118165254592896,
-0.05959179997444153,
-0.00029184468439780176,
0.05680744722485542,
0.13098016381263733,
-0.08187997341156006,
-0.006102823186665773,
0.00199095718562603,
-0.02507728524506092,
-0.13803023099899292,
0.11728601902723312,
0.046587299555540085,
0.0808941051363945,
-0.07884790748357773,
0.05905258283019066,
0.0596618726849556,
-0.14243322610855103,
-0.025602731853723526,
0.10799568891525269,
-0.12410534918308258,
-0.0867147445678711,
-0.01785833202302456,
0.26320141553878784,
-0.07899744808673859,
-0.07454913854598999,
-0.13451777398586273,
-0.07190392911434174,
-0.001673417049460113,
0.21376626193523407,
0.11214165389537811,
0.10640472173690796,
-0.030333803966641426,
-0.01893816702067852,
-0.09264447540044785,
0.07119004428386688,
0.08117904514074326,
0.02277998998761177,
-0.09168678522109985,
0.09612902253866196,
-0.008878530003130436,
0.1370653361082077,
-0.06126127392053604,
-0.032241810113191605,
-0.1813666671514511,
0.06231960654258728,
-0.1488129198551178,
0.0460527166724205,
-0.06871574372053146,
0.02069118618965149,
0.013900473713874817,
-0.011485165916383266,
-0.048052847385406494,
0.05125883221626282,
-0.09656893461942673,
0.015430111438035965,
-0.006139521021395922,
0.08859269320964813,
-0.07740944623947144,
-0.018000110983848572,
0.08416011929512024,
-0.05964435264468193,
0.08499456942081451,
-0.006227487698197365,
-0.06419218331575394,
0.08903088420629501,
-0.16720156371593475,
-0.021041367202997208,
0.04022087901830673,
0.010995736345648766,
0.06559747457504272,
-0.07127916812896729,
0.03693410009145737,
0.031073426827788353,
0.04714514687657356,
-0.0008464090060442686,
0.09996414184570312,
-0.1177210733294487,
-0.09408018738031387,
-0.05304785072803497,
-0.10954617708921432,
-0.03176761418581009,
0.045605048537254333,
0.045887935906648636,
0.09476906061172485,
0.08594691008329391,
-0.016034448519349098,
0.016325464472174644,
-0.08305704593658447,
-0.024516133591532707,
0.030026400461792946,
-0.0947980284690857,
-0.04886622726917267,
-0.08733359724283218,
0.024333827197551727,
-0.06732813268899918,
0.19909782707691193,
0.023188186809420586,
0.09830953925848007,
-0.007578644435852766,
-0.04344802722334862,
0.020655160769820213,
0.0560508593916893,
0.23240011930465698,
-0.03685067221522331,
0.059108342975378036,
-0.0585576668381691,
0.07459370046854019,
0.02477855607867241,
0.061425793915987015,
0.07562960684299469,
0.1522740125656128,
-0.029134660959243774,
0.11874938756227493,
0.019457601010799408,
0.024774551391601562,
-0.03486591577529907,
-0.052793949842453,
0.0605367012321949,
0.07479812949895859,
-0.05918506160378456,
0.09578273445367813,
0.12575297057628632,
-0.11985420435667038,
0.10218653827905655,
-0.012333677150309086,
-0.09626300632953644,
-0.03785078600049019,
-0.012896058149635792,
-0.05503840744495392,
-0.15851636230945587,
-0.004376606550067663,
-0.12275973707437515,
-0.03768715634942055,
0.06342015415430069,
0.027687029913067818,
-0.04799456521868706,
0.18692682683467865,
-0.009998148307204247,
-0.067192442715168,
0.05652279034256935,
-0.016440389677882195,
0.016737883910536766,
-0.023144636303186417,
0.07294473797082901,
0.006284869741648436,
-0.0166859719902277,
-0.008520363830029964,
0.03976834937930107,
-0.037855155766010284,
-0.009456876665353775,
-0.0788024440407753,
-0.0367996022105217,
-0.04167599976062775,
0.04062945023179054,
0.012973221950232983,
0.030470434576272964,
0.028515493497252464,
-0.037465404719114304,
-0.0023931963369250298,
0.23931947350502014,
-0.0459323488175869,
-0.0696059986948967,
-0.1470162570476532,
0.15095697343349457,
0.03770335763692856,
0.06073635816574097,
-0.002406149171292782,
-0.06202765554189682,
-0.04462224990129471,
0.2583902180194855,
0.20976389944553375,
-0.05042366683483124,
0.009776444174349308,
0.010107001289725304,
0.01618148945271969,
-0.009413979947566986,
0.1244063451886177,
0.03383458033204079,
0.2001720815896988,
-0.025167668238282204,
-0.061839811503887177,
-0.0516248494386673,
-0.05845838412642479,
0.031295888125896454,
0.11450695991516113,
0.02356274239718914,
-0.06604962050914764,
-0.03988160192966461,
0.07074405252933502,
-0.1282726228237152,
-0.1030551940202713,
0.047241903841495514,
-0.18587397038936615,
-0.08717001229524612,
-0.0687519907951355,
0.05947877839207649,
-0.05124003812670708,
0.046731360256671906,
-0.0430382564663887,
-0.005863628350198269,
0.03998279944062233,
0.025848684832453728,
-0.10632014274597168,
-0.10092949867248535,
0.057364579290151596,
-0.03469790890812874,
0.11320094764232635,
-0.031604520976543427,
0.10010737925767899,
0.111209936439991,
0.028191842138767242,
-0.04591188207268715,
0.0455232635140419,
0.0700485035777092,
0.02123195305466652,
0.05243566632270813,
0.05655156448483467,
-0.02615189552307129,
0.14874504506587982,
-0.03269147500395775,
-0.12373363226652145,
0.03582350164651871,
-0.03702682629227638,
0.0022656687069684267,
-0.1224040612578392,
-0.04427698627114296,
-0.08041687309741974,
0.09226454049348831,
0.1724868267774582,
-0.04678290709853172,
0.014694984070956707,
-0.0756988376379013,
0.1324695646762848,
-0.00181778974365443,
-0.009537314996123314,
-0.07610877603292465,
-0.1433912217617035,
-0.01026365626603365,
0.026862913742661476,
-0.02212177775800228,
-0.22619810700416565,
-0.012985149398446083,
-0.04320460185408592,
-0.014140601269900799,
-0.017843974754214287,
0.12053798884153366,
0.1434406191110611,
0.03890572860836983,
-0.027983708307147026,
-0.20331056416034698,
-0.0065091634169220924,
0.07350334525108337,
-0.1020413488149643,
-0.13709643483161926
] |
null | null | transformers |
# CodeTrans model for api recommendation generation
Pretrained model for api recommendation generation using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model can be used to generate API usage recommendations for Java programming tasks.
### How to use
Here is how to use this model to generate Java API usage recommendations with the Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# Same sequence-to-sequence pipeline as the single-task model, pointed at the multitask checkpoint.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_api_generation_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_api_generation_multitask", skip_special_tokens=True),
    device=0  # first GPU; use device=-1 to run on CPU
)

# Input: a pre-tokenized natural-language description of the desired functionality.
tokenized_code = "parse the uses licence node of this package , if any , and returns the license definition if theres"
pipeline([tokenized_code])
```
Run this example in the accompanying [Colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/api%20generation/base_model.ipynb).
## Training data
The datasets for the supervised training tasks can be downloaded at [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 480,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
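For reference, the inverse square root schedule commonly paired with AdaFactor in the T5 family keeps the learning rate flat through warmup and then decays it with the square root of the step count. The warmup length below is an illustrative assumption; this card does not state the value used for CodeTrans.

```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # T5-style schedule: lr = 1 / sqrt(max(step, warmup_steps)).
    # Flat at 1 / sqrt(warmup_steps) during warmup, then ~1 / sqrt(step) decay.
    return 1.0 / max(step, warmup_steps) ** 0.5

print(inverse_sqrt_lr(1))        # warmup plateau: 0.01
print(inverse_sqrt_lr(480_000))  # near the end of the 480k-step run: ~0.0014
```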
## Evaluation results
For the API recommendation generation task, the models achieve the following results on the Java dataset (in BLEU score):

Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 68.71 |
| CodeTrans-ST-Base | 70.45 |
| CodeTrans-TF-Small | 68.90 |
| CodeTrans-TF-Base | 72.11 |
| CodeTrans-TF-Large | 73.26 |
| CodeTrans-MT-Small | 58.43 |
| CodeTrans-MT-Base | 67.97 |
| CodeTrans-MT-Large | 72.29 |
| CodeTrans-MT-TF-Small | 69.29 |
| CodeTrans-MT-TF-Base | 72.89 |
| CodeTrans-MT-TF-Large | **73.39** |
| State of the art | 54.42 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "parse the uses licence node of this package , if any , and returns the license definition if theres"}]} | summarization | SEBIS/code_trans_t5_base_api_generation_multitask | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us
| CodeTrans model for api recommendation generation
=================================================
Pretrained model for api recommendation generation using the t5 base model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate api usage for the java programming tasks.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 480,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 480,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 480,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
50,
61,
143
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 480,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.12256897240877151,
-0.015976449474692345,
0.000032305484637618065,
0.14250300824642181,
0.10258165746927261,
0.012976809404790401,
0.06174398586153984,
0.07078048586845398,
-0.04345369338989258,
0.018698954954743385,
0.045772239565849304,
0.0211018193513155,
0.027914123609662056,
0.18603473901748657,
0.0077232192270457745,
-0.16173776984214783,
0.008575989864766598,
0.02495438978075981,
-0.05349770560860634,
0.11569982022047043,
0.10738357156515121,
-0.07696033269166946,
0.04656889662146568,
-0.04059641808271408,
-0.2378549724817276,
0.05867154896259308,
-0.013050624169409275,
-0.07451783120632172,
0.10561887919902802,
0.055464327335357666,
0.1266179382801056,
-0.0014985054731369019,
0.03744729235768318,
-0.11534920334815979,
0.006338912062346935,
0.030196359381079674,
0.03128744661808014,
0.02660209685564041,
0.06019331514835358,
0.0585920549929142,
0.15607768297195435,
0.002792404731735587,
0.0453871414065361,
0.05523095652461052,
-0.07603674381971359,
-0.15032494068145752,
-0.015520055778324604,
0.01627032645046711,
0.05294625461101532,
0.08847799897193909,
-0.01092330738902092,
0.127650186419487,
-0.1397540271282196,
0.12460733950138092,
0.11421128362417221,
-0.24017205834388733,
-0.015294607728719711,
0.12140544503927231,
0.09485305845737457,
0.09519573301076889,
-0.04207392781972885,
-0.04653184488415718,
0.0930064395070076,
0.0539315827190876,
0.06021875888109207,
-0.07975451648235321,
-0.1073530837893486,
0.03558693081140518,
-0.09423045068979263,
-0.06988690048456192,
0.24026696383953094,
0.0014920108951628208,
-0.0618584081530571,
-0.0833420529961586,
-0.04124949499964714,
-0.1394696682691574,
0.03220966458320618,
0.02616213448345661,
0.0021559461019933224,
-0.02818230725824833,
0.04523622989654541,
0.034832436591386795,
-0.09833836555480957,
-0.13939666748046875,
0.022608423605561256,
0.09161211550235748,
0.05442163348197937,
0.027164775878190994,
-0.08923038840293884,
0.11019455641508102,
0.037670835852622986,
-0.0686974823474884,
-0.016334274783730507,
-0.029659975320100784,
-0.11005210876464844,
0.03921840712428093,
-0.05916726216673851,
-0.19051702320575714,
0.00399256544187665,
-0.006488574203103781,
-0.03026563860476017,
0.0443386472761631,
0.0281368475407362,
0.030002405866980553,
0.03018997795879841,
0.19887255132198334,
0.011400168761610985,
-0.0939766988158226,
0.044414520263671875,
0.04168914258480072,
-0.05565522983670235,
-0.027105014771223068,
-0.06671284884214401,
-0.08959735184907913,
0.07410561293363571,
0.09586750715970993,
-0.13704314827919006,
0.04720623418688774,
-0.05950283259153366,
-0.03929324820637703,
0.024185363203287125,
-0.16549284756183624,
0.0005477384547702968,
0.0059184422716498375,
-0.053894542157649994,
-0.055192310363054276,
0.09114901721477509,
-0.1413186490535736,
-0.14698785543441772,
-0.0441853366792202,
-0.08198648691177368,
-0.05004376545548439,
-0.15492264926433563,
-0.15273986756801605,
-0.013459738343954086,
-0.038455355912446976,
0.011860562488436699,
-0.08444111049175262,
-0.1400805413722992,
-0.023772213608026505,
0.03207523003220558,
-0.002873649587854743,
0.005339242983609438,
-0.06195387244224548,
-0.016558468341827393,
-0.006988770328462124,
-0.03773823007941246,
0.0218957606703043,
-0.048723358660936356,
0.10028692334890366,
0.08622124791145325,
0.045347053557634354,
-0.0072351740673184395,
0.0543353445827961,
-0.061559293419122696,
0.061570122838020325,
-0.1267111450433731,
0.0919405147433281,
-0.07307878881692886,
0.08130999654531479,
-0.045844901353120804,
-0.10517151653766632,
0.0726039782166481,
0.06326272338628769,
0.05843840166926384,
0.033893972635269165,
-0.1556599885225296,
-0.014807317405939102,
0.1724151372909546,
-0.12600308656692505,
-0.12431518733501434,
0.10227809101343155,
-0.045910436660051346,
0.0868154838681221,
0.06094290688633919,
0.15250326693058014,
0.15167057514190674,
-0.04630204662680626,
0.03368215635418892,
0.041390303522348404,
0.05250486731529236,
-0.12466825544834137,
0.08307462930679321,
0.06325720250606537,
-0.07951284199953079,
0.06434807181358337,
-0.02995884045958519,
0.09113185107707977,
-0.007759633008390665,
-0.03527837246656418,
-0.05113429203629494,
-0.06251648813486099,
-0.003421568311750889,
0.003616802394390106,
0.07785855233669281,
-0.07324150949716568,
-0.06356687098741531,
0.07502666115760803,
0.165669783949852,
-0.13419879972934723,
-0.003716627135872841,
-0.0746791735291481,
0.03768071532249451,
-0.08891879767179489,
0.020177071914076805,
-0.16966307163238525,
0.034846678376197815,
0.07773864269256592,
-0.033663295209407806,
0.04352670535445213,
0.14302164316177368,
0.015361363999545574,
0.05243430286645889,
0.0061717345379292965,
-0.011644672602415085,
-0.10418520867824554,
-0.04237179085612297,
-0.05555357038974762,
-0.06505079567432404,
-0.09411656856536865,
-0.06546507030725479,
-0.0375465489923954,
-0.18403612077236176,
0.010296559892594814,
0.0022734105587005615,
0.030173037201166153,
0.03495887294411659,
-0.008739249780774117,
0.016856949776411057,
0.06857122480869293,
-0.05298030003905296,
-0.0379563607275486,
0.019256623461842537,
0.01631663180887699,
-0.033861737698316574,
-0.014759203419089317,
-0.09142837673425674,
0.03137549012899399,
0.10302381217479706,
0.038103993982076645,
-0.0725833922624588,
0.026635322719812393,
-0.022218136116862297,
-0.0440654382109642,
0.017040621489286423,
-0.06464197486639023,
0.13749058544635773,
-0.007453805301338434,
0.19917334616184235,
-0.1627255380153656,
-0.04439136013388634,
-0.009704147465527058,
0.025252755731344223,
0.033888135105371475,
0.1321856528520584,
-0.005215250421315432,
-0.11170672625303268,
0.04949392005801201,
0.03262712433934212,
-0.07451407611370087,
0.2359672337770462,
-0.050376325845718384,
-0.09228618443012238,
0.007338331546634436,
0.0845508947968483,
-0.020549481734633446,
0.15685264766216278,
-0.1818004697561264,
-0.02703171782195568,
0.023572424426674843,
0.024282298982143402,
0.0750947818160057,
-0.12760592997074127,
0.0025638628285378218,
0.0227162204682827,
-0.062150754034519196,
-0.06676700711250305,
0.008187796920537949,
-0.011099934577941895,
0.0437234565615654,
-0.009274977259337902,
-0.0306413434445858,
0.007859716191887856,
-0.03943672776222229,
-0.09095006436109543,
0.2060752809047699,
-0.11596361547708511,
-0.22330814599990845,
-0.21178925037384033,
0.12540575861930847,
-0.05991256609559059,
-0.003204703563824296,
0.03218092396855354,
-0.08249083161354065,
-0.06949486583471298,
-0.07712868601083755,
0.16247954964637756,
-0.07445720583200455,
-0.013383296318352222,
-0.0035954883787781,
0.06777089089155197,
0.00908642914146185,
-0.2254609763622284,
0.035296112298965454,
-0.00003843560261884704,
0.00215260311961174,
0.0011392331216484308,
-0.09183402359485626,
0.0922996774315834,
0.14775948226451874,
-0.07276821881532669,
0.027718059718608856,
0.006668323650956154,
0.19734971225261688,
-0.03798392042517662,
-0.05195325240492821,
0.14657780528068542,
0.0001852961868280545,
-0.006897287908941507,
0.024855952709913254,
-0.008259080350399017,
-0.09228256344795227,
0.05476008728146553,
-0.010071620345115662,
-0.025912441313266754,
-0.258489727973938,
-0.018044400960206985,
-0.08788280934095383,
0.04005458578467369,
0.04715663194656372,
0.05128289759159088,
-0.06593365222215652,
0.015312421135604382,
0.05751658231019974,
0.14128440618515015,
-0.00517881428822875,
0.047157805413007736,
0.06526105105876923,
-0.003612247295677662,
0.013763158582150936,
-0.10015963017940521,
-0.003023361088708043,
0.07102371007204056,
0.10217421501874924,
0.26593586802482605,
-0.10123381018638611,
0.2037847936153412,
0.04904738441109657,
0.06608321517705917,
0.05232519283890724,
0.16641901433467865,
-0.12078812718391418,
0.027466537430882454,
-0.0034897990990430117,
-0.025871682912111282,
-0.10578498989343643,
0.02310444787144661,
-0.0608033649623394,
0.0521894246339798,
-0.1082928404211998,
-0.0732554942369461,
0.009360997937619686,
0.13994017243385315,
0.03867349401116371,
-0.2174513190984726,
-0.10714644193649292,
0.022951027378439903,
-0.09461953490972519,
-0.11435602605342865,
0.06318091601133347,
0.2521936893463135,
-0.05846088007092476,
-0.04307849705219269,
-0.008487136103212833,
0.12600186467170715,
-0.0343753956258297,
-0.026642246171832085,
-0.035814862698316574,
0.05365290120244026,
0.016083331778645515,
0.12622390687465668,
-0.25825512409210205,
0.15143154561519623,
-0.005937220528721809,
0.05773159861564636,
-0.050719354301691055,
0.063067726790905,
-0.045340754091739655,
0.06488098949193954,
0.04683314263820648,
-0.008614126592874527,
-0.010026933625340462,
-0.16038577258586884,
0.012042977847158909,
0.03697900474071503,
0.024793168529868126,
0.038460563868284225,
0.05948573723435402,
-0.0011678846785798669,
0.04086736962199211,
0.0026888567954301834,
-0.09376772493124008,
-0.08437442779541016,
-0.09419884532690048,
0.005550973117351532,
-0.028163839131593704,
-0.03672074154019356,
-0.05479031428694725,
-0.024576708674430847,
0.033531904220581055,
0.16246865689754486,
-0.060717929154634476,
-0.06853358447551727,
-0.08725757151842117,
0.02111627720296383,
0.14992181956768036,
-0.06979554146528244,
0.044943396002054214,
0.00003005425060109701,
0.04780661314725876,
-0.003480475628748536,
-0.08788285404443741,
0.055763229727745056,
-0.030613433569669724,
-0.07306747883558273,
-0.017696937546133995,
0.0645008459687233,
0.0002107125910697505,
0.009430263191461563,
0.008952144533395767,
-0.06587254256010056,
-0.05356138199567795,
-0.12379016727209091,
-0.11858371645212173,
-0.0330977737903595,
0.03416915982961655,
0.0278093870729208,
-0.1325143575668335,
-0.05682002007961273,
0.012927143834531307,
-0.04412371292710304,
0.12053526192903519,
0.16153784096240997,
-0.07080428302288055,
0.03699595853686333,
0.13059939444065094,
-0.05541699379682541,
-0.1732703149318695,
0.01405899878591299,
0.05337383598089218,
0.12227083742618561,
-0.02998071350157261,
-0.14954151213169098,
0.04027434065937996,
0.013489783741533756,
0.02638660930097103,
0.01471684593707323,
-0.3293159604072571,
-0.13759250938892365,
0.07032071799039841,
0.1484404355287552,
0.07296758145093918,
-0.10315436124801636,
-0.025112226605415344,
-0.060710735619068146,
-0.18072746694087982,
0.12041627615690231,
-0.035238806158304214,
0.12796586751937866,
-0.04953227564692497,
0.019190357998013496,
0.02915559522807598,
-0.05397237837314606,
0.09217017889022827,
0.03266094997525215,
0.11783179640769958,
-0.04159184545278549,
0.016359636560082436,
0.13426026701927185,
-0.04046735540032387,
0.1798267662525177,
-0.13310833275318146,
0.09877623617649078,
-0.2440692037343979,
-0.06710769236087799,
-0.07769607752561569,
0.0069539109244942665,
-0.03872052952647209,
-0.05475945398211479,
-0.06947480142116547,
0.03867053613066673,
0.0027040732093155384,
-0.021388834342360497,
0.059627313166856766,
-0.03514856845140457,
-0.00972238089889288,
0.11839339882135391,
0.08615119755268097,
-0.020838698372244835,
-0.03637463599443436,
0.05347644165158272,
0.05171170458197594,
0.10634739696979523,
-0.218335822224617,
0.03502605855464935,
0.10542863607406616,
-0.002264767186716199,
0.12366069853305817,
0.04688983038067818,
-0.10353763401508331,
0.024994036182761192,
0.09899168461561203,
-0.08283824473619461,
-0.07814066112041473,
-0.018283270299434662,
-0.05786754563450813,
-0.0439688004553318,
0.05854938551783562,
0.10007383674383163,
-0.058582767844200134,
-0.020309722051024437,
-0.025642437860369682,
-0.030060352757573128,
-0.10856645554304123,
0.18967095017433167,
0.06444277614355087,
0.08538616448640823,
-0.0638420507311821,
0.05498693138360977,
0.09349147975444794,
-0.08836928009986877,
0.0049562412314116955,
0.18617519736289978,
-0.09810549765825272,
-0.055540118366479874,
0.057944025844335556,
0.20481199026107788,
-0.023433364927768707,
-0.05064472183585167,
-0.11987172067165375,
-0.0658172145485878,
0.03385471925139427,
0.15036547183990479,
0.09039454162120819,
0.10828510671854019,
-0.04244062677025795,
-0.010369044728577137,
-0.091516874730587,
0.08373069763183594,
0.06760108470916748,
0.04386930540204048,
-0.12142159789800644,
0.14760653674602509,
0.022939322516322136,
0.10258976370096207,
-0.03699008747935295,
-0.00964314304292202,
-0.11501693725585938,
0.050389740616083145,
-0.09347379952669144,
0.027517138049006462,
-0.013252799399197102,
0.05475956201553345,
-0.021550238132476807,
-0.001406836323440075,
-0.027046559378504753,
0.06726787984371185,
-0.091863714158535,
-0.000808886659797281,
-0.01013302430510521,
0.052747942507267,
-0.0499391071498394,
-0.019257744774222374,
0.03700253367424011,
-0.09187519550323486,
0.12869220972061157,
-0.04208803549408913,
-0.03931708261370659,
0.08581981807947159,
-0.04164542257785797,
0.059893954545259476,
0.008489972911775112,
0.04920550063252449,
0.024945147335529327,
0.03360222280025482,
0.07414205372333527,
0.03179299458861351,
0.05055319145321846,
0.014919908717274666,
0.08668766915798187,
-0.13654622435569763,
-0.10106363147497177,
-0.054620902985334396,
-0.10763437300920486,
-0.058704815804958344,
0.10554950684309006,
0.06709220260381699,
0.10152128338813782,
0.07841570675373077,
-0.014480073004961014,
0.0003698489163070917,
-0.12591812014579773,
-0.06213903799653053,
0.03501623123884201,
-0.05641503259539604,
-0.05216270685195923,
-0.060386743396520615,
0.03868663311004639,
-0.03300150856375694,
0.13952866196632385,
0.02903045155107975,
0.03355737775564194,
-0.02553357370197773,
-0.056484248489141464,
0.001993372803553939,
0.030868902802467346,
0.21804189682006836,
-0.07729604095220566,
0.046400200575590134,
-0.010330685414373875,
0.005139483138918877,
-0.002016895916312933,
0.13330261409282684,
0.09585688263177872,
0.14624279737472534,
-0.016460277140140533,
0.09582072496414185,
0.023082705214619637,
-0.024915004149079323,
-0.08456472307443619,
0.017076564952731133,
0.017260659486055374,
0.0908321887254715,
-0.07554871588945389,
0.13184890151023865,
0.08481588214635849,
-0.11421697586774826,
0.10775420814752579,
0.012719690799713135,
-0.12606488168239594,
-0.03426418825984001,
-0.006606419570744038,
-0.035885170102119446,
-0.14215652644634247,
0.014891806058585644,
-0.12700894474983215,
0.0012281592935323715,
0.041052695363759995,
0.05415395647287369,
-0.06960102170705795,
0.15893688797950745,
0.04309045523405075,
-0.050329092890024185,
0.06076902896165848,
0.002217128174379468,
0.03581657260656357,
0.021985426545143127,
0.031391263008117676,
0.03722086548805237,
-0.02038535475730896,
0.025732794776558876,
0.015754150226712227,
-0.03547700121998787,
-0.014794928953051567,
-0.02240615338087082,
-0.004703371785581112,
-0.022034164518117905,
0.018312031403183937,
0.04222326725721359,
0.1555943638086319,
0.034050602465867996,
-0.07950375229120255,
-0.02625216171145439,
0.18953026831150055,
-0.03155319020152092,
-0.08795062452554703,
-0.1370166391134262,
0.10715281963348389,
0.05318280681967735,
0.023947935551404953,
0.02562788315117359,
-0.08392989635467529,
-0.06281235069036484,
0.1875353455543518,
0.07480543851852417,
-0.013847403228282928,
-0.01877102255821228,
0.011174602434039116,
-0.0053768460638821125,
-0.05740498751401901,
0.20104815065860748,
0.02669382095336914,
0.2372392863035202,
0.01324449386447668,
0.0031789513304829597,
-0.0631890743970871,
-0.030418433248996735,
0.00032576770172454417,
0.12198809534311295,
-0.03859539330005646,
-0.04565846174955368,
-0.06428825855255127,
-0.0024098672438412905,
0.011172699742019176,
-0.08409542590379715,
0.08253604173660278,
-0.13204436004161835,
-0.10608087480068207,
-0.04092175140976906,
0.06054643541574478,
-0.06888359040021896,
0.01869351789355278,
-0.02938016504049301,
0.04575476050376892,
0.06956294178962708,
-0.02141238935291767,
-0.0802593007683754,
-0.1436574012041092,
0.09547024965286255,
-0.047454584389925,
0.1355772614479065,
-0.005990603938698769,
0.11906009912490845,
0.09035814553499222,
0.029674973338842392,
-0.07651281356811523,
0.11294800043106079,
0.028445957228541374,
0.050786226987838745,
0.04403845965862274,
0.1288142204284668,
-0.052749745547771454,
0.12315408140420914,
-0.054836709052324295,
-0.04645954817533493,
-0.004773831460624933,
-0.08607375621795654,
-0.0008856505737639964,
-0.14813342690467834,
-0.020122310146689415,
-0.09830418974161148,
0.10236896574497223,
0.18398898839950562,
-0.039099015295505524,
-0.027391426265239716,
-0.09265404939651489,
0.11092010140419006,
-0.028063632547855377,
0.058454032987356186,
-0.04034570977091789,
-0.17823433876037598,
0.0008144467719830573,
0.0190668273717165,
0.014945325441658497,
-0.23923826217651367,
-0.011237398721277714,
-0.03258301317691803,
-0.03461842983961105,
-0.07322995364665985,
0.15544632077217102,
0.1077202558517456,
0.041649166494607925,
-0.03608151897788048,
-0.16696791350841522,
-0.032305482774972916,
0.05704285576939583,
-0.1297958642244339,
-0.13337209820747375
] |
null | null | transformers |
# CodeTrans model for api recommendation generation
Pretrained model for api recommendation generation using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the api recommendation generation task for the java apis.
## Intended uses & limitations
The model can be used to generate api usage for java programming tasks.
### How to use
Here is how to use this model to generate api recommendations using the Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_api_generation_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_api_generation_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "parse the uses licence node of this package , if any , and returns the license definition if theres"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/api%20generation/base_model.ipynb).
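The pipeline returns a list with one dict per input, whose `summary_text` field holds the generated sequence; a minimal way to pull it out (variable names are illustrative):

```python
result = pipeline([tokenized_code])
generated_api_usage = result[0]["summary_text"]
print(generated_api_usage)
```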
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
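For intuition, the inverse square root schedule mentioned above can be written in a few lines of Python; this is a sketch of the general recipe, and the `warmup_steps` and `peak_lr` values are assumptions rather than values reported for this model:

```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000, peak_lr: float = 1e-2) -> float:
    # Hold the learning rate at peak_lr for warmup_steps,
    # then decay it proportionally to 1 / sqrt(step).
    return peak_lr * (warmup_steps ** 0.5) / (max(step, warmup_steps) ** 0.5)
```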
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 320,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing api recommendation generation data.
## Evaluation results
For the api recommendation generation task, the different models achieve the following results (in BLEU score):
Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 68.71 |
| CodeTrans-ST-Base | 70.45 |
| CodeTrans-TF-Small | 68.90 |
| CodeTrans-TF-Base | 72.11 |
| CodeTrans-TF-Large | 73.26 |
| CodeTrans-MT-Small | 58.43 |
| CodeTrans-MT-Base | 67.97 |
| CodeTrans-MT-Large | 72.29 |
| CodeTrans-MT-TF-Small | 69.29 |
| CodeTrans-MT-TF-Base | 72.89 |
| CodeTrans-MT-TF-Large | **73.39** |
| State of the art | 54.42 |
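Scores like these can be computed with a standard corpus-level BLEU implementation; below is a minimal sketch using `sacrebleu`, where the file names are placeholders and the authors' exact tokenization and scoring setup is not specified here:

```python
import sacrebleu

# Hypothetical files: one generated sequence / one reference per line.
with open("predictions.txt") as f:
    hypotheses = [line.strip() for line in f]
with open("references.txt") as f:
    references = [line.strip() for line in f]

# corpus_bleu takes the hypotheses and a list of reference streams.
score = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {score.score:.2f}")
```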
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "parse the uses licence node of this package , if any , and returns the license definition if theres"}]} | summarization | SEBIS/code_trans_t5_base_api_generation_multitask_finetune | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for api recommendation generation
=================================================
Pretrained model for api recommendation generation using the t5 base model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the api recommendation generation task for the java apis.
Intended uses & limitations
---------------------------
The model can be used to generate api usage for java programming tasks.
### How to use
Here is how to use this model to generate api recommendations using the Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 320,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing api recommendation generation data.
Evaluation results
------------------
For the api recommendation generation task, the different models achieve the following results (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 320,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 320,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
61,
88,
111
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 320,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.08542430400848389,
0.06487412750720978,
-0.0014373332960531116,
0.1130978986620903,
0.0548197403550148,
0.02373896911740303,
0.05501509830355644,
0.09367094188928604,
-0.03215468302369118,
0.0601566806435585,
0.061396077275276184,
-0.04555758088827133,
0.06186922639608383,
0.17429979145526886,
0.021418601274490356,
-0.17714300751686096,
-0.0122740613296628,
0.028331242501735687,
-0.033135030418634415,
0.10592477768659592,
0.09633222222328186,
-0.08199559152126312,
0.0667664185166359,
-0.03391867130994797,
-0.12465950101613998,
0.053977370262145996,
-0.04817827418446541,
-0.046400830149650574,
0.0908857211470604,
0.06500733643770218,
0.11402440816164017,
-0.02883598767220974,
0.06996694952249527,
-0.19652490317821503,
0.0007870168192312121,
0.014997921884059906,
0.050871267914772034,
0.0212987270206213,
0.06600329279899597,
0.0795387253165245,
0.1457405388355255,
-0.02134372852742672,
0.03805059939622879,
0.04896761476993561,
-0.061049994081258774,
-0.07457225024700165,
-0.050933390855789185,
0.0746084600687027,
0.10937720537185669,
0.08784697949886322,
-0.011448191478848457,
0.02530287578701973,
-0.08670739829540253,
0.08485755324363708,
0.11239679902791977,
-0.22794406116008759,
-0.02225767821073532,
0.09308604151010513,
0.08979674428701401,
0.07057613879442215,
-0.08396328240633011,
-0.035817503929138184,
0.10843691974878311,
0.038888998329639435,
0.05146419256925583,
-0.08479372411966324,
-0.06373682618141174,
-0.0029555270448327065,
-0.048990193754434586,
-0.05144910514354706,
0.14837437868118286,
0.048402223736047745,
-0.05886248126626015,
-0.10604412853717804,
-0.03901026397943497,
-0.1765429973602295,
0.05208916962146759,
0.02138286456465721,
0.01664826087653637,
-0.02029554359614849,
0.05569373816251755,
-0.00036168660153634846,
-0.09333638846874237,
-0.10922130942344666,
-0.0009862248552963138,
0.0798444077372551,
0.07830709964036942,
0.02948947437107563,
-0.011623772792518139,
0.07222802191972733,
-0.010805708356201649,
-0.06135677546262741,
-0.029310399666428566,
0.008251736871898174,
-0.12115538865327835,
0.02629796601831913,
-0.01031450368463993,
-0.06447964906692505,
-0.010027661919593811,
0.08528472483158112,
-0.07275223731994629,
0.07410459220409393,
0.11067304015159607,
0.003743850626051426,
0.002051491290330887,
0.22060854732990265,
0.0444747619330883,
-0.15307047963142395,
0.0022268423344939947,
0.02948296256363392,
0.006483067758381367,
-0.004028368275612593,
-0.06075805798172951,
-0.04106254503130913,
0.0010829229140654206,
0.05520647391676903,
-0.14336664974689484,
0.012547686696052551,
-0.04445906728506088,
-0.019505267962813377,
0.07938768714666367,
-0.12882834672927856,
0.028775261715054512,
0.012993896380066872,
-0.07315411418676376,
-0.046841416507959366,
0.06443370133638382,
-0.1080244705080986,
-0.11290748417377472,
0.02600923739373684,
-0.051254887133836746,
-0.03060557320713997,
-0.12116976827383041,
-0.1212870255112648,
-0.013314896263182163,
-0.055292099714279175,
0.021086234599351883,
-0.10400837659835815,
-0.1054563894867897,
-0.015469600446522236,
0.026816653087735176,
-0.007380401715636253,
-0.016264643520116806,
-0.05234324559569359,
0.02549353428184986,
-0.004705527331680059,
-0.03301141783595085,
0.020454561337828636,
-0.04561397060751915,
0.09429541230201721,
0.09827948361635208,
0.05403265357017517,
0.00927720870822668,
0.03491339832544327,
-0.07059277594089508,
0.06366240233182907,
-0.073076531291008,
0.05371658504009247,
-0.024866212159395218,
0.06355114281177521,
-0.08682572841644287,
-0.0895204171538353,
0.061325374990701675,
0.05567365512251854,
0.049141060560941696,
0.015729228034615517,
-0.0973084419965744,
0.015995340421795845,
0.13786567747592926,
-0.10186152905225754,
-0.1392042636871338,
0.11770527064800262,
-0.0028559272177517414,
0.004740968346595764,
0.06762488186359406,
0.13736344873905182,
0.13794618844985962,
-0.08325184136629105,
-0.03647104650735855,
0.06700669229030609,
0.06991461664438248,
-0.06410963833332062,
0.10444729030132294,
0.02986677922308445,
0.02733438089489937,
0.028658036142587662,
0.037167906761169434,
0.06678687036037445,
-0.002042344305664301,
-0.03132613003253937,
-0.018692081794142723,
-0.0871933102607727,
-0.03341960161924362,
-0.010431594215333462,
0.02769622951745987,
-0.07627261430025101,
-0.07021299749612808,
0.033023420721292496,
0.1822451800107956,
-0.10172449797391891,
0.020366191864013672,
-0.08771704137325287,
-0.05264643207192421,
-0.08867570012807846,
0.012495556846261024,
-0.10312283039093018,
0.021104203537106514,
0.05529453232884407,
-0.05192982032895088,
0.06370823085308075,
0.08695772290229797,
0.002261590212583542,
0.047192253172397614,
-0.04187731817364693,
-0.04636375978589058,
-0.058623332530260086,
-0.05800400674343109,
-0.11755920946598053,
-0.019415950402617455,
-0.09217396378517151,
-0.029570911079645157,
-0.07040652632713318,
-0.18385377526283264,
0.009658779948949814,
-0.03877216577529907,
0.02764737978577614,
0.008624803274869919,
-0.017902055755257607,
0.03675620257854462,
0.0534745492041111,
-0.05382445082068443,
-0.08408395200967789,
0.012325380928814411,
0.019046716392040253,
-0.10231716930866241,
-0.04304956644773483,
-0.11263780295848846,
-0.028599485754966736,
0.0733785629272461,
0.0915042981505394,
-0.06543450057506561,
0.012252767570316792,
-0.026309821754693985,
-0.06422264873981476,
-0.04152990132570267,
-0.0667039081454277,
0.17126424610614777,
0.009185276925563812,
0.1745963990688324,
-0.14367583394050598,
-0.0480075366795063,
-0.0318533331155777,
-0.011171072721481323,
0.02090476267039776,
0.1571066826581955,
-0.00619153818115592,
-0.10291591286659241,
0.05227918177843094,
-0.02005951851606369,
-0.0670837014913559,
0.16731545329093933,
-0.0016703979345038533,
-0.09395919740200043,
0.02267892099916935,
0.09681346267461777,
-0.020040543749928474,
0.14752472937107086,
-0.09139367192983627,
-0.01412342768162489,
0.007345566526055336,
0.026490967720746994,
0.04028526693582535,
-0.12445609271526337,
0.020508678629994392,
0.05516792833805084,
-0.06943567842245102,
-0.04959993064403534,
-0.031506143510341644,
-0.04331555217504501,
0.04308444261550903,
-0.0007319682626985013,
-0.007876276969909668,
-0.011707347817718983,
-0.024659249931573868,
-0.08882962167263031,
0.19999079406261444,
-0.08561437577009201,
-0.22348333895206451,
-0.17626409232616425,
0.03466076776385307,
-0.06610474735498428,
-0.002858651103451848,
0.05100172385573387,
-0.11851266026496887,
-0.07149288803339005,
-0.08843638002872467,
0.14192308485507965,
-0.10492604970932007,
0.005106189753860235,
-0.005483846180140972,
0.040658388286828995,
0.029167283326387405,
-0.1828138530254364,
0.029928067699074745,
-0.009213399142026901,
0.0029486343264579773,
0.002789892256259918,
-0.06527777761220932,
0.09527512639760971,
0.11795386672019958,
-0.0852382555603981,
0.016285695135593414,
-0.0008143674349412322,
0.14850114285945892,
-0.05239098519086838,
0.022129397839307785,
0.2115936279296875,
0.017148403450846672,
0.028337951749563217,
0.041360557079315186,
0.01337831187993288,
-0.09293855726718903,
0.06817010790109634,
0.05161271616816521,
-0.029645731672644615,
-0.25796958804130554,
-0.004439906217157841,
-0.06707019358873367,
0.03586422652006149,
0.11181513220071793,
0.0529540553689003,
-0.12576071918010712,
0.029188605025410652,
-0.00516007374972105,
0.15354013442993164,
-0.035790834575891495,
0.05398303642868996,
0.01532681379467249,
0.019723722711205482,
0.013631009496748447,
-0.10059276968240738,
0.010095709003508091,
0.07772301882505417,
0.11381476372480392,
0.21808505058288574,
-0.06208997592329979,
0.19651709496974945,
0.024939758703112602,
0.0641409382224083,
0.02762184850871563,
0.10751395672559738,
-0.12202619761228561,
-0.0011269659735262394,
0.0021097264252603054,
-0.01215099636465311,
-0.07908525317907333,
0.05316080525517464,
-0.03497175872325897,
0.08193393796682358,
-0.06378152966499329,
0.027078324928879738,
0.017713913694024086,
0.15325967967510223,
0.05206149443984032,
-0.19287045300006866,
-0.11826355755329132,
0.0252094529569149,
-0.10993805527687073,
-0.11890923231840134,
0.06898412108421326,
0.21178163588047028,
-0.04471416026353836,
0.016199642792344093,
0.0004801983304787427,
0.14020074903964996,
-0.07849955558776855,
-0.022950554266572,
0.028782919049263,
0.06669745594263077,
0.008974138647317886,
0.1317238211631775,
-0.26900413632392883,
0.09089209139347076,
0.010605254210531712,
0.09346402436494827,
-0.022901728749275208,
0.06491155922412872,
-0.033541981130838394,
0.010160543024539948,
0.07124298065900803,
-0.002959359437227249,
-0.05351225659251213,
-0.20638342201709747,
-0.0547233447432518,
0.02754710242152214,
0.04116840660572052,
-0.012609451077878475,
0.0865597203373909,
-0.005798890255391598,
0.06563052535057068,
-0.032255109399557114,
-0.1229538843035698,
-0.06474797427654266,
-0.12272103130817413,
-0.027428777888417244,
-0.00018699273641686887,
-0.03401370719075203,
-0.02608921006321907,
0.012427259236574173,
-0.014387376606464386,
0.2156325727701187,
-0.14144586026668549,
-0.11466647684574127,
-0.0885075181722641,
0.06672807782888412,
0.1229018047451973,
-0.10002285987138748,
0.012842020019888878,
0.021762289106845856,
0.0479704923927784,
-0.040750738233327866,
-0.054099805653095245,
0.020717158913612366,
-0.05416306108236313,
-0.08141092956066132,
-0.022159893065690994,
0.09161580353975296,
-0.008964632637798786,
0.04637347161769867,
0.003871575463563204,
-0.09112109243869781,
-0.04136936739087105,
-0.1337880939245224,
-0.09368211776018143,
-0.018063567578792572,
0.04385385289788246,
-0.005250465124845505,
-0.08371539413928986,
0.07377442717552185,
-0.010733740404248238,
-0.09411835670471191,
0.06764647364616394,
0.18290208280086517,
-0.05313291773200035,
0.020102763548493385,
0.11474864184856415,
-0.05520966649055481,
-0.1431417167186737,
-0.05695443972945213,
0.05405336990952492,
0.09760860353708267,
-0.030900390818715096,
-0.14332878589630127,
0.07991646230220795,
0.04046664386987686,
0.0249093696475029,
0.0072932057082653046,
-0.27622729539871216,
-0.12387975305318832,
0.048041339963674545,
0.08989822119474411,
0.059886522591114044,
-0.12087908387184143,
-0.038884032517671585,
-0.06319543719291687,
-0.10413732379674911,
0.05006169527769089,
0.051581185311079025,
0.12679162621498108,
-0.047493867576122284,
0.03665778040885925,
0.02999345026910305,
-0.0243089459836483,
0.1091998964548111,
0.01044442132115364,
0.1038188561797142,
-0.02362525649368763,
0.03356223180890083,
0.05529619753360748,
-0.05892356112599373,
0.18199411034584045,
-0.18180589377880096,
0.07430906593799591,
-0.23837393522262573,
-0.057065241038799286,
-0.013288825750350952,
-0.009868647903203964,
-0.038266729563474655,
-0.058265358209609985,
-0.09511056542396545,
0.011580340564250946,
0.0414663664996624,
-0.017551250755786896,
0.08615683019161224,
-0.02382989600300789,
-0.05452953651547432,
0.04432513564825058,
0.08909949660301208,
-0.02690024860203266,
-0.13145765662193298,
0.014866987243294716,
0.03490881249308586,
0.09058462828397751,
-0.21974097192287445,
0.020216820761561394,
0.12172364443540573,
0.004246084950864315,
0.11074291914701462,
0.008053041063249111,
-0.07125647366046906,
0.046401891857385635,
0.07598383724689484,
-0.035773761570453644,
-0.08393502235412598,
-0.009562494233250618,
-0.019580435007810593,
-0.08037717640399933,
0.03619447350502014,
0.08495648205280304,
-0.058479852974414825,
-0.021053912118077278,
-0.017636466771364212,
0.003477690741419792,
-0.07181784510612488,
0.19477050006389618,
0.027734970673918724,
0.09047377109527588,
-0.062054090201854706,
0.08402899652719498,
0.10404111444950104,
-0.10422758758068085,
0.013293820433318615,
0.16885586082935333,
-0.07774806022644043,
-0.025212230160832405,
0.05127668380737305,
0.10869047045707703,
-0.038124579936265945,
-0.07652488350868225,
-0.10099465399980545,
-0.07496501505374908,
0.014760702848434448,
0.013823181390762329,
0.07434751838445663,
0.07428224384784698,
-0.03446424752473831,
0.010264962911605835,
-0.09371326118707657,
0.0974641740322113,
0.07298281788825989,
0.052202701568603516,
-0.14277447760105133,
0.12973906099796295,
0.04213441535830498,
0.08706531673669815,
-0.00017508477321825922,
0.03662953898310661,
-0.09356743842363358,
0.043677378445863724,
-0.04930907487869263,
0.0473402701318264,
-0.01091675367206335,
0.05785147473216057,
-0.03366394713521004,
0.020430967211723328,
-0.029939811676740646,
0.05354942008852959,
-0.041153330355882645,
-0.024266518652439117,
-0.01670166663825512,
0.048863206058740616,
-0.05313822627067566,
-0.027445541694760323,
0.01047591120004654,
-0.08017826080322266,
0.10413981229066849,
-0.07010906934738159,
-0.010169086046516895,
0.005376317072659731,
0.010185711085796356,
0.07176701724529266,
0.026133252307772636,
0.0603519044816494,
-0.008018802851438522,
-0.006409597583115101,
0.0492970235645771,
0.019939785823225975,
-0.004529655911028385,
-0.004685898311436176,
0.04978650435805321,
-0.14387662708759308,
-0.0905638113617897,
-0.10409880429506302,
-0.07124154269695282,
-0.06886796653270721,
0.08355395495891571,
0.08750910311937332,
0.07435010373592377,
0.08872587233781815,
-0.021107792854309082,
-0.008302496746182442,
-0.14026880264282227,
-0.035092081874608994,
0.05037572234869003,
-0.022554831579327583,
-0.10984216630458832,
-0.05089782178401947,
0.04885096848011017,
-0.04184272140264511,
0.11613774299621582,
-0.005931745283305645,
0.05588942766189575,
-0.013017759658396244,
-0.0423215813934803,
-0.00965633150190115,
-0.007364087272435427,
0.20441961288452148,
-0.10019847005605698,
0.023021839559078217,
0.010974198579788208,
-0.010788305662572384,
0.03899010270833969,
0.13112401962280273,
0.09342446178197861,
0.14952626824378967,
0.05065493658185005,
0.10335035622119904,
-0.051762260496616364,
-0.04327418655157089,
-0.180076003074646,
0.04194044694304466,
0.007819599471986294,
0.03210976719856262,
-0.021193373948335648,
0.09969522058963776,
0.14498324692249298,
-0.13325056433677673,
0.09139193594455719,
0.019243082031607628,
-0.10017220675945282,
-0.04157920554280281,
-0.06077861040830612,
-0.04815458878874779,
-0.09029577672481537,
0.020713642239570618,
-0.11370589584112167,
0.018876420333981514,
0.0865534096956253,
0.04502364620566368,
-0.02983597293496132,
0.1415533870458603,
-0.018138717859983444,
-0.05750337243080139,
0.009882756508886814,
0.024120032787322998,
0.04582945257425308,
0.09466510266065598,
0.013646801002323627,
0.07625047862529755,
-0.056597866117954254,
0.07339660823345184,
0.016781847923994064,
0.01282927580177784,
0.011622999794781208,
0.002547877375036478,
-0.004807437304407358,
-0.04445970058441162,
-0.002999792341142893,
0.08657864481210709,
0.1754903495311737,
0.042772844433784485,
-0.04614019766449928,
-0.04514249414205551,
0.17296822369098663,
-0.04397827759385109,
-0.06624919176101685,
-0.11497282236814499,
0.15378820896148682,
0.06080644950270653,
0.02160845696926117,
0.016084907576441765,
-0.07100576162338257,
-0.05561532452702522,
0.2186702936887741,
0.00860306154936552,
-0.02586705982685089,
-0.04725071042776108,
-0.015401246957480907,
-0.007970763370394707,
-0.0451558418571949,
0.14800970256328583,
0.021605588495731354,
0.20486457645893097,
0.005331514868885279,
-0.014096337370574474,
-0.044277556240558624,
-0.025203121826052666,
-0.023307044059038162,
0.18562249839305878,
-0.03416244685649872,
0.027168022468686104,
-0.08979010581970215,
-0.015839532017707825,
0.04277172312140465,
-0.1047726571559906,
0.09951622039079666,
-0.09849660098552704,
-0.08767669647932053,
0.02400105446577072,
0.08336600661277771,
-0.024272354319691658,
0.029250022023916245,
-0.01055964920669794,
0.049591030925512314,
0.042841192334890366,
-0.02696482464671135,
-0.10165306180715561,
-0.12703466415405273,
0.05671657249331474,
-0.01770891435444355,
0.15849149227142334,
0.023001160472631454,
0.10041743516921997,
0.0828566700220108,
0.011972433887422085,
-0.07574556767940521,
0.11597810685634613,
0.03031080774962902,
0.016373267397284508,
0.07137923687696457,
0.1271100640296936,
-0.03505314141511917,
0.13085798919200897,
-0.00014390457363333553,
-0.03068922646343708,
-0.03927968442440033,
-0.022629182785749435,
-0.00895681045949459,
-0.14073124527931213,
0.008451968431472778,
-0.06175035610795021,
0.12990181148052216,
0.1808163821697235,
-0.043193645775318146,
-0.03183571621775627,
-0.03987541422247887,
0.0857044905424118,
-0.010252661071717739,
0.09850291907787323,
-0.0015669689746573567,
-0.16722752153873444,
0.012450596317648888,
-0.040822722017765045,
0.013391304761171341,
-0.18707022070884705,
-0.042816709727048874,
-0.044411711394786835,
-0.03738737851381302,
-0.0828021988272667,
0.13431131839752197,
0.07382689416408539,
0.02967323176562786,
-0.04678547382354736,
-0.20930594205856323,
-0.025822622701525688,
0.04817973077297211,
-0.13999073207378387,
-0.12807630002498627
] |
null | null | transformers |
# CodeTrans model for api recommendation generation
Pretrained model for api recommendation generation using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the api recommendation generation task for the java apis.
## Intended uses & limitations
The model can be used to generate api usage for java programming tasks.
### How to use
Here is how to use this model to generate api recommendations using the Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_api_generation_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_api_generation_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "parse the uses licence node of this package , if any , and returns the license definition if theres"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/api%20generation/base_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
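For readers who want a comparable setup, AdaFactor is available in the `transformers` library; the snippet below is a sketch of how such an optimizer can be instantiated, not the authors' actual training code:

```python
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained("t5-base")

# With relative_step=True, Adafactor uses its built-in
# inverse-square-root style schedule, so lr is left as None.
optimizer = Adafactor(
    model.parameters(),
    lr=None,
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
)
lr_scheduler = AdafactorSchedule(optimizer)
```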
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V3-8 for 1,400,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing api recommendation generation data.
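As a back-of-the-envelope sense of scale, the fine-tuning token budget implied by these numbers (assuming every batch is fully packed to the maximum sequence length) is:

```python
steps = 1_400_000   # fine-tuning steps reported above
batch_size = 256    # sequences per batch
seq_len = 512       # tokens per sequence

tokens_seen = steps * batch_size * seq_len
print(f"~{tokens_seen:.3e} tokens")  # roughly 1.8e11
```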
## Evaluation results
For the api recommendation generation task, the different models achieve the following results (in BLEU score):
Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 68.71 |
| CodeTrans-ST-Base | 70.45 |
| CodeTrans-TF-Small | 68.90 |
| CodeTrans-TF-Base | 72.11 |
| CodeTrans-TF-Large | 73.26 |
| CodeTrans-MT-Small | 58.43 |
| CodeTrans-MT-Base | 67.97 |
| CodeTrans-MT-Large | 72.29 |
| CodeTrans-MT-TF-Small | 69.29 |
| CodeTrans-MT-TF-Base | 72.89 |
| CodeTrans-MT-TF-Large | **73.39** |
| State of the art | 54.42 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "parse the uses licence node of this package , if any , and returns the license definition if theres"}]} | summarization | SEBIS/code_trans_t5_base_api_generation_transfer_learning_finetune | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for api recommendation generation
=================================================
Pretrained model for api recommendation generation using the t5 base model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the api recommendation generation task for the java apis.
Intended uses & limitations
---------------------------
The model can be used to generate api usage for java programming tasks.
### How to use
Here is how to use this model to generate api recommendations using the Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V3-8 for 1,400,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing api recommendation generation data.
Evaluation results
------------------
For the api recommendation generation task, the different models achieve the following results (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 1,400,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 1,400,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
61,
87,
111
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 1,400,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.0856538787484169,
0.060153115540742874,
-0.0012047745985910296,
0.11414234340190887,
0.05794797092676163,
0.021572934463620186,
0.03190245479345322,
0.10159759223461151,
-0.048055607825517654,
0.06075490638613701,
0.05902991071343422,
-0.04957288131117821,
0.05876823142170906,
0.17839139699935913,
0.019673997536301613,
-0.182862788438797,
-0.01657814532518387,
0.02760694921016693,
-0.040412940084934235,
0.10733391344547272,
0.09527043998241425,
-0.07704946398735046,
0.07527446746826172,
-0.04415375366806984,
-0.11525058001279831,
0.05235849693417549,
-0.03228660672903061,
-0.043989963829517365,
0.09823901951313019,
0.08038149774074554,
0.11044298857450485,
-0.03725489228963852,
0.06523269414901733,
-0.19870217144489288,
-0.00022974413877818733,
0.027412129566073418,
0.05228355899453163,
0.028272824361920357,
0.053451064974069595,
0.07109930366277695,
0.15091265738010406,
-0.01616288535296917,
0.04216553270816803,
0.05441305413842201,
-0.07032649219036102,
-0.08325657248497009,
-0.04577844962477684,
0.06397539377212524,
0.0988009124994278,
0.10218842327594757,
-0.009714680723845959,
0.030003488063812256,
-0.08070483803749084,
0.08926825225353241,
0.11571156233549118,
-0.2215745598077774,
-0.02403879724442959,
0.10541757941246033,
0.08794844895601273,
0.06983673572540283,
-0.0858035758137703,
-0.031151046976447105,
0.11688237637281418,
0.03603100776672363,
0.05592186003923416,
-0.08139220625162125,
-0.053078070282936096,
-0.004259347915649414,
-0.052857108414173126,
-0.05185816437005997,
0.15623892843723297,
0.04316403344273567,
-0.05535595864057541,
-0.10606756806373596,
-0.04474732279777527,
-0.1885402947664261,
0.048711810261011124,
0.01594643108546734,
0.017321638762950897,
-0.015574764460325241,
0.032641682773828506,
-0.0015930302906781435,
-0.09079813212156296,
-0.11023213714361191,
-0.0052256654016673565,
0.06154877319931984,
0.06632279604673386,
0.033204540610313416,
-0.012326210737228394,
0.07841882109642029,
-0.005316728260368109,
-0.05303221940994263,
-0.030695775523781776,
0.007092988584190607,
-0.11225903779268265,
0.021484747529029846,
-0.007767031900584698,
-0.058882273733615875,
-0.017308887094259262,
0.08020435273647308,
-0.08282297104597092,
0.07221326231956482,
0.10127177834510803,
0.008772688917815685,
0.003895218251273036,
0.2262176126241684,
0.04238957539200783,
-0.15818020701408386,
0.008161165751516819,
0.019711928442120552,
0.003214875003322959,
-0.00015391569468192756,
-0.06576243788003922,
-0.04720178246498108,
0.01113587524741888,
0.06210444122552872,
-0.13525670766830444,
0.0202484168112278,
-0.04311126098036766,
-0.013278172351419926,
0.07625081390142441,
-0.12550774216651917,
0.030849680304527283,
0.008746945299208164,
-0.07956422120332718,
-0.033956386148929596,
0.07559981942176819,
-0.12546762824058533,
-0.11209078878164291,
0.02937527745962143,
-0.0496537871658802,
-0.03756626695394516,
-0.12774285674095154,
-0.1264919638633728,
-0.015228203497827053,
-0.030186675488948822,
0.013869297690689564,
-0.1138317734003067,
-0.09785926342010498,
-0.019919054582715034,
0.03402174264192581,
-0.009314211085438728,
-0.019143689423799515,
-0.05729309469461441,
0.011859800666570663,
0.0054968418553471565,
-0.031123915687203407,
0.032857540994882584,
-0.046178702265024185,
0.09629027545452118,
0.08302148431539536,
0.05299665778875351,
0.010243801400065422,
0.032565079629421234,
-0.07704821228981018,
0.06464310735464096,
-0.09499646723270416,
0.05869613215327263,
-0.020732995122671127,
0.06316915899515152,
-0.09839055687189102,
-0.09364169836044312,
0.042128048837184906,
0.05885445326566696,
0.054392166435718536,
0.022214267402887344,
-0.1129729151725769,
0.020995471626520157,
0.13569240272045135,
-0.11189267784357071,
-0.1326044201850891,
0.1174362376332283,
-0.007355137262493372,
0.0203007310628891,
0.07103265076875687,
0.13697487115859985,
0.14310050010681152,
-0.09867803007364273,
-0.04053681343793869,
0.07367516309022903,
0.05272669717669487,
-0.07227174937725067,
0.08380352705717087,
0.027962898835539818,
0.013694104738533497,
0.028009681031107903,
0.03810836374759674,
0.06923013180494308,
-0.004048471804708242,
-0.03206847980618477,
-0.02820749580860138,
-0.08939553052186966,
-0.046650443226099014,
-0.004224614705890417,
0.024253204464912415,
-0.06899112462997437,
-0.07024482637643814,
0.02598275989294052,
0.177878275513649,
-0.10262648016214371,
0.025924818590283394,
-0.07526466250419617,
-0.03955083340406418,
-0.07621146738529205,
0.015365442261099815,
-0.10364173352718353,
0.01256603468209505,
0.055021777749061584,
-0.03920610994100571,
0.061076819896698,
0.08130073547363281,
0.0031894270796328783,
0.030836904421448708,
-0.05135291814804077,
-0.046271201223134995,
-0.036217398941516876,
-0.06368522346019745,
-0.1127905547618866,
-0.025310751050710678,
-0.08555809408426285,
-0.026990791782736778,
-0.07791535556316376,
-0.1818598359823227,
0.0012521165190264583,
-0.032325416803359985,
0.027310868725180626,
0.01615419238805771,
-0.018409179523587227,
0.024721898138523102,
0.05089467763900757,
-0.05319387838244438,
-0.09013785421848297,
0.01460444089025259,
0.022866787388920784,
-0.09685899317264557,
-0.029154758900403976,
-0.1096097081899643,
-0.026045365259051323,
0.07904892414808273,
0.0895044133067131,
-0.09368609637022018,
0.00030098907882347703,
-0.02493397891521454,
-0.0598357617855072,
-0.049421459436416626,
-0.06542973965406418,
0.17941227555274963,
0.0069727469235658646,
0.17143066227436066,
-0.14142446219921112,
-0.05404709652066231,
-0.029059773311018944,
-0.0008211242384277284,
0.02088088169693947,
0.1582120954990387,
-0.008644322864711285,
-0.10471812635660172,
0.048802927136421204,
-0.032757509499788284,
-0.0639997199177742,
0.1639631688594818,
-0.005154743324965239,
-0.08962082117795944,
0.015838203951716423,
0.0950799286365509,
-0.00958374422043562,
0.15613968670368195,
-0.07406790554523468,
-0.008444961160421371,
0.0001178245001938194,
0.02822246588766575,
0.0447719469666481,
-0.11727091670036316,
0.020745152607560158,
0.04852649196982384,
-0.07182144373655319,
-0.03595833107829094,
-0.02914406917989254,
-0.046857867389917374,
0.046689003705978394,
0.0072781178168952465,
-0.006136803887784481,
-0.01791500113904476,
-0.02669607847929001,
-0.08856546878814697,
0.20897328853607178,
-0.08323896676301956,
-0.22531923651695251,
-0.17916949093341827,
0.04331551492214203,
-0.0462513230741024,
-0.011180229485034943,
0.03880687430500984,
-0.11120115965604782,
-0.0764285996556282,
-0.09739950299263,
0.14306031167507172,
-0.12372899800539017,
0.00124655372928828,
-0.0335206612944603,
0.05013833940029144,
0.029099563136696815,
-0.18630430102348328,
0.03068234957754612,
-0.013736426830291748,
0.0019659455865621567,
-0.004878610838204622,
-0.0649242252111435,
0.08440770953893661,
0.1269959807395935,
-0.08104746043682098,
0.015993621200323105,
-0.008144206367433071,
0.15158019959926605,
-0.060795191675424576,
0.029876522719860077,
0.18706655502319336,
0.025443976745009422,
0.026294488459825516,
0.05532270297408104,
0.010418522171676159,
-0.09670214354991913,
0.07467474788427353,
0.05203263461589813,
-0.03152647987008095,
-0.24877788126468658,
-0.01885177008807659,
-0.06203583627939224,
0.04810815304517746,
0.11761283129453659,
0.053894925862550735,
-0.12447056174278259,
0.017230793833732605,
-0.00022842509497422725,
0.14990031719207764,
-0.030115388333797455,
0.05899535119533539,
0.02731010504066944,
0.011271846480667591,
0.014288704842329025,
-0.09642305225133896,
0.012079010717570782,
0.07722043991088867,
0.10279130190610886,
0.2254539579153061,
-0.07873931527137756,
0.1898040920495987,
0.013358453288674355,
0.07555156946182251,
0.03596058115363121,
0.10097946226596832,
-0.12213975936174393,
0.006906939670443535,
0.0028481341432780027,
-0.013415164314210415,
-0.06447957456111908,
0.05324491113424301,
-0.04047909379005432,
0.08196912705898285,
-0.07082387804985046,
0.016510024666786194,
0.011889126151800156,
0.15871265530586243,
0.054585862904787064,
-0.19320210814476013,
-0.12327849119901657,
0.018484514206647873,
-0.11103154718875885,
-0.11586497724056244,
0.07631900906562805,
0.21192434430122375,
-0.047423653304576874,
0.009444531053304672,
-0.0020770332776010036,
0.13854973018169403,
-0.09097452461719513,
-0.029989469796419144,
0.0279866810888052,
0.07374812662601471,
0.0019888030365109444,
0.12445324659347534,
-0.26897329092025757,
0.09983357042074203,
0.009048490785062313,
0.09513487666845322,
-0.012324979528784752,
0.05800116807222366,
-0.038368869572877884,
0.013732852414250374,
0.07743025571107864,
0.002118707401677966,
-0.06301326304674149,
-0.19595886766910553,
-0.05333046242594719,
0.019780069589614868,
0.04105684906244278,
-0.0036713327281177044,
0.08044995367527008,
-0.010816141031682491,
0.054834187030792236,
-0.024686014279723167,
-0.13815775513648987,
-0.05810348689556122,
-0.1276879608631134,
-0.041753798723220825,
-0.007281542755663395,
-0.019505692645907402,
-0.02675032801926136,
0.017421968281269073,
0.004105514846742153,
0.22845759987831116,
-0.1285664439201355,
-0.0994037389755249,
-0.08735912293195724,
0.06843233108520508,
0.13451409339904785,
-0.0995294526219368,
0.024261925369501114,
0.023074720054864883,
0.04439434036612511,
-0.035612136125564575,
-0.0633648931980133,
0.03114571049809456,
-0.05098618566989899,
-0.06624387949705124,
-0.02541446126997471,
0.09348917752504349,
-0.0066157374531030655,
0.04607138782739639,
0.004812085069715977,
-0.09163706004619598,
-0.04046308249235153,
-0.13269919157028198,
-0.09154728055000305,
-0.0003862711018882692,
0.04033452644944191,
-0.0038804677315056324,
-0.09237392991781235,
0.0656706765294075,
-0.017377523705363274,
-0.09083238989114761,
0.06712178885936737,
0.1674550324678421,
-0.06559882313013077,
0.0184356477111578,
0.10398198664188385,
-0.06416788697242737,
-0.1497633308172226,
-0.040527671575546265,
0.04880550876259804,
0.09126005321741104,
-0.04357147216796875,
-0.13397401571273804,
0.07370224595069885,
0.031448978930711746,
0.026287740096449852,
0.02493143267929554,
-0.28067439794540405,
-0.12793852388858795,
0.03680933639407158,
0.08393067121505737,
0.07190246880054474,
-0.11388092488050461,
-0.03851598501205444,
-0.061301033943891525,
-0.08307603001594543,
0.05487821623682976,
0.06699613481760025,
0.12256497144699097,
-0.045898016542196274,
0.03231823816895485,
0.034918393939733505,
-0.02531970851123333,
0.08814677596092224,
0.0014271796680986881,
0.10395777970552444,
-0.026277072727680206,
0.022016137838363647,
0.0515855997800827,
-0.05696955695748329,
0.17790667712688446,
-0.1639150083065033,
0.08431817591190338,
-0.20797426998615265,
-0.0577382817864418,
-0.01846347749233246,
0.0013714834349229932,
-0.03741610050201416,
-0.060544900596141815,
-0.11061511188745499,
0.03184424340724945,
0.0460476391017437,
-0.02018638327717781,
0.056128229945898056,
-0.027926532551646233,
-0.05548146739602089,
0.054946236312389374,
0.08413184434175491,
-0.017899319529533386,
-0.1283981204032898,
0.0320037305355072,
0.027485838159918785,
0.09881792962551117,
-0.21223007142543793,
0.018117757514119148,
0.12415693700313568,
0.0058268108405172825,
0.10626287758350372,
0.009730038233101368,
-0.07877573370933533,
0.037242449820041656,
0.07304132729768753,
-0.035051748156547546,
-0.09057514369487762,
-0.012052037753164768,
-0.032496754080057144,
-0.07996472716331482,
0.026605630293488503,
0.08974245190620422,
-0.06469620764255524,
-0.012666559778153896,
-0.009973851963877678,
0.004860911518335342,
-0.07126345485448837,
0.1884032040834427,
0.025325383991003036,
0.0826399028301239,
-0.05682061240077019,
0.08352663367986679,
0.09680439531803131,
-0.12396231293678284,
0.01512081827968359,
0.16453465819358826,
-0.07623518258333206,
-0.022620080038905144,
0.06735141575336456,
0.11348290741443634,
-0.050297148525714874,
-0.06647121161222458,
-0.08872947096824646,
-0.07774581760168076,
0.01784488558769226,
0.038000453263521194,
0.07303644716739655,
0.07929970324039459,
-0.034334585070610046,
0.011645626276731491,
-0.10253988951444626,
0.09332679212093353,
0.082428477704525,
0.04950021207332611,
-0.1432497352361679,
0.1387144923210144,
0.04058350250124931,
0.06937678158283234,
-0.0009240907966159284,
0.03065750189125538,
-0.10305733978748322,
0.041807498782873154,
-0.025913072749972343,
0.0525856651365757,
-0.0023245993070304394,
0.06039170175790787,
-0.03517366945743561,
0.02164394035935402,
-0.03204909339547157,
0.0495966300368309,
-0.03859945386648178,
-0.025235705077648163,
-0.016996972262859344,
0.038922201842069626,
-0.053739938884973526,
-0.03106486238539219,
0.0063749016262590885,
-0.07770726084709167,
0.09782497584819794,
-0.06581874191761017,
-0.007213058415800333,
0.008671467192471027,
0.004775787238031626,
0.07475196570158005,
0.028669530525803566,
0.05277368426322937,
-0.009168418124318123,
-0.0104646235704422,
0.044097620993852615,
0.019415298476815224,
-0.005841178819537163,
-0.005143104121088982,
0.04480356723070145,
-0.1467384248971939,
-0.08562881499528885,
-0.09678322821855545,
-0.06567003577947617,
-0.06944306939840317,
0.08202780038118362,
0.0833047479391098,
0.07719285786151886,
0.08833934366703033,
-0.03304901346564293,
-0.002243347465991974,
-0.1431712806224823,
-0.03128262236714363,
0.05089138075709343,
-0.02207360602915287,
-0.09331480413675308,
-0.04172540456056595,
0.054426033049821854,
-0.039502859115600586,
0.120777927339077,
-0.0032700796145945787,
0.05764234811067581,
-0.012650740332901478,
-0.05270110070705414,
-0.024581890553236008,
-0.004563599359244108,
0.21064956486225128,
-0.09427059441804886,
0.02061775140464306,
0.006410134490579367,
-0.0006768827443011105,
0.04104724898934364,
0.15740928053855896,
0.08986169099807739,
0.14329983294010162,
0.051224466413259506,
0.10398270189762115,
-0.05672288313508034,
-0.03949417173862457,
-0.156266450881958,
0.05302335321903229,
-0.006908489856868982,
0.033053893595933914,
-0.029154451563954353,
0.10833278298377991,
0.13100793957710266,
-0.1368156373500824,
0.1012628972530365,
0.021005436778068542,
-0.1026877835392952,
-0.046580925583839417,
-0.0771920382976532,
-0.047967083752155304,
-0.10819227248430252,
0.015987439081072807,
-0.11409114301204681,
0.02262859232723713,
0.06336658447980881,
0.04604511708021164,
-0.028577888384461403,
0.13723181188106537,
-0.02231007255613804,
-0.060669466853141785,
0.03141837939620018,
0.0301709845662117,
0.038492534309625626,
0.08539590239524841,
0.02412335015833378,
0.06408581882715225,
-0.07395583391189575,
0.06110818684101105,
0.025405125692486763,
0.011801042594015598,
0.019673915579915047,
0.013660544529557228,
0.003105561248958111,
-0.048455629497766495,
0.005182947963476181,
0.07639008015394211,
0.18246765434741974,
0.04745476692914963,
-0.05101708322763443,
-0.04644414409995079,
0.19312769174575806,
-0.053475625813007355,
-0.06906454265117645,
-0.11961086094379425,
0.1677035540342331,
0.04701187461614609,
0.019622614607214928,
0.009393508546054363,
-0.06816759705543518,
-0.04522193595767021,
0.23690089583396912,
0.030408784747123718,
-0.021872684359550476,
-0.04220835864543915,
-0.008698401041328907,
-0.01142469048500061,
-0.03902004286646843,
0.1487932950258255,
0.022910751402378082,
0.22338391840457916,
0.0034375174436718225,
-0.01746329478919506,
-0.04478868469595909,
-0.02399628981947899,
-0.010295875370502472,
0.18974992632865906,
-0.0338575504720211,
0.0237968061119318,
-0.09712021052837372,
-0.016485545784235,
0.019959094002842903,
-0.1089247316122055,
0.10212747752666473,
-0.11211104691028595,
-0.07826409488916397,
0.01568528078496456,
0.06972112506628036,
-0.028724396601319313,
0.039912957698106766,
-0.014306708239018917,
0.04883234202861786,
0.049231816083192825,
-0.024304861202836037,
-0.11454259604215622,
-0.14858120679855347,
0.04502595588564873,
-0.006767206359654665,
0.14973397552967072,
0.017738431692123413,
0.09118916094303131,
0.08749793469905853,
0.01592695154249668,
-0.07444963604211807,
0.10512304306030273,
0.026498422026634216,
0.005678717978298664,
0.07322436571121216,
0.12432468682527542,
-0.03647226840257645,
0.15113379061222076,
-0.0034530481789261103,
-0.0328943096101284,
-0.024615578353405,
-0.021442564204335213,
-0.010585738345980644,
-0.15021975338459015,
0.0021805227734148502,
-0.05890917778015137,
0.1371873915195465,
0.18821772933006287,
-0.03950031101703644,
-0.027944643050432205,
-0.04665584862232208,
0.07930320501327515,
-0.017624344676733017,
0.08991778641939163,
0.0055432748049497604,
-0.15968847274780273,
0.0067809997126460075,
-0.02564997225999832,
0.007283676881343126,
-0.18762199580669403,
-0.04411382973194122,
-0.041161585599184036,
-0.0411086231470108,
-0.08748728781938553,
0.14018036425113678,
0.07184210419654846,
0.03196267783641815,
-0.042874906212091446,
-0.1781623512506485,
-0.016628017649054527,
0.052829526364803314,
-0.13559983670711517,
-0.1263471245765686
] |
null | null | transformers |
# CodeTrans model for code comment generation java
Pretrained model on the Java programming language using the T5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized Java code functions, so it works best with tokenized Java functions.
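If you start from raw (untokenized) Java source, a pre-tokenization step such as the sketch below can produce the expected input format. This is a hypothetical helper that assumes the third-party `javalang` package; it is not part of the original CodeTrans release.

```python
# Hypothetical pre-tokenization helper; assumes the third-party `javalang` package
# and is not part of the original CodeTrans release.
import javalang

def tokenize_java(code: str) -> str:
    """Split raw Java source into space-separated lexical tokens."""
    return " ".join(token.value for token in javalang.tokenizer.tokenize(code))

raw_code = "protected String renderUri(URI uri) { return uri.toASCIIString(); }"
print(tokenize_java(raw_code))
# protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }
```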
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model and was trained with single-task training on the Code Comment Generation dataset.
## Intended uses & limitations
The model could be used to generate descriptions for Java functions or be fine-tuned on other Java code tasks. It can be used on unparsed and untokenized Java code; however, performance should be better if the Java code is tokenized.
### How to use
Here is how to use this model to generate Java function documentation with the Transformers `SummarizationPipeline`:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_comment_generation_java"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_comment_generation_java", skip_special_tokens=True),
device=0
)
tokenized_code = "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/code%20comment%20generation/base_model.ipynb).
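The pipeline returns one dictionary per input sequence; as with any Transformers summarization pipeline, the generated text is stored under the `summary_text` key:

```python
# Each input yields one dict; the generated comment is under "summary_text".
result = pipeline([tokenized_code])
print(result[0]["summary_text"])
```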
## Training data
The datasets for the supervised training tasks can be downloaded from [this link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1).
## Evaluation results
For the code documentation task, the different models achieve the following results (in BLEU score):
Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 37.98 |
| CodeTrans-ST-Base | 38.07 |
| CodeTrans-TF-Small | 38.56 |
| CodeTrans-TF-Base | 39.06 |
| CodeTrans-TF-Large | **39.50** |
| CodeTrans-MT-Small | 20.15 |
| CodeTrans-MT-Base | 27.44 |
| CodeTrans-MT-Large | 34.69 |
| CodeTrans-MT-TF-Small | 38.37 |
| CodeTrans-MT-TF-Base | 38.90 |
| CodeTrans-MT-TF-Large | 39.25 |
| State of the art | 38.17 |
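For context, BLEU scores of this kind can be computed with a standard toolkit such as `sacrebleu`. The snippet below merely illustrates the metric on made-up strings; it is not the authors' evaluation script.

```python
# Illustrative BLEU computation with sacrebleu; hypotheses and references are made up.
import sacrebleu

hypotheses = ["renders the uri as an ascii string"]         # model outputs
references = [["render the given uri as an ascii string"]]  # one reference stream
print(round(sacrebleu.corpus_bleu(hypotheses, references).score, 2))
```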
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_comment_generation_java | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code comment generation java
================================================
Pretrained model on the Java programming language using the T5 base model architecture. It was first released in
this repository. This model is trained on tokenized Java code functions, so it works best with tokenized Java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model and was trained with single-task training on the Code Comment Generation dataset.
Intended uses & limitations
---------------------------
The model could be used to generate descriptions for Java functions or be fine-tuned on other Java code tasks. It can be used on unparsed and untokenized Java code; however, performance should be better if the Java code is tokenized.
### How to use
Here is how to use this model to generate Java function documentation with the Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The datasets for the supervised training tasks can be downloaded from the link
Evaluation results
------------------
For the code documentation task, the different models achieve the following results (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
112
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.0875348374247551,
0.013697436079382896,
-0.0003455282421782613,
0.06394645571708679,
0.12497641891241074,
-0.003056386485695839,
0.07015896588563919,
0.0622900165617466,
0.008274346590042114,
-0.0482400581240654,
0.08769276738166809,
0.1497582495212555,
0.02762714959681034,
0.1312154233455658,
-0.03489409759640694,
-0.20416918396949768,
-0.0008053651545196772,
0.06260603666305542,
-0.1364232748746872,
0.12527789175510406,
0.13377933204174042,
-0.05561942979693413,
0.1017858013510704,
-0.009229477494955063,
-0.22977487742900848,
0.06711380928754807,
-0.024753574281930923,
-0.08570939302444458,
0.1276760995388031,
0.08113759756088257,
0.10691884160041809,
0.037618692964315414,
0.002452101558446884,
-0.22514183819293976,
0.0342220775783062,
-0.03248739242553711,
0.015348796732723713,
0.05422268807888031,
0.04324764385819435,
-0.03354261815547943,
0.19429150223731995,
-0.003324622754007578,
0.008941693231463432,
0.05411025509238243,
-0.10770773887634277,
-0.09157682210206985,
-0.017219066619873047,
-0.012689750641584396,
0.09100747853517532,
0.07052944600582123,
0.021522730588912964,
0.12098428606987,
-0.13321104645729065,
0.13232064247131348,
0.09553149342536926,
-0.15953649580478668,
-0.022312473505735397,
0.12433511018753052,
0.09789133816957474,
-0.04756445810198784,
-0.058550458401441574,
0.006124221254140139,
0.07550051063299179,
0.024089157581329346,
0.044558025896549225,
-0.14467084407806396,
-0.21428675949573517,
0.07047666609287262,
-0.05522088706493378,
-0.054558660835027695,
0.28957539796829224,
-0.001740014529787004,
-0.03579780086874962,
-0.04674946889281273,
-0.02384253218770027,
0.037143558263778687,
0.001583861536346376,
-0.01219947636127472,
0.013403519056737423,
-0.006495804525911808,
0.0001787974761100486,
-0.020048044621944427,
-0.1046537458896637,
-0.12809570133686066,
0.012318591587245464,
0.0632692351937294,
-0.008053308352828026,
0.028492100536823273,
-0.1689462959766388,
0.09569334238767624,
0.08020750433206558,
-0.09261907637119293,
0.02069508098065853,
-0.06649632006883621,
-0.01483930740505457,
-0.015727058053016663,
-0.04498734325170517,
-0.16703221201896667,
0.090079665184021,
0.0329529233276844,
-0.06404729187488556,
0.0523817278444767,
0.008363490924239159,
0.07629864662885666,
0.05736444145441055,
0.17173059284687042,
-0.005652987863868475,
-0.07473986595869064,
0.048203229904174805,
-0.026784855872392654,
-0.06084217131137848,
0.012999911792576313,
-0.07063441723585129,
-0.03990170732140541,
0.013609337620437145,
0.12503035366535187,
-0.10864166170358658,
0.07088617980480194,
-0.06570059806108475,
-0.034918107092380524,
0.013724357821047306,
-0.13763315975666046,
-0.028480611741542816,
0.003789276583120227,
-0.06344790011644363,
-0.04812151566147804,
0.1105416938662529,
-0.056825656443834305,
-0.11266995966434479,
-0.03793490305542946,
-0.07759636640548706,
-0.002286880975589156,
-0.10701865702867508,
-0.07657937705516815,
0.017111066728830338,
0.04218428209424019,
0.06876492500305176,
-0.11328937113285065,
-0.18311932682991028,
-0.00378196919336915,
0.0859009325504303,
-0.009058515541255474,
0.043713636696338654,
-0.09632625430822372,
-0.02627558447420597,
-0.03624427691102028,
-0.023713121190667152,
0.06400034576654434,
-0.06666994839906693,
0.07964087277650833,
0.08765355497598648,
0.05573558062314987,
-0.06110719218850136,
0.05496655032038689,
-0.1368434876203537,
0.06887786090373993,
-0.17572492361068726,
0.09287060797214508,
-0.04748747497797012,
0.12308106571435928,
-0.10658205300569534,
-0.056166231632232666,
0.04451017454266548,
0.06461703032255173,
0.051738351583480835,
0.1248263269662857,
-0.14438219368457794,
-0.03378137946128845,
0.1368870586156845,
-0.10836703330278397,
-0.21830464899539948,
0.06135636940598488,
-0.07434312254190445,
0.21410374343395233,
0.048365335911512375,
0.19819733500480652,
0.14343023300170898,
-0.02868485078215599,
0.07107708603143692,
0.09276288002729416,
-0.04263358190655708,
-0.08118170499801636,
0.061137605458498,
0.06848236173391342,
-0.13355247676372528,
0.06377530097961426,
-0.028956551104784012,
0.10540744662284851,
-0.03231040760874748,
-0.04164525493979454,
-0.010478834621608257,
-0.06200810521841049,
0.016047755256295204,
-0.004955985117703676,
0.08650654554367065,
-0.006605913396924734,
0.01244751363992691,
0.0616019144654274,
0.1054123044013977,
-0.12497183680534363,
-0.006952561903744936,
-0.093503437936306,
0.028618594631552696,
-0.11639437079429626,
0.032513875514268875,
-0.21166153252124786,
0.027220679447054863,
0.01945713721215725,
0.011345173232257366,
0.026673752814531326,
0.04674104228615761,
0.002225014613941312,
0.008788947947323322,
0.012042907066643238,
-0.00012668300769291818,
0.012038164772093296,
-0.01339554600417614,
-0.028828123584389687,
-0.10568856447935104,
-0.048531509935855865,
-0.05472125485539436,
-0.018339067697525024,
-0.1854812502861023,
-0.007206879090517759,
0.03074919991195202,
0.06931772083044052,
0.030629904940724373,
0.037145815789699554,
0.050374578684568405,
0.06326886266469955,
-0.047165270894765854,
-0.020603148266673088,
0.06363126635551453,
0.022122951224446297,
-0.09089092165231705,
0.08014687150716782,
-0.05041682347655296,
0.0392286479473114,
0.09161380678415298,
-0.16121985018253326,
-0.0484076589345932,
-0.04355262964963913,
-0.03548724204301834,
-0.03213665261864662,
0.005714211147278547,
-0.02016386389732361,
0.19718383252620697,
-0.00277286721393466,
0.17384465038776398,
-0.1252789944410324,
-0.056693870574235916,
-0.03044029138982296,
-0.018076809123158455,
0.02996285818517208,
0.1407017558813095,
0.08248013257980347,
-0.2236068695783615,
0.0575043261051178,
0.08426006138324738,
-0.021634353324770927,
0.214022696018219,
-0.041451066732406616,
-0.02952142432332039,
-0.030555350705981255,
0.06863059848546982,
-0.04190429672598839,
0.14710868895053864,
-0.22171849012374878,
-0.03171179071068764,
0.019685082137584686,
-0.007631672080606222,
0.1122758537530899,
-0.11735276877880096,
-0.002150582382455468,
0.01657246984541416,
-0.03956224396824837,
-0.09183894842863083,
0.04511590301990509,
0.003761471714824438,
0.029469814151525497,
-0.006956758908927441,
-0.016358038410544395,
0.034719642251729965,
-0.03919856995344162,
-0.11604341864585876,
0.23168928921222687,
-0.08178829401731491,
-0.26040002703666687,
-0.19569189846515656,
0.07312697917222977,
-0.013156517408788204,
-0.009771275334060192,
0.05569472163915634,
-0.03973342850804329,
-0.05276770517230034,
-0.043325118720531464,
0.11012377589941025,
-0.028776198625564575,
-0.048147059977054596,
-0.010282251052558422,
0.08146045356988907,
-0.00270624621771276,
-0.19431070983409882,
-0.013962035067379475,
0.02327197976410389,
0.07291260361671448,
0.0076109846122562885,
-0.13866771757602692,
0.10433061420917511,
0.07624371349811554,
-0.05397311970591545,
0.04713505133986473,
-0.02445952594280243,
0.20842666923999786,
-0.06126997619867325,
-0.06025610491633415,
0.16236791014671326,
-0.09642329066991806,
-0.003352442290633917,
0.028680074959993362,
0.003236632328480482,
-0.1170538067817688,
0.03786719590425491,
-0.039697349071502686,
-0.05638670548796654,
-0.25277459621429443,
-0.08142433315515518,
-0.08600790798664093,
0.09921654313802719,
0.02161657251417637,
0.026387520134449005,
-0.07102641463279724,
0.053963709622621536,
0.08214443922042847,
0.14172638952732086,
-0.0023885478731244802,
0.0621887668967247,
0.0461786612868309,
0.00000787666613177862,
-0.005724714137613773,
-0.11177843809127808,
-0.04886482656002045,
0.029043223708868027,
0.0967540293931961,
0.1904471069574356,
-0.0007648671162314713,
0.1683170348405838,
0.07690630853176117,
0.04392387717962265,
0.03241889551281929,
0.17099055647850037,
-0.12204890698194504,
0.019687142223119736,
-0.017568619921803474,
-0.047731269150972366,
-0.13728395104408264,
0.02280402183532715,
-0.07212063670158386,
0.061882730573415756,
-0.1281474381685257,
-0.057739004492759705,
0.06905921548604965,
0.09702833741903305,
-0.014093056321144104,
-0.25188183784484863,
-0.11165375262498856,
0.03986385837197304,
-0.07736600190401077,
-0.07006209343671799,
0.05375116318464279,
0.17215187847614288,
-0.12860530614852905,
-0.015176679007709026,
-0.04420911520719528,
0.16267751157283783,
-0.0767742246389389,
0.03270037844777107,
-0.04839164763689041,
-0.03533410280942917,
0.01931367628276348,
0.1675824224948883,
-0.21182399988174438,
0.23328521847724915,
0.005609455052763224,
0.029242709279060364,
-0.06598269194364548,
0.03204849734902382,
0.0026204015593975782,
0.09018553048372269,
0.1278182566165924,
-0.017440814524888992,
-0.037498705089092255,
-0.1433279663324356,
0.042860161513090134,
0.088055320084095,
0.0523323230445385,
-0.026728661730885506,
0.057409390807151794,
-0.022015240043401718,
0.022249603644013405,
-0.01924106292426586,
-0.07525847852230072,
-0.10074827075004578,
-0.09780948609113693,
-0.003032986307516694,
-0.03312069922685623,
0.05907544866204262,
-0.026102658361196518,
0.026925021782517433,
0.10447284579277039,
0.17686131596565247,
-0.08071305602788925,
-0.05718423053622246,
-0.10008548200130463,
0.02711847424507141,
0.1261013001203537,
-0.07769280672073364,
-0.010893245227634907,
0.0021051305811852217,
0.03895604982972145,
0.0037654642947018147,
-0.13072633743286133,
0.05351848527789116,
-0.06955447793006897,
0.00780505733564496,
-0.030631467700004578,
0.09126220643520355,
-0.018012814223766327,
-0.01911618560552597,
0.06623081117868423,
-0.0735074058175087,
-0.055885396897792816,
-0.14300747215747833,
-0.10915983468294144,
-0.04150675609707832,
0.06663099676370621,
0.024734467267990112,
-0.14429457485675812,
0.024876385927200317,
-0.004918987862765789,
-0.03161003440618515,
0.20293238759040833,
0.08884996175765991,
-0.02574251964688301,
0.02489391341805458,
0.1639450043439865,
-0.11602018773555756,
-0.2334480732679367,
0.005280734039843082,
-0.03230864554643631,
0.07774461060762405,
0.017152797430753708,
-0.1297965943813324,
0.0966779887676239,
-0.0212701428681612,
0.038408175110816956,
-0.0028145096730440855,
-0.2850101590156555,
-0.0927276760339737,
0.09940031915903091,
0.13139492273330688,
0.07567469775676727,
-0.1134088858962059,
-0.07127122581005096,
-0.08849947154521942,
-0.24174600839614868,
0.16580738127231598,
-0.10594948381185532,
0.0877404436469078,
-0.01802125759422779,
0.05255144089460373,
0.027449941262602806,
-0.056967593729496,
0.10794489830732346,
0.016404593363404274,
0.11107928305864334,
-0.028078187257051468,
-0.1109815165400505,
0.09532010555267334,
-0.03439773619174957,
0.1620626449584961,
-0.11545030772686005,
0.08731027692556381,
-0.22446514666080475,
-0.03686108812689781,
-0.047640107572078705,
0.04783904552459717,
-0.010010837577283382,
-0.0743076354265213,
-0.04631466791033745,
0.02392154559493065,
0.0339743047952652,
0.016516052186489105,
0.12039679288864136,
-0.04230856895446777,
0.0035285227932035923,
0.12763555347919464,
0.14849890768527985,
-0.043803393840789795,
-0.017025446519255638,
0.040696121752262115,
0.03064838983118534,
0.1105346530675888,
-0.2375105619430542,
0.08407941460609436,
0.11292080581188202,
0.015424934215843678,
0.12785789370536804,
0.06999306380748749,
-0.034878864884376526,
0.02407427504658699,
0.08550633490085602,
-0.13883844017982483,
-0.04545171931385994,
-0.06494591385126114,
-0.032525431364774704,
0.011911292560398579,
0.06969189643859863,
0.1323765367269516,
-0.07027289271354675,
-0.012021319009363651,
-0.0032027200795710087,
-0.019216537475585938,
-0.13435979187488556,
0.12141181528568268,
0.04016401618719101,
0.07558362931013107,
-0.085504911839962,
0.08729752153158188,
0.05737169831991196,
-0.14994265139102936,
-0.027014533057808876,
0.13614708185195923,
-0.13091707229614258,
-0.08283495903015137,
-0.0009928954532369971,
0.30476242303848267,
-0.08785919100046158,
-0.09488599002361298,
-0.13568812608718872,
-0.0589756965637207,
-0.00480257673189044,
0.21958544850349426,
0.09407008439302444,
0.09977125376462936,
-0.0452021062374115,
-0.017672209069132805,
-0.1107776090502739,
0.06677959859371185,
0.07870634645223618,
0.010086342692375183,
-0.09087855368852615,
0.08646351844072342,
-0.004493309184908867,
0.14587192237377167,
-0.05127015709877014,
-0.03204577788710594,
-0.16943508386611938,
0.07053176313638687,
-0.1295023113489151,
0.06592816114425659,
-0.07465513795614243,
0.0265691876411438,
0.01240604929625988,
0.008961480110883713,
-0.03397351875901222,
0.055781904608011246,
-0.0832022875547409,
0.012593800202012062,
-0.0011513333301991224,
0.08368167281150818,
-0.07915686815977097,
-0.018968697637319565,
0.08212091028690338,
-0.05834730342030525,
0.0918813943862915,
0.017350368201732635,
-0.06442798674106598,
0.1012454703450203,
-0.18230204284191132,
-0.015219482593238354,
0.04310361668467522,
0.013158856891095638,
0.06185366213321686,
-0.05620328709483147,
0.03308556228876114,
0.03108653984963894,
0.035992447286844254,
-0.01840687356889248,
0.10512319952249527,
-0.13565194606781006,
-0.09874997287988663,
-0.027431296184659004,
-0.11547094583511353,
-0.039944760501384735,
0.04154624044895172,
0.05203212797641754,
0.07814259082078934,
0.08607788383960724,
-0.02199070155620575,
0.03141273930668831,
-0.07487697154283524,
-0.015214871615171432,
0.048350680619478226,
-0.08761196583509445,
-0.0652468279004097,
-0.093665212392807,
0.019157055765390396,
-0.07295798510313034,
0.17917779088020325,
0.010520649142563343,
0.13619844615459442,
-0.011928456835448742,
-0.058892980217933655,
0.0081118643283844,
0.05523201823234558,
0.21566550433635712,
-0.04356808587908745,
0.048623524606227875,
-0.06566096842288971,
0.07222473621368408,
0.01657106727361679,
0.0591760091483593,
0.08517037332057953,
0.13076718151569366,
-0.016010455787181854,
0.11139610409736633,
0.034444570541381836,
0.041151013225317,
-0.04293122515082359,
-0.0702202096581459,
0.09094005823135376,
0.06136952340602875,
-0.04172206297516823,
0.09653545916080475,
0.12303636968135834,
-0.11214438825845718,
0.09893506020307541,
0.0013752031372860074,
-0.10303182899951935,
-0.04028548300266266,
-0.020668353885412216,
-0.04859589412808418,
-0.12325625866651535,
-0.0005443849368020892,
-0.1319676786661148,
-0.03752988949418068,
0.051451023668050766,
0.025116777047514915,
-0.0677337571978569,
0.18952828645706177,
0.007759132422506809,
-0.07555653154850006,
0.06747636198997498,
-0.006039418745785952,
0.01735023222863674,
-0.028778282925486565,
0.08784612268209457,
-0.0061198752373456955,
-0.020769067108631134,
-0.011700782924890518,
0.04509396851062775,
-0.037726566195487976,
-0.011415199376642704,
-0.07268580049276352,
-0.03392927721142769,
-0.03647568076848984,
0.039575133472681046,
-0.004547376651316881,
0.022212669253349304,
0.027247052639722824,
-0.04432417079806328,
0.0019617003854364157,
0.24082204699516296,
-0.043419938534498215,
-0.07224733382463455,
-0.13918447494506836,
0.17177271842956543,
0.05537329241633415,
0.059337515383958817,
0.015530291944742203,
-0.0547466054558754,
-0.035400863736867905,
0.2699294984340668,
0.18216174840927124,
-0.04876147210597992,
-0.0007213094504550099,
0.00996269192546606,
0.0165316890925169,
-0.005329003091901541,
0.12688973546028137,
0.03906630724668503,
0.21224625408649445,
-0.02446792647242546,
-0.05357332527637482,
-0.04532114416360855,
-0.051101215183734894,
0.0354292057454586,
0.11981749534606934,
0.02639816515147686,
-0.04894211143255234,
-0.03498129919171333,
0.09090722352266312,
-0.14609721302986145,
-0.11333595961332321,
0.03274228423833847,
-0.1482396125793457,
-0.08157680928707123,
-0.07629746198654175,
0.050488971173763275,
-0.032418906688690186,
0.04592205956578255,
-0.03576705977320671,
-0.011583673767745495,
0.06371704488992691,
0.0388018861413002,
-0.13238593935966492,
-0.10421805828809738,
0.049209050834178925,
-0.055664222687482834,
0.12114697694778442,
-0.026003092527389526,
0.10079795867204666,
0.09326222538948059,
0.024225132539868355,
-0.05229032039642334,
0.04322533681988716,
0.06295718997716904,
0.0478549487888813,
0.06272850185632706,
0.06944398581981659,
-0.024032380431890488,
0.14079567790031433,
-0.04788777977228165,
-0.119390107691288,
0.03177432343363762,
-0.01158355176448822,
-0.008139731362462044,
-0.11275683343410492,
-0.027326466515660286,
-0.08268268406391144,
0.09143614768981934,
0.1697247177362442,
-0.046309929341077805,
0.017417678609490395,
-0.07789275795221329,
0.1435597538948059,
0.004004329442977905,
-0.015456177294254303,
-0.07638128846883774,
-0.1352270543575287,
-0.019851481541991234,
0.03283907100558281,
-0.016769496724009514,
-0.22979772090911865,
-0.00013190227036830038,
-0.047559067606925964,
-0.015641456469893456,
-0.04228579252958298,
0.10924817621707916,
0.13639593124389648,
0.05022285133600235,
-0.026618963107466698,
-0.17581267654895782,
-0.010601256042718887,
0.06815240532159805,
-0.10393248498439789,
-0.15014556050300598
] |
null | null | transformers |
# CodeTrans model for code comment generation java
Pretrained model on the Java programming language using the T5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized Java code functions, so it works best with tokenized Java functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model and was trained with multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate descriptions for Java functions or be fine-tuned on other Java code tasks (a rough sketch of such fine-tuning follows below). It can be used on unparsed and untokenized Java code; however, performance should be better if the Java code is tokenized.
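The sketch below illustrates that fine-tuning use case; the training pair, hyperparameters, and loop are illustrative placeholders, not the authors' fine-tuning setup.

```python
# Minimal fine-tuning sketch; data and hyperparameters are illustrative placeholders.
import torch
from transformers import AutoTokenizer, AutoModelWithLMHead

name = "SEBIS/code_trans_t5_base_code_comment_generation_java_multitask"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelWithLMHead.from_pretrained(name)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# Placeholder (tokenized Java function, reference comment) pairs.
pairs = [
    ("protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }",
     "Render the given URI as an ASCII string."),
]

model.train()
for function, comment in pairs:
    inputs = tokenizer(function, return_tensors="pt", truncation=True, max_length=512)
    labels = tokenizer(comment, return_tensors="pt", truncation=True, max_length=128).input_ids
    loss = model(**inputs, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```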
### How to use
Here is how to use this model to generate Java function documentation with the Transformers `SummarizationPipeline`:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_comment_generation_java_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_comment_generation_java_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/code%20comment%20generation/base_model.ipynb).
## Training data
The datasets for the supervised training tasks can be downloaded from [this link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1).
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 460,000 steps in total, using a sequence length of 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using an encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
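That configuration roughly corresponds to the `Adafactor` implementation shipped with Transformers, where `relative_step=True` applies the inverse square root decay. The snippet below mirrors the stated setup and is not the original training code; `model` stands for the T5 model being pre-trained.

```python
# Illustrative optimizer setup mirroring the description above; not the original code.
from transformers.optimization import Adafactor

optimizer = Adafactor(
    model.parameters(),
    lr=None,             # let Adafactor derive the step size
    relative_step=True,  # inverse square root learning rate schedule
    warmup_init=True,    # warm up before the inverse-sqrt decay
    scale_parameter=True,
)
```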
## Evaluation results
For the code documentation task, the different models achieve the following results (in BLEU score):
Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 37.98 |
| CodeTrans-ST-Base | 38.07 |
| CodeTrans-TF-Small | 38.56 |
| CodeTrans-TF-Base | 39.06 |
| CodeTrans-TF-Large | **39.50** |
| CodeTrans-MT-Small | 20.15 |
| CodeTrans-MT-Base | 27.44 |
| CodeTrans-MT-Large | 34.69 |
| CodeTrans-MT-TF-Small | 38.37 |
| CodeTrans-MT-TF-Base | 38.90 |
| CodeTrans-MT-TF-Large | 39.25 |
| State of the art | 38.17 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_comment_generation_java_multitask | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code comment generation java
================================================
Pretrained model on the Java programming language using the T5 base model architecture. It was first released in
this repository. This model is trained on tokenized Java code functions, so it works best with tokenized Java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model and was trained with multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate descriptions for Java functions or be fine-tuned on other Java code tasks. It can be used on unparsed and untokenized Java code; however, performance should be better if the Java code is tokenized.
### How to use
Here is how to use this model to generate Java function documentation with the Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The datasets for the supervised training tasks can be downloaded from the link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 460,000 steps in total, using a sequence length of 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using an encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation task, the different models achieve the following results (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 460,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 460,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
61,
143
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 460,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.12591016292572021,
-0.025582507252693176,
-0.00045510733616538346,
0.13206195831298828,
0.10586302727460861,
0.024606896564364433,
0.058722399175167084,
0.06693139672279358,
-0.027645224705338478,
0.019352810457348824,
0.04320186376571655,
0.010109315626323223,
0.03332019969820976,
0.1934121549129486,
0.007108581252396107,
-0.11778350174427032,
-0.014771566726267338,
0.0448477566242218,
-0.0377376563847065,
0.12756924331188202,
0.09428270161151886,
-0.07437840104103088,
0.05365275964140892,
-0.06888635456562042,
-0.24488693475723267,
0.05987200886011124,
-0.005781178362667561,
-0.06332988291978836,
0.09910094738006592,
0.046591151505708694,
0.1256496161222458,
-0.004036305006593466,
0.021530287340283394,
-0.14177891612052917,
0.011029825545847416,
0.011523084715008736,
0.03323275223374367,
0.017438136041164398,
0.04688705876469612,
0.053984325379133224,
0.13570615649223328,
0.011027167551219463,
0.042142726480960846,
0.06187174469232559,
-0.0753200501203537,
-0.11905595660209656,
-0.007599940989166498,
0.02382717840373516,
0.05090343952178955,
0.10049303621053696,
-0.012554025277495384,
0.1216515451669693,
-0.15156885981559753,
0.1274675875902176,
0.10188797116279602,
-0.21814857423305511,
-0.01190328411757946,
0.1260872334241867,
0.0898347944021225,
0.0965995341539383,
-0.060968514531850815,
-0.06684678792953491,
0.10347471386194229,
0.051961712539196014,
0.044744133949279785,
-0.10083525627851486,
-0.11131997406482697,
0.024067087098956108,
-0.07454617321491241,
-0.0630764365196228,
0.22019626200199127,
0.0011836233315989375,
-0.076922208070755,
-0.054246362298727036,
-0.024811994284391403,
-0.13353964686393738,
0.0362517312169075,
0.028270438313484192,
0.008561753667891026,
-0.03348739817738533,
0.019097665324807167,
0.0304223895072937,
-0.07381737232208252,
-0.15557575225830078,
0.027319151908159256,
0.092945396900177,
0.05577626824378967,
0.025092117488384247,
-0.09730851650238037,
0.1047055721282959,
0.03546175733208656,
-0.060223501175642014,
-0.027714189141988754,
-0.01738988794386387,
-0.10408946126699448,
0.033238865435123444,
-0.05166281387209892,
-0.1827246993780136,
0.01644335873425007,
0.010571795515716076,
-0.04764074087142944,
0.05198332294821739,
0.027402980253100395,
0.03694290667772293,
0.022434169426560402,
0.1979631632566452,
0.057026516646146774,
-0.12038962543010712,
0.054191526025533676,
0.04393109679222107,
-0.03706011176109314,
-0.005103102885186672,
-0.06900831311941147,
-0.09874839335680008,
0.09364704042673111,
0.10360001772642136,
-0.13744235038757324,
0.03740476444363594,
-0.07060126960277557,
-0.043950583785772324,
-0.0004851460107602179,
-0.15935304760932922,
0.0035883202217519283,
0.027501670643687248,
-0.06701434403657913,
-0.055126119405031204,
0.0932188406586647,
-0.17009682953357697,
-0.1502256840467453,
-0.042679719626903534,
-0.08061506599187851,
-0.03987366333603859,
-0.16792230308055878,
-0.15638631582260132,
-0.00981274526566267,
-0.03890194743871689,
0.019021354615688324,
-0.08571233600378036,
-0.158063143491745,
-0.026368875056505203,
0.018936581909656525,
0.004093340132385492,
-0.0018337155925109982,
-0.07829388976097107,
-0.009560251608490944,
-0.02963339351117611,
-0.03822571411728859,
0.015534772537648678,
-0.04757603257894516,
0.12136177718639374,
0.1032286286354065,
0.054196540266275406,
-0.022108718752861023,
0.061141543090343475,
-0.07937251776456833,
0.06469081342220306,
-0.11654316633939743,
0.09531964361667633,
-0.06016671285033226,
0.07912914454936981,
-0.03387034311890602,
-0.10594906657934189,
0.07633455097675323,
0.06114551052451134,
0.06560032814741135,
0.035415422171354294,
-0.138342946767807,
-0.024625107645988464,
0.1838291883468628,
-0.12465314567089081,
-0.13677330315113068,
0.10096292197704315,
-0.038328688591718674,
0.08281073719263077,
0.08129016309976578,
0.14372998476028442,
0.15089261531829834,
-0.02571946568787098,
0.023484138771891594,
0.05017700418829918,
0.04370974376797676,
-0.1308685839176178,
0.07837451249361038,
0.06634392589330673,
-0.08873774856328964,
0.06193612888455391,
-0.017103582620620728,
0.09760825335979462,
-0.010715937241911888,
-0.02449660561978817,
-0.05092811584472656,
-0.07918865978717804,
-0.005598760675638914,
0.007973982021212578,
0.06587402522563934,
-0.08218387514352798,
-0.05862542241811752,
0.09035851061344147,
0.17436833679676056,
-0.13125953078269958,
-0.0019140124786645174,
-0.08132704347372055,
0.03762523829936981,
-0.076051265001297,
0.0285626407712698,
-0.1626410037279129,
0.03416156396269798,
0.07820359617471695,
-0.02909783646464348,
0.053756311535835266,
0.12989544868469238,
0.013498872518539429,
0.04428404942154884,
0.001048662350513041,
-0.01461320836097002,
-0.12115241587162018,
-0.056499697268009186,
-0.06246567517518997,
-0.06335564702749252,
-0.0898352786898613,
-0.059979334473609924,
-0.03679604455828667,
-0.1931515485048294,
0.011443311348557472,
0.0037421074230223894,
0.0033043825533241034,
0.027911191806197166,
-0.012610101141035557,
0.02976706065237522,
0.07556485384702682,
-0.06040119007229805,
-0.036431387066841125,
0.03183588758111,
0.023717159405350685,
-0.04096973314881325,
-0.05693592131137848,
-0.08138849586248398,
0.006327448878437281,
0.10730377584695816,
0.04209185019135475,
-0.07912139594554901,
0.021284498274326324,
-0.020485835149884224,
-0.04912937059998512,
0.009179526008665562,
-0.06389646232128143,
0.14582160115242004,
-0.006118045188486576,
0.1988690346479416,
-0.16440075635910034,
-0.038568586111068726,
-0.024566931650042534,
0.025076130405068398,
0.062017351388931274,
0.13760004937648773,
-0.014313124120235443,
-0.08466321974992752,
0.06590716540813446,
0.016491683200001717,
-0.10018204897642136,
0.2324889898300171,
-0.04729941859841347,
-0.09342518448829651,
0.022508464753627777,
0.10190711170434952,
-0.017073463648557663,
0.16741394996643066,
-0.20241312682628632,
-0.02751314826309681,
0.01767326518893242,
0.008022401481866837,
0.06628099828958511,
-0.12671604752540588,
0.003343170043081045,
0.009127849712967873,
-0.0726892426609993,
-0.07051105797290802,
-0.008308447897434235,
-0.005648903548717499,
0.03778240084648132,
-0.007137793116271496,
-0.02944306656718254,
0.01706789620220661,
-0.04003683477640152,
-0.10617917776107788,
0.21960118412971497,
-0.097491055727005,
-0.21865157783031464,
-0.20573140680789948,
0.11311618238687515,
-0.061917051672935486,
-0.01327546313405037,
0.03602645918726921,
-0.0795813724398613,
-0.05582629516720772,
-0.05623222887516022,
0.1727411448955536,
-0.06231231614947319,
-0.011640723794698715,
-0.015280085615813732,
0.0765785276889801,
0.009883911348879337,
-0.21011431515216827,
0.035576146095991135,
-0.0044695911929011345,
-0.01299143023788929,
0.006153336260467768,
-0.1002361848950386,
0.09060539305210114,
0.15222258865833282,
-0.0816972479224205,
0.020428795367479324,
0.007025804836302996,
0.19050347805023193,
-0.039357740432024,
-0.05575942248106003,
0.14128082990646362,
-0.017918426543474197,
-0.009609545581042767,
0.016031969338655472,
-0.013350795954465866,
-0.09814433753490448,
0.06307902187108994,
-0.00940536055713892,
-0.02357589639723301,
-0.27275350689888,
-0.007871591486036777,
-0.08011751621961594,
0.05662355199456215,
0.03763324394822121,
0.041491422802209854,
-0.08817299455404282,
0.02822044864296913,
0.060363881289958954,
0.15032795071601868,
-0.004394609946757555,
0.05416720733046532,
0.058492597192525864,
-0.0011080722324550152,
0.007647753693163395,
-0.09940394014120102,
0.012863297946751118,
0.07337572425603867,
0.1120772734284401,
0.272129088640213,
-0.09983893483877182,
0.19954539835453033,
0.047967977821826935,
0.048777200281620026,
0.050157591700553894,
0.13432687520980835,
-0.13315357267856598,
0.03202131390571594,
0.0034943795762956142,
-0.008837350644171238,
-0.10900909453630447,
0.008495568297803402,
-0.06612849980592728,
0.09041846543550491,
-0.10502144694328308,
-0.05984077602624893,
0.01016103383153677,
0.14743903279304504,
0.041291166096925735,
-0.22395017743110657,
-0.1294742375612259,
0.020238952711224556,
-0.09496309608221054,
-0.1058678850531578,
0.06485940515995026,
0.24275220930576324,
-0.07732886075973511,
-0.039325471967458725,
-0.004490266554057598,
0.13398389518260956,
-0.03599094599485397,
-0.02194424718618393,
-0.03655267506837845,
0.06438702344894409,
0.0158383809030056,
0.1359335333108902,
-0.2950015962123871,
0.12953795492649078,
-0.008377361111342907,
0.06254688650369644,
-0.02961323782801628,
0.04896612837910652,
-0.038313981145620346,
0.07838710397481918,
0.037849023938179016,
-0.009856277145445347,
0.035525280982255936,
-0.15919406712055206,
0.014664524234831333,
0.041245799511671066,
0.01862596534192562,
0.05605851486325264,
0.06316506117582321,
-0.0035413214936852455,
0.05753646418452263,
-0.019499419257044792,
-0.12560445070266724,
-0.070408396422863,
-0.06654578447341919,
-0.016861870884895325,
-0.03160252422094345,
-0.011935049667954445,
-0.044464826583862305,
-0.010054690763354301,
0.07745455205440521,
0.1830257773399353,
-0.09767717868089676,
-0.07754011452198029,
-0.0754859670996666,
0.05150185152888298,
0.10793237388134003,
-0.08134980499744415,
0.029433472082018852,
-0.0028842324391007423,
0.04549865797162056,
-0.009085324592888355,
-0.07584843039512634,
0.05125569552183151,
-0.038801200687885284,
-0.06951557844877243,
-0.011818863451480865,
0.06314300745725632,
-0.001427481765858829,
0.02692478522658348,
0.011982616037130356,
-0.09532652795314789,
-0.04532327130436897,
-0.11998175829648972,
-0.1264762133359909,
-0.04257044941186905,
0.018758391961455345,
0.043386414647102356,
-0.147370383143425,
-0.057929281145334244,
0.003144026268273592,
-0.038831308484077454,
0.12987370789051056,
0.15809668600559235,
-0.0548589713871479,
0.03110283613204956,
0.1481463760137558,
-0.06087049841880798,
-0.1913076639175415,
0.03402959927916527,
0.04597251117229462,
0.119239941239357,
-0.04340717941522598,
-0.16566240787506104,
0.048553287982940674,
0.020347030833363533,
0.03650442138314247,
0.05306714400649071,
-0.3096622824668884,
-0.1246357411146164,
0.08030389249324799,
0.15983599424362183,
0.1192542240023613,
-0.12222771346569061,
-0.038677725940942764,
-0.0629231184720993,
-0.16147735714912415,
0.09294339269399643,
-0.048498764634132385,
0.1333884745836258,
-0.07553107291460037,
0.028212271630764008,
0.035329509526491165,
-0.04541182518005371,
0.0734027847647667,
0.03190687298774719,
0.12189026921987534,
-0.042616937309503555,
0.018534572795033455,
0.1264793872833252,
-0.0333072803914547,
0.18269111216068268,
-0.14569400250911713,
0.09730499237775803,
-0.23278197646141052,
-0.05850515142083168,
-0.07565324008464813,
0.0037082377821207047,
-0.03492661565542221,
-0.04643496498465538,
-0.07671928405761719,
0.031666748225688934,
-0.0035245574545115232,
-0.007459436077624559,
0.04334527626633644,
-0.031083209440112114,
-0.01978834718465805,
0.10619198530912399,
0.10624285042285919,
-0.01808132603764534,
-0.07107992470264435,
0.05519259348511696,
0.05055367201566696,
0.11512676626443863,
-0.1925860047340393,
0.030319537967443466,
0.10359316319227219,
0.01645619608461857,
0.12557706236839294,
0.04513365775346756,
-0.1040380597114563,
0.042342714965343475,
0.08780016750097275,
-0.0753517672419548,
-0.0660046935081482,
-0.02085418626666069,
-0.07932723313570023,
-0.06698310375213623,
0.05126432329416275,
0.09382260590791702,
-0.05034700408577919,
-0.020097030326724052,
-0.02403847686946392,
-0.019215384498238564,
-0.11189306527376175,
0.1857793927192688,
0.07522747665643692,
0.08549061417579651,
-0.06674297899007797,
0.06407170742750168,
0.08450556546449661,
-0.08472401648759842,
0.007118762005120516,
0.18878792226314545,
-0.10392576456069946,
-0.04836498573422432,
0.07241826504468918,
0.22248290479183197,
-0.02694285660982132,
-0.05847856029868126,
-0.13967779278755188,
-0.07725676149129868,
0.03196696937084198,
0.16566148400306702,
0.1009681299328804,
0.09446423500776291,
-0.026781028136610985,
-0.0005909337196499109,
-0.10767573863267899,
0.09237152338027954,
0.06245126202702522,
0.050182782113552094,
-0.10608349740505219,
0.13051776587963104,
0.03838328644633293,
0.12194278091192245,
-0.02719750814139843,
-0.011370863765478134,
-0.13887912034988403,
0.06321156024932861,
-0.11031775176525116,
0.0350799560546875,
-0.008887186646461487,
0.051737479865550995,
-0.02423037961125374,
0.0020569751504808664,
-0.03142685815691948,
0.06896939128637314,
-0.08179471641778946,
0.001159046427346766,
0.0038583858404308558,
0.05486380681395531,
-0.0516037754714489,
-0.018955543637275696,
0.03337569534778595,
-0.09247572720050812,
0.12383376806974411,
-0.03694275766611099,
-0.028572507202625275,
0.07977231591939926,
-0.04635243117809296,
0.03982338309288025,
0.01471255999058485,
0.0493989996612072,
0.019814949482679367,
0.013520481996238232,
0.07729906588792801,
0.035861652344465256,
0.0531943179666996,
0.025023730471730232,
0.12194476276636124,
-0.13932110369205475,
-0.08402173221111298,
-0.055758148431777954,
-0.11314988136291504,
-0.056893423199653625,
0.10051625967025757,
0.04737988859415054,
0.10432649403810501,
0.09166692942380905,
-0.03138802573084831,
0.010531870648264885,
-0.12626200914382935,
-0.06650291383266449,
0.02816174365580082,
-0.03070109151303768,
-0.08416880667209625,
-0.05615626275539398,
0.03780580312013626,
-0.03234218433499336,
0.12047957628965378,
0.01819838583469391,
0.03513360396027565,
-0.019796257838606834,
-0.059794455766677856,
-0.01642569899559021,
0.021365638822317123,
0.21296235918998718,
-0.08520440012216568,
0.04158947244286537,
0.0013756841653957963,
0.016431253403425217,
0.00706610968336463,
0.11855911463499069,
0.11882929503917694,
0.16585023701190948,
-0.03591436520218849,
0.09987962245941162,
0.017827501520514488,
-0.0025597333442419767,
-0.07602547109127045,
0.02012467011809349,
0.021784698590636253,
0.06296232342720032,
-0.048700448125600815,
0.18780910968780518,
0.0926799401640892,
-0.1225573942065239,
0.11157350987195969,
0.02507544495165348,
-0.13248062133789062,
-0.03287925198674202,
0.021936021745204926,
-0.03567180037498474,
-0.1477944403886795,
0.023969268426299095,
-0.12950918078422546,
-0.01561045367270708,
0.050287116318941116,
0.05160146951675415,
-0.07835956662893295,
0.17256224155426025,
0.03444580361247063,
-0.06040751934051514,
0.05366647616028786,
-0.001844891463406384,
0.025871440768241882,
0.022593876346945763,
0.03596055507659912,
0.037114664912223816,
-0.03893119841814041,
0.03624013066291809,
0.024567633867263794,
-0.024366721510887146,
-0.01929791271686554,
-0.021125687286257744,
-0.0028708078898489475,
-0.015488815493881702,
0.01932363212108612,
0.05617582052946091,
0.1601107269525528,
0.035600632429122925,
-0.07383470982313156,
-0.018084153532981873,
0.1737896353006363,
-0.02738557942211628,
-0.09550637006759644,
-0.1263810098171234,
0.1318204700946808,
0.05228955298662186,
0.011103147640824318,
0.026424070820212364,
-0.08305872976779938,
-0.053766295313835144,
0.20934075117111206,
0.05354069918394089,
-0.03091832809150219,
-0.023326952010393143,
0.007705303840339184,
-0.0017314424039795995,
-0.03954828530550003,
0.20251347124576569,
0.022918855771422386,
0.22474196553230286,
0.02297782339155674,
-0.007072215899825096,
-0.06863610446453094,
-0.040818870067596436,
0.0022720620036125183,
0.12044496089220047,
-0.039126068353652954,
-0.03973929211497307,
-0.08343338966369629,
-0.0039124093018472195,
-0.0006727470317855477,
-0.08100391924381256,
0.10064122825860977,
-0.13742107152938843,
-0.09820006042718887,
-0.049912769347429276,
0.050177231431007385,
-0.059747159481048584,
0.01675320230424404,
-0.024305790662765503,
0.043730929493904114,
0.07012317329645157,
-0.03262219950556755,
-0.10080000013113022,
-0.1708804965019226,
0.09555414319038391,
-0.05313752964138985,
0.13235825300216675,
-0.015285338275134563,
0.15277332067489624,
0.0850919559597969,
0.02537774294614792,
-0.06445746123790741,
0.1141669824719429,
0.031649183481931686,
0.05732973292469978,
0.04822840914130211,
0.12345841526985168,
-0.05071190744638443,
0.13516610860824585,
-0.05121776461601257,
-0.0311016533523798,
-0.028407001867890358,
-0.0750245749950409,
-0.019074033945798874,
-0.16223140060901642,
-0.019599372521042824,
-0.09521178156137466,
0.0932948887348175,
0.19443345069885254,
-0.04351705312728882,
-0.0309781301766634,
-0.09307892620563507,
0.10865969955921173,
-0.011371197178959846,
0.06511930376291275,
-0.03284208104014397,
-0.1740713119506836,
-0.00039817916695028543,
0.011064798571169376,
0.01428974699229002,
-0.27564433217048645,
-0.006099204998463392,
-0.03941909968852997,
-0.027952050790190697,
-0.08656490594148636,
0.15924884378910065,
0.08874693512916565,
0.04892034828662872,
-0.040518634021282196,
-0.1632826030254364,
-0.03603244572877884,
0.05830506980419159,
-0.13863122463226318,
-0.14473536610603333
] |
null | null | transformers |
# CodeTrans model for code comment generation java
Pretrained model on programming language java using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the code comment generation task for java functions/methods.
## Intended uses & limitations
The model can be used to generate a description for a java function, or it can be fine-tuned on other java code tasks. It works on unparsed and untokenized java code, but performance is better when the java code is tokenized.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
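# Build a summarization pipeline around the fine-tuned checkpoint.
# device=0 assumes a CUDA GPU is available; use device=-1 to run on CPU.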
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_comment_generation_java_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_comment_generation_java_multitask_finetune", skip_special_tokens=True),
device=0
)
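# Example input: a java method pre-tokenized with a space between all tokens.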
tokenized_code = "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/code%20comment%20generation/base_model.ipynb).
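Since the model works best on tokenized input, untokenized java source can be roughly pre-tokenized before being passed to the pipeline. The sketch below is only an approximation; the exact preprocessing lives in the CodeTrans repository linked above, and `java_tokenize` is a hypothetical helper name:
```python
import re

def java_tokenize(source: str) -> str:
    # Hypothetical helper: pull identifiers, numbers and punctuation apart so
    # every token is separated by a single space. This only approximates the
    # preprocessing used in the CodeTrans repository (string literals, for
    # example, would be broken into separate tokens).
    tokens = re.findall(r"[A-Za-z_][A-Za-z0-9_]*|\d+|\S", source)
    return " ".join(tokens)

print(java_tokenize("protected String renderUri(URI uri) { return uri.toASCIIString(); }"))
# protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }
```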
## Training data
The supervised training tasks datasets can be downloaded from [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
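For reference, both AdaFactor and the inverse square root schedule are available in the transformers library. The snippet below is a minimal sketch of that optimizer setup, not the exact pre-training configuration used for this model:
```python
from transformers import T5ForConditionalGeneration
from transformers.optimization import Adafactor, AdafactorSchedule

model = T5ForConditionalGeneration.from_pretrained("t5-base")
# relative_step + warmup_init give the time-dependent (inverse square root
# decay) learning rate used for T5-style pre-training; lr=None lets the
# optimizer derive the rate from the step count.
optimizer = Adafactor(model.parameters(), scale_parameter=True,
                      relative_step=True, warmup_init=True, lr=None)
lr_scheduler = AdafactorSchedule(optimizer)
```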
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 60,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing java code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 37.98 |
| CodeTrans-ST-Base | 38.07 |
| CodeTrans-TF-Small | 38.56 |
| CodeTrans-TF-Base | 39.06 |
| CodeTrans-TF-Large | **39.50** |
| CodeTrans-MT-Small | 20.15 |
| CodeTrans-MT-Base | 27.44 |
| CodeTrans-MT-Large | 34.69 |
| CodeTrans-MT-TF-Small | 38.37 |
| CodeTrans-MT-TF-Base | 38.90 |
| CodeTrans-MT-TF-Large | 39.25 |
| State of the art | 38.17 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_comment_generation_java_multitask_finetune | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code comment generation java
================================================
Pretrained model on programming language java using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the code comment generation task for java functions/methods.
Intended uses & limitations
---------------------------
The model can be used to generate a description for a java function, or it can be fine-tuned on other java code tasks. It works on unparsed and untokenized java code, but performance is better when the java code is tokenized.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded from Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 60,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing java code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 60,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 60,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
61,
88,
109
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 60,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.09200116991996765,
0.07021292299032211,
-0.0012527774088084698,
0.10779352486133575,
0.047798819839954376,
0.018271364271640778,
0.016550462692975998,
0.1108686625957489,
-0.014417032711207867,
0.06604883819818497,
0.05467728525400162,
-0.06225843355059624,
0.050045568495988846,
0.18557438254356384,
0.031114783138036728,
-0.1716153770685196,
-0.032297249883413315,
0.022660918533802032,
-0.04969893395900726,
0.10686450451612473,
0.08407086879014969,
-0.0667600929737091,
0.07235101610422134,
-0.043068092316389084,
-0.12144649773836136,
0.05448788404464722,
-0.034458648413419724,
-0.02794210985302925,
0.09714410454034805,
0.05296992510557175,
0.10469820350408554,
-0.031243199482560158,
0.0531383641064167,
-0.2081429660320282,
0.0022867510560899973,
0.028021354228258133,
0.062110088765621185,
0.02870793454349041,
0.05525040626525879,
0.06568107008934021,
0.13778254389762878,
-0.013674805872142315,
0.04273267462849617,
0.06385519355535507,
-0.06563089787960052,
-0.10208193212747574,
-0.049910977482795715,
0.04111175984144211,
0.08205699920654297,
0.10019538551568985,
-0.012176279909908772,
-0.0018653589067980647,
-0.07780355215072632,
0.08963562548160553,
0.12840266525745392,
-0.2121378630399704,
-0.01857864111661911,
0.13690833747386932,
0.0875903069972992,
0.05374116823077202,
-0.08360475301742554,
-0.029598211869597435,
0.1064176857471466,
0.04364725574851036,
0.06321611255407333,
-0.09353378415107727,
-0.05054335296154022,
-0.001007242128252983,
-0.03899301216006279,
-0.0498681478202343,
0.14439773559570312,
0.029651790857315063,
-0.04875616729259491,
-0.11385560035705566,
-0.057245224714279175,
-0.18423740565776825,
0.04293133690953255,
0.02169596217572689,
0.016390696167945862,
0.0015852400101721287,
0.020846210420131683,
-0.013536492362618446,
-0.08634527027606964,
-0.11607786267995834,
0.027012676000595093,
0.014432089403271675,
0.059728771448135376,
0.03194810822606087,
-0.024051932618021965,
0.08279430121183395,
-0.020805681124329567,
-0.056386955082416534,
-0.026317758485674858,
0.016384458169341087,
-0.12390757352113724,
0.0338924415409565,
-0.00822617206722498,
-0.04737795889377594,
0.004170638974756002,
0.07487613707780838,
-0.11858595162630081,
0.09113867580890656,
0.0922606959939003,
0.010355057194828987,
0.011803258210420609,
0.21097928285598755,
0.05311250314116478,
-0.16401684284210205,
0.0151464082300663,
0.007497651968151331,
0.009300037287175655,
-0.0009942500619217753,
-0.055420905351638794,
-0.053523823618888855,
0.014271697029471397,
0.0674123764038086,
-0.13668671250343323,
0.01438929233700037,
-0.0528627410531044,
-0.005775441415607929,
0.08044548332691193,
-0.1279733031988144,
0.03520909324288368,
0.01082997303456068,
-0.04786654934287071,
-0.03860108554363251,
0.07776496559381485,
-0.12785027921199799,
-0.11558707058429718,
0.030149750411510468,
-0.04235948249697685,
-0.03506895899772644,
-0.12505075335502625,
-0.10573919117450714,
-0.005254114978015423,
-0.055822085589170456,
-0.002501916605979204,
-0.09157205373048782,
-0.11554079502820969,
-0.021467149257659912,
0.0364898182451725,
-0.01596996746957302,
-0.02171529270708561,
-0.04802008345723152,
0.010860759764909744,
-0.00031266181031242013,
-0.02504897490143776,
0.03797479346394539,
-0.03890260308980942,
0.09403681010007858,
0.08727720379829407,
0.054075535386800766,
0.004215224180370569,
0.032434556633234024,
-0.09433355927467346,
0.08641313761472702,
-0.10681237280368805,
0.07018730044364929,
-0.025857847183942795,
0.058100514113903046,
-0.0990191176533699,
-0.08246768265962601,
0.013798648491501808,
0.05151395872235298,
0.056095097213983536,
0.02079128473997116,
-0.12636758387088776,
0.017111681401729584,
0.1537008285522461,
-0.12214640527963638,
-0.12062228471040726,
0.11097713559865952,
-0.002910129725933075,
0.02845357358455658,
0.06574128568172455,
0.15427733957767487,
0.13859668374061584,
-0.08361412584781647,
-0.02218508906662464,
0.07822775095701218,
0.06701955944299698,
-0.06200435012578964,
0.06733035296201706,
0.0090692900121212,
0.02292872965335846,
0.04203713312745094,
0.048819079995155334,
0.055463001132011414,
0.01071603037416935,
-0.03173355758190155,
-0.04617457464337349,
-0.08196275681257248,
-0.04744720086455345,
0.00016637254157103598,
0.021290011703968048,
-0.06680519133806229,
-0.06186271458864212,
0.010531893000006676,
0.17041827738285065,
-0.102103590965271,
0.028298581019043922,
-0.07112594693899155,
-0.04192288964986801,
-0.07415574043989182,
0.03122975490987301,
-0.12331734597682953,
0.021569062024354935,
0.060860514640808105,
-0.05478443205356598,
0.04475570470094681,
0.08141257613897324,
0.01068577729165554,
0.02188105694949627,
-0.056321822106838226,
-0.03458574414253235,
-0.04103628918528557,
-0.07029759138822556,
-0.09551727771759033,
-0.030083244666457176,
-0.08383750915527344,
-0.034272439777851105,
-0.0465826578438282,
-0.16918742656707764,
-0.00035276488051749766,
0.00006486894562840462,
0.033649805933237076,
0.024618467316031456,
-0.037786755710840225,
0.04028666764497757,
0.053860656917095184,
-0.048440661281347275,
-0.0870618224143982,
0.022345105186104774,
0.030500685796141624,
-0.08957608789205551,
-0.018685748800635338,
-0.08916555345058441,
-0.057423610240221024,
0.06542307138442993,
0.09592310339212418,
-0.10141187906265259,
-0.007908937521278858,
-0.021119369193911552,
-0.05750706046819687,
-0.06224203482270241,
-0.0646621510386467,
0.16023656725883484,
0.00792479794472456,
0.16578099131584167,
-0.14084236323833466,
-0.06077444925904274,
-0.026389190927147865,
0.005840100347995758,
0.027390673756599426,
0.16020944714546204,
0.00450095534324646,
-0.09518823772668839,
0.050402965396642685,
-0.03375127911567688,
-0.06408004462718964,
0.1882733851671219,
-0.014028022065758705,
-0.07950030267238617,
0.004819141700863838,
0.10947863012552261,
-0.01465789508074522,
0.1609179824590683,
-0.09054557979106903,
-0.0005959264817647636,
-0.0012947022914886475,
0.018051782622933388,
0.04205209016799927,
-0.1190887987613678,
0.021613597869873047,
0.03599860146641731,
-0.07469364255666733,
-0.030914245173335075,
-0.015317725017666817,
-0.04056384414434433,
0.03682226687669754,
0.009084716439247131,
0.02622591331601143,
-0.019088350236415863,
-0.029312264174222946,
-0.09365012496709824,
0.19359324872493744,
-0.08059383928775787,
-0.21621844172477722,
-0.1774912178516388,
0.0764978751540184,
-0.034898389130830765,
-0.01848405972123146,
0.03915544971823692,
-0.11430606991052628,
-0.06358829140663147,
-0.09564591944217682,
0.11092102527618408,
-0.11951887607574463,
-0.00954999215900898,
-0.02868977002799511,
0.059994205832481384,
0.04956035688519478,
-0.17093950510025024,
0.023425554856657982,
-0.0076098572462797165,
0.0045471410267055035,
-0.004259043373167515,
-0.06242450326681137,
0.08297696709632874,
0.11617914587259293,
-0.06125752627849579,
0.025817109271883965,
0.0007489898125641048,
0.15249241888523102,
-0.05497101694345474,
0.039735518395900726,
0.18695683777332306,
0.026474814862012863,
0.028936590999364853,
0.04798527806997299,
0.013468587771058083,
-0.09739718586206436,
0.061602722853422165,
0.06460622698068619,
-0.048883646726608276,
-0.23861709237098694,
-0.016322707757353783,
-0.07614213228225708,
0.06469246745109558,
0.11360723525285721,
0.06348535418510437,
-0.14895153045654297,
0.014797879382967949,
0.00040799268754199147,
0.1477757841348648,
-0.032348956912755966,
0.055572446435689926,
0.03840511664748192,
0.006147504784166813,
0.004819059278815985,
-0.09919459372758865,
0.017041102051734924,
0.0767572671175003,
0.12002572417259216,
0.20898018777370453,
-0.0854218527674675,
0.20373205840587616,
0.013460781425237656,
0.09247367829084396,
0.04140595719218254,
0.07567748427391052,
-0.1365179419517517,
0.010267460718750954,
0.005676595028489828,
-0.024057069793343544,
-0.057705964893102646,
0.04925832897424698,
-0.04771113023161888,
0.06634539365768433,
-0.0401100292801857,
0.005898943170905113,
0.018116621300578117,
0.19302304089069366,
0.05509179085493088,
-0.1680576056241989,
-0.12622371315956116,
0.023144064471125603,
-0.08986837416887283,
-0.1105266660451889,
0.07170253247022629,
0.22897042334079742,
-0.05662010610103607,
0.03148585557937622,
-0.008356385864317417,
0.1350603848695755,
-0.09965389966964722,
-0.020792318508028984,
0.030624819919466972,
0.0669887363910675,
0.010114981792867184,
0.12324240803718567,
-0.2613084614276886,
0.07308515161275864,
0.013484325259923935,
0.0835152193903923,
-0.012156537733972073,
0.06304588913917542,
-0.04509127512574196,
0.006862538866698742,
0.07573097944259644,
0.0096807312220335,
-0.0695943832397461,
-0.18357744812965393,
-0.038548026233911514,
0.02810828574001789,
0.03744616359472275,
-0.014920368790626526,
0.07535077631473541,
-0.020216211676597595,
0.04298735782504082,
-0.03628838062286377,
-0.14668625593185425,
-0.053667522966861725,
-0.13104897737503052,
-0.033981066197156906,
0.009594369679689407,
-0.046804722398519516,
-0.023571183905005455,
0.0327569954097271,
0.039376888424158096,
0.24568213522434235,
-0.14830942451953888,
-0.09492665529251099,
-0.09198115766048431,
0.06742467731237411,
0.13478069007396698,
-0.09412557631731033,
0.020803501829504967,
0.011839991435408592,
0.056521423161029816,
-0.04719008132815361,
-0.07076415419578552,
0.03261918947100639,
-0.05252448096871376,
-0.08490405976772308,
-0.028884466737508774,
0.11141445487737656,
-0.018445493653416634,
0.03693754971027374,
-0.0026113083586096764,
-0.07460928708314896,
-0.03845318779349327,
-0.13223807513713837,
-0.06736009567975998,
0.018179193139076233,
0.02645953558385372,
-0.01565621607005596,
-0.1075410544872284,
0.07427563518285751,
0.014544368721544743,
-0.09796083718538284,
0.0669572651386261,
0.16749447584152222,
-0.07150888442993164,
0.040413759648799896,
0.11463940888643265,
-0.058435577899217606,
-0.16130772233009338,
-0.0364004410803318,
0.04345877468585968,
0.08166228234767914,
-0.026530060917139053,
-0.155369833111763,
0.06439002603292465,
0.03271302953362465,
0.011695711873471737,
0.032228101044893265,
-0.28615981340408325,
-0.12813936173915863,
0.004502834752202034,
0.07599196583032608,
0.0664779394865036,
-0.11482633650302887,
-0.05276361480355263,
-0.06704696267843246,
-0.08537446707487106,
0.045540932565927505,
0.052483800798654556,
0.11788401752710342,
-0.048004791140556335,
0.03501024469733238,
0.03898073360323906,
-0.028894303366541862,
0.07322794198989868,
-0.012823507189750671,
0.0951499417424202,
-0.021113533526659012,
0.03384818136692047,
0.05529506504535675,
-0.05754709616303444,
0.18295784294605255,
-0.17401494085788727,
0.09129893034696579,
-0.1803465187549591,
-0.047442443668842316,
-0.026217903941869736,
0.0020763284992426634,
-0.0365135557949543,
-0.0581655316054821,
-0.10867276787757874,
0.024492822587490082,
0.03961445391178131,
-0.02708538807928562,
0.05980372056365013,
-0.03477244824171066,
-0.034946832805871964,
0.08882726728916168,
0.06599252671003342,
-0.016413463279604912,
-0.1360826939344406,
0.02837596833705902,
0.022517399862408638,
0.0934731513261795,
-0.2192734181880951,
0.017497019842267036,
0.1015729084610939,
0.02831427939236164,
0.10070116072893143,
0.0036747294943779707,
-0.09072893112897873,
0.039257586002349854,
0.06999991089105606,
-0.05506383255124092,
-0.0918082594871521,
-0.011787495575845242,
-0.04910454899072647,
-0.07883336395025253,
0.032700806856155396,
0.08940090984106064,
-0.05852813273668289,
-0.010465948842465878,
-0.002519243396818638,
0.017256608232855797,
-0.06399714201688766,
0.1806810200214386,
0.01738983392715454,
0.08028197288513184,
-0.0703742504119873,
0.08409363776445389,
0.10183259099721909,
-0.11675789952278137,
0.022935274988412857,
0.17706485092639923,
-0.08923780918121338,
-0.021961163729429245,
0.0531931109726429,
0.11654463410377502,
-0.027909940108656883,
-0.05747396871447563,
-0.09242319315671921,
-0.07265877723693848,
0.022869134321808815,
0.021974530071020126,
0.07374663650989532,
0.08502396196126938,
-0.033855196088552475,
0.005982079543173313,
-0.10432693362236023,
0.10141194611787796,
0.06341227144002914,
0.054905522614717484,
-0.1311589628458023,
0.1068425253033638,
0.05101775377988815,
0.0787888914346695,
0.002025717869400978,
0.017358746379613876,
-0.10652933269739151,
0.03674113377928734,
-0.0303209088742733,
0.040274232625961304,
-0.011711467988789082,
0.05094609409570694,
-0.04285465180873871,
0.03139267489314079,
-0.026306532323360443,
0.0552758127450943,
-0.03506721928715706,
-0.02847214601933956,
-0.038241393864154816,
0.03347142040729523,
-0.06861723214387894,
-0.015809496864676476,
0.012430752627551556,
-0.07674543559551239,
0.10053643584251404,
-0.0643872618675232,
-0.012198235839605331,
0.003002034965902567,
-0.00594171229749918,
0.06831083446741104,
0.03306811302900314,
0.036534491926431656,
-0.010826623067259789,
-0.009074356406927109,
0.03314604610204697,
0.015701502561569214,
-0.00603689718991518,
-0.004533702041953802,
0.07495817542076111,
-0.15404529869556427,
-0.07535095512866974,
-0.08196287602186203,
-0.08283372968435287,
-0.06290649622678757,
0.07964440435171127,
0.09541090577840805,
0.0838603600859642,
0.08041097968816757,
-0.036826688796281815,
0.0015120799653232098,
-0.15243655443191528,
-0.04614291340112686,
0.04908165708184242,
-0.013316899538040161,
-0.11388586461544037,
-0.031521011143922806,
0.04619937017560005,
-0.0452723354101181,
0.12527966499328613,
-0.014146034605801105,
0.057955823838710785,
-0.012881394475698471,
-0.05961509048938751,
-0.03361029550433159,
-0.0032335419673472643,
0.18107852339744568,
-0.10030080378055573,
0.01141815260052681,
0.003198738908395171,
0.010429066605865955,
0.03076835162937641,
0.17935799062252045,
0.07653822004795074,
0.12514451146125793,
0.042718056589365005,
0.08341768383979797,
-0.04757220298051834,
-0.027511902153491974,
-0.1554388403892517,
0.09508141875267029,
-0.02215869352221489,
0.047430992126464844,
-0.04251045361161232,
0.13579894602298737,
0.13786731660366058,
-0.14226096868515015,
0.10716806352138519,
0.018853887915611267,
-0.0915377214550972,
-0.04634421691298485,
-0.10503789782524109,
-0.05033634975552559,
-0.11194323748350143,
0.008616349659860134,
-0.10582444816827774,
0.02667233906686306,
0.08220566809177399,
0.04015776142477989,
-0.029579542577266693,
0.15272754430770874,
0.0025061366613954306,
-0.06440111249685287,
0.027685675770044327,
0.04472877085208893,
0.030491814017295837,
0.10602552443742752,
0.027399221435189247,
0.07022245973348618,
-0.06766846776008606,
0.06236022710800171,
0.03665708377957344,
0.0069702561013400555,
0.002403260674327612,
0.0074699330143630505,
0.0011535283410921693,
-0.0484304279088974,
0.008570108562707901,
0.06860342621803284,
0.1581052988767624,
0.05473664775490761,
-0.05437087267637253,
-0.04461603984236717,
0.2062831073999405,
-0.04877118766307831,
-0.05064164102077484,
-0.12008807808160782,
0.15901592373847961,
0.047248534858226776,
0.010441928170621395,
0.014020966365933418,
-0.0712025836110115,
-0.032608408480882645,
0.2160826474428177,
0.04213417321443558,
-0.017083387821912766,
-0.03142311051487923,
-0.011926282197237015,
-0.010100031271576881,
-0.033824168145656586,
0.14853613078594208,
0.005067664664238691,
0.2404753565788269,
0.012205258011817932,
-0.0025877703446894884,
-0.043020885437726974,
-0.04949067160487175,
-0.028617316856980324,
0.19727948307991028,
-0.04463456943631172,
0.02022196352481842,
-0.09493200480937958,
-0.013943348079919815,
0.026588084176182747,
-0.1177225187420845,
0.1208372563123703,
-0.12871937453746796,
-0.07704418152570724,
0.015105161815881729,
0.06773561984300613,
-0.03807300329208374,
0.032748881727457047,
-0.019546255469322205,
0.06096512824296951,
0.03866599500179291,
-0.02850043587386608,
-0.10350027680397034,
-0.1563286930322647,
0.04189460724592209,
-0.015727654099464417,
0.13140438497066498,
0.01419229619204998,
0.0712796077132225,
0.08316843211650848,
0.018229642882943153,
-0.07493415474891663,
0.10524995625019073,
0.03282811492681503,
-0.00401283847168088,
0.049695808440446854,
0.13173604011535645,
-0.04103164002299309,
0.15696483850479126,
0.011675311252474785,
-0.017761515453457832,
-0.027782263234257698,
-0.015137499198317528,
0.00015267476555891335,
-0.15958203375339508,
0.0032521153334528208,
-0.06442730873823166,
0.12942172586917877,
0.19548383355140686,
-0.03871417045593262,
-0.031013622879981995,
-0.05775502324104309,
0.0859842374920845,
-0.021595682948827744,
0.07873944193124771,
-0.000625701155513525,
-0.164497509598732,
0.01836315728724003,
0.015894528478384018,
0.01255734171718359,
-0.1878727227449417,
-0.050140462815761566,
-0.028259912505745888,
-0.026371169835329056,
-0.0824526771903038,
0.1416039615869522,
0.06498446315526962,
0.03633209317922592,
-0.035160910338163376,
-0.1883796751499176,
-0.0032373592257499695,
0.052660536020994186,
-0.13194292783737183,
-0.12152605503797531
] |
null | null | transformers |
# CodeTrans model for code comment generation java
Pretrained model on programming language java using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the code comment generation task for java functions/methods.
## Intended uses & limitations
The model can be used to generate a description for a java function, or it can be fine-tuned on other java code tasks. It works on unparsed and untokenized java code, but performance is better when the java code is tokenized.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_comment_generation_java_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_comment_generation_java_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/code%20comment%20generation/base_model.ipynb).
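Continuing from the snippet above, the pipeline also accepts a batch of functions plus the usual generation keyword arguments; the second input and the parameter values below are illustrative, not the settings behind the reported scores:
```python
functions = [
    "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }",
    "public int size ( ) { return count ; }",
]
# Each result is a dict holding the generated comment under "summary_text".
for result in pipeline(functions, max_length=32, num_beams=4):
    print(result["summary_text"])
```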
## Training data
The supervised training tasks datasets can be downloaded from [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
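The inverse square root rule keeps the learning rate constant during warm-up and then decays it with the step count. A minimal sketch of the schedule follows; the 10,000-step warm-up is the common T5 default, not a value stated in this card:
```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Constant 1/sqrt(warmup_steps) during warm-up, then 1/sqrt(step) decay.
    return 1.0 / math.sqrt(max(step, warmup_steps))
```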
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V3-8 for 80,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing java code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 37.98 |
| CodeTrans-ST-Base | 38.07 |
| CodeTrans-TF-Small | 38.56 |
| CodeTrans-TF-Base | 39.06 |
| CodeTrans-TF-Large | **39.50** |
| CodeTrans-MT-Small | 20.15 |
| CodeTrans-MT-Base | 27.44 |
| CodeTrans-MT-Large | 34.69 |
| CodeTrans-MT-TF-Small | 38.37 |
| CodeTrans-MT-TF-Base | 38.90 |
| CodeTrans-MT-TF-Large | 39.25 |
| State of the art | 38.17 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_comment_generation_java_transfer_learning_finetune | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code comment generation java
================================================
Pretrained model on programming language java using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the code comment generation task for java functions/methods.
Intended uses & limitations
---------------------------
The model can be used to generate a description for a java function, or it can be fine-tuned on other java code tasks. It works on unparsed and untokenized java code, but performance is better when the java code is tokenized.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded from Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V3-8 for 80,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing java code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 80,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 80,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
61,
87,
108
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 80,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.10578923672437668,
0.06177182123064995,
-0.0010463641956448555,
0.10627951472997665,
0.04134700447320938,
0.019679641351103783,
0.034546658396720886,
0.10904288291931152,
-0.012114070355892181,
0.06176331639289856,
0.04751165211200714,
-0.05429805442690849,
0.06746933609247208,
0.200022891163826,
0.011364925652742386,
-0.1366788148880005,
-0.02982041984796524,
0.04401329159736633,
-0.06725714355707169,
0.11257745325565338,
0.08023224771022797,
-0.08640307188034058,
0.07775194197893143,
-0.0526195652782917,
-0.12655586004257202,
0.043192196637392044,
-0.022436028346419334,
-0.030359484255313873,
0.09828439354896545,
0.06378240138292313,
0.11932720988988876,
-0.022340640425682068,
0.06230567768216133,
-0.184812530875206,
0.001908443053252995,
0.028881819918751717,
0.06613083183765411,
0.044964954257011414,
0.04818803444504738,
0.09418398141860962,
0.11788654327392578,
-0.020119423046708107,
0.03774065896868706,
0.05671194940805435,
-0.06637374311685562,
-0.054751742631196976,
-0.07035403698682785,
0.0690784677863121,
0.09827948361635208,
0.10246206074953079,
-0.006853996776044369,
0.04160648211836815,
-0.07714444398880005,
0.08496461808681488,
0.11747024953365326,
-0.2201344519853592,
-0.020757358521223068,
0.10454615950584412,
0.09225001931190491,
0.050221193581819534,
-0.08126215636730194,
-0.03744162619113922,
0.11130228638648987,
0.04194661229848862,
0.06749846786260605,
-0.09403441101312637,
-0.032406359910964966,
0.0006953144329600036,
-0.04873340576887131,
-0.048307206481695175,
0.18249769508838654,
0.0453573577105999,
-0.053515929728746414,
-0.10483894497156143,
-0.04269915819168091,
-0.18208646774291992,
0.04549810290336609,
0.0008431963506154716,
0.004307618364691734,
-0.014601003378629684,
0.01729607954621315,
-0.00034273494384251535,
-0.08933178335428238,
-0.1205863431096077,
0.03405752032995224,
-0.01184700708836317,
0.0560232512652874,
0.03233252093195915,
-0.035312801599502563,
0.08585646748542786,
0.042507316917181015,
-0.05278018116950989,
-0.0037587143015116453,
0.008174958638846874,
-0.10818053781986237,
0.007255434058606625,
0.0006039110012352467,
-0.07397341728210449,
-0.009160034358501434,
0.04876011237502098,
-0.09688335657119751,
0.07219958305358887,
0.09581094235181808,
0.02018420398235321,
0.009828426875174046,
0.208803191781044,
0.04023222625255585,
-0.16436153650283813,
0.02419234812259674,
0.03438157960772514,
-0.008224714547395706,
0.011195403523743153,
-0.05365638807415962,
-0.0553559772670269,
0.031623098999261856,
0.06861725449562073,
-0.13517385721206665,
0.023408377543091774,
-0.05608765780925751,
-0.01825731247663498,
0.07490265369415283,
-0.12940539419651031,
0.030404292047023773,
0.008458626456558704,
-0.06543046981096268,
-0.04146551340818405,
0.09123382717370987,
-0.13290397822856903,
-0.12449967116117477,
0.014319752342998981,
-0.047951649874448776,
-0.0397547222673893,
-0.12529277801513672,
-0.1165839359164238,
-0.004164184909313917,
-0.03322555497288704,
-0.0019925280939787626,
-0.09691856056451797,
-0.09090465307235718,
-0.02902483008801937,
0.038987282663583755,
-0.0043806214816868305,
-0.02933894656598568,
-0.04690811410546303,
0.00642023328691721,
-0.007415411528199911,
-0.029589511454105377,
0.02163774147629738,
-0.0332811139523983,
0.10144039988517761,
0.06398533284664154,
0.053321223706007004,
0.00447758287191391,
0.0312548503279686,
-0.08264969289302826,
0.07965300232172012,
-0.13116319477558136,
0.05535950884222984,
-0.011769719421863556,
0.065629743039608,
-0.10422860831022263,
-0.07548807561397552,
0.02764623984694481,
0.052972305566072464,
0.0755181536078453,
0.038220666348934174,
-0.14894837141036987,
0.030726000666618347,
0.13866859674453735,
-0.1076517105102539,
-0.14888639748096466,
0.10718031972646713,
-0.011814791709184647,
0.05339426174759865,
0.06437534838914871,
0.13424541056156158,
0.14119140803813934,
-0.08547590672969818,
-0.03116159699857235,
0.06935815513134003,
0.04526643827557564,
-0.07456526905298233,
0.052693258970975876,
0.023508159443736076,
-0.006343090906739235,
0.0181327685713768,
0.06384563446044922,
0.059736188501119614,
-0.0022214138880372047,
-0.03507540374994278,
-0.031995147466659546,
-0.09218056499958038,
-0.07123477011919022,
-0.0048148720525205135,
0.022503508254885674,
-0.05782962962985039,
-0.056690942496061325,
0.007281105034053326,
0.16302570700645447,
-0.09902755916118622,
0.022542983293533325,
-0.07716403156518936,
-0.0453345850110054,
-0.07284476608037949,
0.023614540696144104,
-0.11369595676660538,
0.03458087518811226,
0.061020951718091965,
-0.02699745073914528,
0.04430630803108215,
0.09426143020391464,
0.004362522624433041,
0.015783069655299187,
-0.048838574439287186,
-0.03959439694881439,
-0.03559916093945503,
-0.06479940563440323,
-0.11337915062904358,
-0.03181340917944908,
-0.09191656857728958,
-0.018904883414506912,
-0.05829375609755516,
-0.17880019545555115,
-0.00153013551607728,
-0.02026602253317833,
0.03422713279724121,
0.026074964553117752,
-0.02989414893090725,
0.038586657494306564,
0.04679242521524429,
-0.045783527195453644,
-0.08514251559972763,
0.022105231881141663,
0.03564465418457985,
-0.08006970584392548,
-0.023855645209550858,
-0.09314592182636261,
-0.06647003442049026,
0.07638995349407196,
0.09037850052118301,
-0.11898966878652573,
-0.004204264376312494,
-0.026063961908221245,
-0.05344151332974434,
-0.04840339347720146,
-0.059272509068250656,
0.15108445286750793,
0.01351319532841444,
0.16202878952026367,
-0.1452237367630005,
-0.06742238253355026,
-0.02524256519973278,
0.013403023593127728,
0.03781238943338394,
0.15726256370544434,
0.021302536129951477,
-0.11946544051170349,
0.0347866527736187,
-0.02687724679708481,
-0.04762348532676697,
0.16410860419273376,
-0.020103251561522484,
-0.07540609687566757,
0.0029057394713163376,
0.10391902923583984,
-0.00157876405864954,
0.18475687503814697,
-0.0633455142378807,
-0.0013088661944493651,
-0.0036524615716189146,
0.012378796003758907,
0.038077495992183685,
-0.1196761503815651,
0.021150702610611916,
0.03528483584523201,
-0.06737004965543747,
-0.01714482344686985,
-0.021908894181251526,
-0.034443579614162445,
0.044871557503938675,
0.015393506735563278,
0.015263902954757214,
-0.015972912311553955,
-0.03422190248966217,
-0.10340879112482071,
0.1767333745956421,
-0.07204427570104599,
-0.2197648286819458,
-0.16914045810699463,
0.10862541198730469,
-0.010120412334799767,
-0.01667082868516445,
0.02562575973570347,
-0.09048730880022049,
-0.06336257606744766,
-0.10142681747674942,
0.12053333967924118,
-0.0987960547208786,
-0.004585071932524443,
-0.021200546994805336,
0.06455232948064804,
0.05422074720263481,
-0.16697053611278534,
0.031376611441373825,
-0.01184791512787342,
0.020539095625281334,
-0.014947536401450634,
-0.06850277632474899,
0.08209352195262909,
0.11296065151691437,
-0.07904115319252014,
0.020121505483984947,
0.0011261922772973776,
0.16563987731933594,
-0.05950038135051727,
0.05995727702975273,
0.17307783663272858,
0.01172475516796112,
0.01826004683971405,
0.05318973585963249,
0.0027522582095116377,
-0.09589847922325134,
0.0696982741355896,
0.04838370159268379,
-0.0274526234716177,
-0.2281150370836258,
-0.015215408056974411,
-0.06826373189687729,
0.07653354108333588,
0.11678264290094376,
0.045662771910429,
-0.14827419817447662,
0.023516060784459114,
-0.0009965747594833374,
0.16400614380836487,
-0.030920032411813736,
0.05385308340191841,
-0.004815428983420134,
0.012950021773576736,
-0.005733322352170944,
-0.09977421164512634,
0.013944392092525959,
0.07085692137479782,
0.10815919190645218,
0.19710156321525574,
-0.09596780687570572,
0.1706172525882721,
0.012058095075190067,
0.11036091297864914,
0.04060506820678711,
0.11103655397891998,
-0.13114458322525024,
0.012964422814548016,
0.002593227429315448,
-0.01879984699189663,
-0.07310670614242554,
0.04526102915406227,
-0.049209464341402054,
0.08589143306016922,
-0.07048072665929794,
0.01942828856408596,
0.012273564003407955,
0.19003021717071533,
0.07676906883716583,
-0.16881237924098969,
-0.1283014565706253,
0.012282945215702057,
-0.09322943538427353,
-0.11957082897424698,
0.07359578460454941,
0.23691049218177795,
-0.054010987281799316,
0.005009841173887253,
-0.01311501394957304,
0.1356048583984375,
-0.0946589782834053,
-0.019076362252235413,
0.03374112397432327,
0.06109991297125816,
0.0043482049368321896,
0.11891186982393265,
-0.2672259509563446,
0.08147309720516205,
0.01789180003106594,
0.09111401438713074,
-0.017165785655379295,
0.052865393459796906,
-0.049625881016254425,
-0.002326976042240858,
0.08134529739618301,
0.01088965218514204,
-0.041506439447402954,
-0.1897241324186325,
-0.04427532106637955,
0.0237050112336874,
0.032694388180971146,
-0.0038306706119328737,
0.08537901192903519,
-0.015360611490905285,
0.041375309228897095,
-0.027566978707909584,
-0.1202843189239502,
-0.06784749776124954,
-0.12387114763259888,
-0.0426320917904377,
0.0023116504307836294,
-0.046421658247709274,
-0.026690037921071053,
0.0425855815410614,
0.04276515170931816,
0.22846932709217072,
-0.1441890001296997,
-0.07631196826696396,
-0.0867144986987114,
0.055739473551511765,
0.13897430896759033,
-0.0821889191865921,
0.014742188155651093,
0.025931505486369133,
0.04967170208692551,
-0.04146342724561691,
-0.062033385038375854,
0.03803597763180733,
-0.05385550111532211,
-0.07776794582605362,
-0.03707769885659218,
0.10601621121168137,
-0.0077511402778327465,
0.045896466821432114,
0.012453427538275719,
-0.08878948539495468,
-0.03248387947678566,
-0.12285300344228745,
-0.07621698826551437,
-0.017457202076911926,
0.05326789245009422,
-0.01759282685816288,
-0.126814603805542,
0.07460827380418777,
-0.005983686540275812,
-0.0944303497672081,
0.06549353897571564,
0.1549845039844513,
-0.06863071024417877,
0.0256887786090374,
0.0846920982003212,
-0.05667916685342789,
-0.17235113680362701,
-0.027584923431277275,
0.04006237909197807,
0.08687835931777954,
-0.03171014040708542,
-0.1303723305463791,
0.06114289537072182,
0.0024815264623612165,
0.02167169749736786,
0.022342512384057045,
-0.24885578453540802,
-0.12623243033885956,
0.007831108756363392,
0.07759223133325577,
0.046101152896881104,
-0.10072363168001175,
-0.04385976493358612,
-0.062465135008096695,
-0.0780298262834549,
0.07248105108737946,
0.06257441639900208,
0.10740151256322861,
-0.03957170248031616,
0.029639923945069313,
0.04477853700518608,
-0.03205787017941475,
0.057750117033720016,
-0.015909194946289062,
0.10572837293148041,
-0.024725735187530518,
0.008330918848514557,
0.039354532957077026,
-0.06717172265052795,
0.18939520418643951,
-0.17167794704437256,
0.09859422594308853,
-0.19578267633914948,
-0.04079856351017952,
-0.034032173454761505,
0.0005220260936766863,
-0.041375305503606796,
-0.04937073960900307,
-0.11487630754709244,
0.04349798336625099,
0.05054473876953125,
-0.025152510032057762,
0.04001389816403389,
-0.01798110082745552,
-0.04164297506213188,
0.07020791620016098,
0.09041149914264679,
-0.012184271588921547,
-0.11014874279499054,
0.041172076016664505,
0.023079868406057358,
0.1025865450501442,
-0.1949702948331833,
0.026118271052837372,
0.1078815683722496,
0.009406487457454205,
0.1004621759057045,
0.005543556064367294,
-0.09174596518278122,
0.01677718758583069,
0.06504576653242111,
-0.07370161265134811,
-0.06594738364219666,
-0.01706843078136444,
-0.023767394945025444,
-0.08382416516542435,
0.025328241288661957,
0.08674345910549164,
-0.06471565365791321,
-0.009036931209266186,
-0.006035116966813803,
0.010975562036037445,
-0.07617390155792236,
0.18203790485858917,
0.02189571037888527,
0.08670084178447723,
-0.05569169297814369,
0.08388321846723557,
0.09608262777328491,
-0.11756511777639389,
0.03518906608223915,
0.16702407598495483,
-0.09071341156959534,
-0.02031830884516239,
0.09197627753019333,
0.14721140265464783,
-0.020069530233740807,
-0.054908864200115204,
-0.0990910530090332,
-0.08014100790023804,
0.022997545078396797,
0.05978035926818848,
0.06336996704339981,
0.09149052202701569,
-0.01661062426865101,
-0.006551158614456654,
-0.12990204989910126,
0.1000695452094078,
0.08025321364402771,
0.04383339732885361,
-0.12919586896896362,
0.14128927886486053,
0.033209919929504395,
0.08185277134180069,
-0.00016781197336968035,
0.03858928009867668,
-0.11079927533864975,
0.03586261719465256,
-0.03687747195363045,
0.040679167956113815,
0.0008402985404245555,
0.041233841329813004,
-0.043881941586732864,
0.04236713424324989,
-0.024984821677207947,
0.04722951725125313,
-0.03591961786150932,
-0.024082530289888382,
-0.03653503581881523,
0.030466333031654358,
-0.04919484630227089,
-0.022640708833932877,
0.0063101015985012054,
-0.08624409884214401,
0.08952472358942032,
-0.06952749937772751,
-0.013530180789530277,
-0.005643768701702356,
0.015662090852856636,
0.05467803776264191,
0.01061931625008583,
0.0466025248169899,
0.00478695472702384,
-0.007378087844699621,
0.026310905814170837,
0.018207643181085587,
-0.011533987708389759,
-0.010441998019814491,
0.07470563799142838,
-0.13926126062870026,
-0.07983320951461792,
-0.08053649216890335,
-0.06806687265634537,
-0.06650908291339874,
0.08432311564683914,
0.08701735734939575,
0.07406824827194214,
0.08623757213354111,
-0.034761201590299606,
0.005964912008494139,
-0.16725830733776093,
-0.04077282175421715,
0.05298620089888573,
-0.006560169160366058,
-0.1006571352481842,
-0.0346250981092453,
0.06198347359895706,
-0.039596375077962875,
0.10050119459629059,
-0.006831891369074583,
0.03736243024468422,
-0.009101473726332188,
-0.07680245488882065,
-0.05667085945606232,
0.007841561920940876,
0.18482692539691925,
-0.11269082874059677,
0.0113295279443264,
-0.009023211896419525,
0.007960356771945953,
0.02854773961007595,
0.16743533313274384,
0.1035582423210144,
0.13869553804397583,
0.03167346119880676,
0.08543600887060165,
-0.04647442698478699,
-0.035129278898239136,
-0.11451616883277893,
0.0834626778960228,
-0.02054716646671295,
0.04578086733818054,
-0.0360177680850029,
0.13597749173641205,
0.09551449865102768,
-0.14073967933654785,
0.10079112648963928,
-0.003259766148403287,
-0.09983433783054352,
-0.03641730919480324,
-0.07908045500516891,
-0.04370142146945,
-0.10111735016107559,
0.002884251531213522,
-0.10640732944011688,
0.007841424085199833,
0.0529487319290638,
0.03514916077256203,
-0.0365259163081646,
0.1506982296705246,
-0.023559292778372765,
-0.053615324199199677,
0.04402103275060654,
0.05030661076307297,
0.020880529657006264,
0.07581117004156113,
0.03175986558198929,
0.06144697591662407,
-0.06313735991716385,
0.06083579361438751,
0.029255138710141182,
0.002564012771472335,
0.01278601959347725,
0.036478087306022644,
-0.009409306570887566,
-0.04302992299199104,
-0.012888739816844463,
0.08324407041072845,
0.15186189115047455,
0.04539310187101364,
-0.04013888165354729,
-0.051359791308641434,
0.1908944696187973,
-0.05772921442985535,
-0.055547744035720825,
-0.12128140777349472,
0.1500096172094345,
0.03367965295910835,
0.009718912653625011,
0.018114589154720306,
-0.07294681668281555,
-0.02200750820338726,
0.24497132003307343,
0.04019204527139664,
-0.04863246530294418,
-0.03526190295815468,
-0.012132183648645878,
-0.008394082076847553,
-0.05489703267812729,
0.1523198038339615,
0.016736604273319244,
0.2221604436635971,
0.012806067243218422,
-0.00701715424656868,
-0.0467769093811512,
-0.04676150158047676,
-0.000745566445402801,
0.18649975955486298,
-0.03442278504371643,
0.02916470356285572,
-0.09269164502620697,
-0.017734242603182793,
0.021993400529026985,
-0.15374520421028137,
0.11796838045120239,
-0.13168932497501373,
-0.0725802555680275,
0.005837431643158197,
0.06581484526395798,
-0.04926025867462158,
0.04458317160606384,
-0.01984579488635063,
0.07535982877016068,
0.053785573691129684,
-0.02917960099875927,
-0.09519819170236588,
-0.1410323977470398,
0.045933354645967484,
-0.025179311633110046,
0.12927031517028809,
0.013348253443837166,
0.08142681419849396,
0.0810496062040329,
0.005780497100204229,
-0.08377072960138321,
0.09051711857318878,
0.03313719108700752,
0.01297022495418787,
0.05367923900485039,
0.1265823394060135,
-0.041329894214868546,
0.16721270978450775,
0.004574198741465807,
-0.03633003309369087,
-0.03330516815185547,
-0.04081070423126221,
-0.013643879443407059,
-0.16031980514526367,
0.0023877413477748632,
-0.0539884939789772,
0.14430201053619385,
0.1957222819328308,
-0.050646211951971054,
-0.014915965497493744,
-0.05538414791226387,
0.08664611726999283,
-0.012166532687842846,
0.07884998619556427,
0.002914187265560031,
-0.15672841668128967,
0.010736075229942799,
-0.027602946385741234,
0.008614989928901196,
-0.1970195323228836,
-0.04074773192405701,
-0.042472049593925476,
-0.04452075809240341,
-0.09895016998052597,
0.1462591290473938,
0.06656481325626373,
0.04376378655433655,
-0.04337482154369354,
-0.12595851719379425,
-0.010316151194274426,
0.05594661459326744,
-0.12424181401729584,
-0.12239664793014526
] |
null | null | transformers |
# CodeTrans model for code documentation generation go
Pretrained model on programming language go using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized go code functions: it works best with tokenized go functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used single-task training on the CodeSearchNet Corpus go dataset.
## Intended uses & limitations
The model can be used to generate a description for a go function, or it can be fine-tuned on other go code tasks. It works on unparsed and untokenized go code, but performance is better when the go code is tokenized.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_go"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_go", skip_special_tokens=True),
device=0
)
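# Example input: a go function pre-tokenized with a space between all tokens.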
tokenized_code = "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/function%20documentation%20generation/go/base_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded from [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
|   CodeTrans-ST-Small    |     17.31      |     16.65      |     16.89      |     23.05      |      9.19      |     13.70      |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_go | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation go
====================================================
Pretrained model on the go programming language using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized go code functions: it works best with tokenized go functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used single-task training on the CodeSearchNet Corpus go dataset.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded from Link
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
111
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.1305740624666214,
-0.001536276307888329,
-0.0005928365862928331,
0.057752467691898346,
0.15307648479938507,
0.016240764409303665,
0.10347497463226318,
0.0405380018055439,
-0.00245728250592947,
-0.03848239406943321,
0.08634305745363235,
0.13428667187690735,
0.021736083552241325,
0.13678644597530365,
-0.02379005216062069,
-0.20039023458957672,
-0.0231971126049757,
0.06319737434387207,
-0.16838021576404572,
0.1269746869802475,
0.11521846801042557,
-0.05123274028301239,
0.09449580311775208,
0.007601823657751083,
-0.19779285788536072,
0.04573289677500725,
-0.005628336686640978,
-0.07600626349449158,
0.14423483610153198,
0.08849720656871796,
0.11097340285778046,
0.02551395073533058,
0.004209680482745171,
-0.19190627336502075,
0.03829766809940338,
-0.027505509555339813,
-0.0018536653369665146,
0.052532318979501724,
0.03838716074824333,
-0.05812481418251991,
0.20889343321323395,
0.007043040357530117,
0.012061215937137604,
0.0654730349779129,
-0.1105181872844696,
-0.07155917584896088,
-0.015301277860999107,
-0.007541670463979244,
0.07561913132667542,
0.08376073837280273,
0.016701748594641685,
0.1082598865032196,
-0.1373218446969986,
0.13040691614151,
0.0994233787059784,
-0.1797366440296173,
-0.02207821235060692,
0.10585751384496689,
0.09999801218509674,
-0.06336317956447601,
-0.04484592005610466,
0.021915512159466743,
0.07377739995718002,
0.019191423431038857,
0.02216506376862526,
-0.11449835449457169,
-0.13119228184223175,
0.05960915982723236,
-0.07787511497735977,
-0.05632990598678589,
0.26738840341567993,
-0.00036008734605275095,
-0.039746496826410294,
-0.06661273539066315,
-0.04265927895903587,
0.012592862360179424,
0.002200973453000188,
0.007296379189938307,
-0.002611152594909072,
-0.017451703548431396,
-0.02296188659965992,
-0.04019221290946007,
-0.1160086989402771,
-0.12920750677585602,
-0.004421847406774759,
0.09837908297777176,
-0.011919255368411541,
0.032984644174575806,
-0.16841788589954376,
0.1021563857793808,
0.07929874956607819,
-0.07365693151950836,
0.016078930348157883,
-0.05959511548280716,
-0.04523163661360741,
-0.01678507961332798,
-0.06883779913187027,
-0.13521525263786316,
0.0835840180516243,
0.07811698317527771,
-0.04751210659742355,
0.04900570958852768,
0.03329010307788849,
0.07375609129667282,
0.05080727860331535,
0.19232682883739471,
0.0016203389968723059,
-0.06920599192380905,
0.04502689838409424,
-0.030881443992257118,
-0.04418344423174858,
-0.004404961131513119,
-0.07631026208400726,
-0.03940526768565178,
0.016977068036794662,
0.12048440426588058,
-0.08829441666603088,
0.0734417662024498,
-0.0693669244647026,
-0.04071054980158806,
0.0183318629860878,
-0.13043615221977234,
-0.02892308123409748,
0.019464338198304176,
-0.05772261321544647,
-0.05438899248838425,
0.12296406179666519,
-0.07011373341083527,
-0.10839149355888367,
-0.007921811193227768,
-0.0691855400800705,
-0.0021712956950068474,
-0.09576688706874847,
-0.07249964773654938,
0.0052142515778541565,
0.02596447989344597,
0.07088510692119598,
-0.12307821214199066,
-0.166789710521698,
0.001689121825620532,
0.08843676745891571,
0.009665869176387787,
0.03454995155334473,
-0.08913630992174149,
-0.013716435991227627,
-0.033180635422468185,
-0.021595243364572525,
0.02914874441921711,
-0.06990744173526764,
0.08575762063264847,
0.08795399218797684,
0.037520427256822586,
-0.05973232537508011,
0.04425359144806862,
-0.11960764974355698,
0.06150435656309128,
-0.17120948433876038,
0.08127877861261368,
-0.046565547585487366,
0.10504702478647232,
-0.0943060964345932,
-0.05612112209200859,
0.04444438964128494,
0.06836895644664764,
0.06190604344010353,
0.14158064126968384,
-0.09343307465314865,
-0.07086613029241562,
0.13028092682361603,
-0.10746628791093826,
-0.21819856762886047,
0.07131563872098923,
-0.06608860194683075,
0.18596577644348145,
0.05598190799355507,
0.16252976655960083,
0.18471641838550568,
-0.09553492814302444,
0.06505260616540909,
0.08474840223789215,
-0.06801070272922516,
-0.08912947028875351,
0.05595678091049194,
0.057919133454561234,
-0.12593449652194977,
0.05318603664636612,
-0.012155530974268913,
0.12224314361810684,
-0.04522550478577614,
-0.048422157764434814,
-0.00535226846113801,
-0.07468993961811066,
0.05284133553504944,
-0.01253866869956255,
0.0905207023024559,
-0.008428119122982025,
-0.013044314458966255,
0.03404989838600159,
0.1000220850110054,
-0.10642031580209732,
-0.0058611249551177025,
-0.11876475065946579,
0.06940217316150665,
-0.11803528666496277,
0.027304936200380325,
-0.2145960032939911,
-0.014186758548021317,
-0.001171562005765736,
0.042881108820438385,
0.06493803858757019,
0.03422807529568672,
0.018291594460606575,
0.019791854545474052,
-0.0067564500495791435,
-0.005239933263510466,
-0.003131341887637973,
-0.021092955023050308,
-0.03832285478711128,
-0.10490678250789642,
-0.04445014148950577,
-0.055904537439346313,
0.024808771908283234,
-0.18173813819885254,
0.0030457633547484875,
0.0475335456430912,
0.0592452697455883,
0.015325761400163174,
0.025116901844739914,
0.028329845517873764,
0.05834192782640457,
-0.04698362201452255,
-0.015075773932039738,
0.06976282596588135,
0.031821027398109436,
-0.12326686829328537,
0.04543894901871681,
-0.06811615079641342,
0.058741793036460876,
0.11708506941795349,
-0.15253004431724548,
-0.07095914334058762,
-0.05961533635854721,
-0.03885438293218613,
-0.020282160490751266,
0.013845120556652546,
-0.024431059136986732,
0.23086872696876526,
0.0076032234355807304,
0.1681898534297943,
-0.09956032782793045,
-0.04410731792449951,
-0.02457272820174694,
-0.011523633264005184,
0.02893521636724472,
0.138804093003273,
0.11218208074569702,
-0.19908034801483154,
0.03973936662077904,
0.08752129226922989,
-0.0218991469591856,
0.2150958925485611,
-0.0396268367767334,
-0.033261626958847046,
-0.029283467680215836,
0.06407146155834198,
-0.02350614406168461,
0.1755300909280777,
-0.219285786151886,
-0.02260003238916397,
0.0057699838653206825,
-0.01608825847506523,
0.12145530432462692,
-0.13617071509361267,
-0.0006475838017649949,
0.028043998405337334,
-0.03075435943901539,
-0.1254659742116928,
0.04158554598689079,
0.00041420350316911936,
0.036195747554302216,
-0.010167376138269901,
-0.014051608741283417,
0.04329006001353264,
-0.03372902423143387,
-0.13015437126159668,
0.23332823812961578,
-0.06727183610200882,
-0.23339250683784485,
-0.19360066950321198,
0.0555756613612175,
-0.0365544818341732,
0.0016647636657580733,
0.05555139109492302,
-0.05815541371703148,
-0.03129637613892555,
-0.02404516562819481,
0.15386538207530975,
-0.044763050973415375,
-0.03054937534034252,
-0.012762513943016529,
0.07406214624643326,
-0.006546338088810444,
-0.1886359006166458,
-0.007944336161017418,
-0.0027477361727505922,
0.0564122349023819,
0.016866542398929596,
-0.1380755454301834,
0.10744542628526688,
0.10022799670696259,
-0.037628211081027985,
0.046258147805929184,
-0.041539765894412994,
0.23057536780834198,
-0.07839520275592804,
-0.07072210311889648,
0.18812096118927002,
-0.0850873664021492,
0.014004330150783062,
0.012279890477657318,
0.009927317500114441,
-0.11330681294202805,
0.031399279832839966,
-0.03836512565612793,
-0.07261180132627487,
-0.24461990594863892,
-0.10318231582641602,
-0.08955243229866028,
0.09526600688695908,
0.03001771867275238,
0.026525773108005524,
-0.06487542390823364,
0.06551409512758255,
0.06446918845176697,
0.10926029831171036,
-0.005082296673208475,
0.05282200500369072,
0.05603354424238205,
-0.006409443914890289,
-0.002277011750265956,
-0.1074008196592331,
-0.055033836513757706,
0.04298017919063568,
0.08915170282125473,
0.20211616158485413,
0.0015253277961164713,
0.14452703297138214,
0.05578801780939102,
0.025468720123171806,
0.027807818725705147,
0.19070470333099365,
-0.0911194235086441,
0.01555985864251852,
-0.006306934170424938,
-0.035512540489435196,
-0.13411958515644073,
0.03825877979397774,
-0.012907605618238449,
0.022698355838656425,
-0.1430777609348297,
-0.047494035214185715,
0.06440355628728867,
0.0811474397778511,
-0.009144102223217487,
-0.2596582770347595,
-0.12588433921337128,
0.02387578971683979,
-0.051688093692064285,
-0.06562033295631409,
0.058542221784591675,
0.12628409266471863,
-0.11918618530035019,
0.008885812014341354,
-0.051638733595609665,
0.16078951954841614,
-0.07190486043691635,
0.02141321450471878,
-0.05837496742606163,
-0.03349977731704712,
0.0031296347733587027,
0.1601579189300537,
-0.2008456587791443,
0.22613677382469177,
0.003485820721834898,
0.03440289944410324,
-0.08447521179914474,
0.027588849887251854,
0.02003948576748371,
0.07967568933963776,
0.12840621173381805,
-0.02530241571366787,
-0.03412029892206192,
-0.15637315809726715,
0.028862733393907547,
0.07696479558944702,
0.08083286881446838,
-0.036588240414857864,
0.06750606745481491,
-0.017965497449040413,
0.02682211995124817,
-0.009648654609918594,
-0.11289588361978531,
-0.09499853849411011,
-0.10146941989660263,
-0.0002855109632946551,
-0.03463665395975113,
0.04003310203552246,
-0.02772645838558674,
0.010251537896692753,
0.06734120845794678,
0.18765679001808167,
-0.09639225900173187,
-0.061547812074422836,
-0.1143869087100029,
0.020302949473261833,
0.10926157981157303,
-0.08493746817111969,
0.022576479241251945,
0.007170788943767548,
0.028581151738762856,
-0.00900499988347292,
-0.14164939522743225,
0.06926410645246506,
-0.07561299949884415,
-0.011402692645788193,
-0.0341072753071785,
0.11554013192653656,
-0.0067319078370928764,
-0.009580317884683609,
0.04455043375492096,
-0.07528458535671234,
-0.05864742398262024,
-0.14960575103759766,
-0.08087347447872162,
-0.06863117963075638,
0.03799286112189293,
0.05310600996017456,
-0.1327892392873764,
0.022909218445420265,
-0.013090485706925392,
-0.031230902299284935,
0.20586596429347992,
0.11694502830505371,
-0.032139603048563004,
0.022311203181743622,
0.1314665824174881,
-0.0973023995757103,
-0.250497967004776,
-0.007304079830646515,
-0.02199728973209858,
0.08612018078565598,
0.012252682819962502,
-0.1452939659357071,
0.09560225158929825,
-0.038547445088624954,
0.03528788685798645,
0.04907767102122307,
-0.2606939375400543,
-0.11264250427484512,
0.12173369526863098,
0.1300930380821228,
0.09715650975704193,
-0.10220164060592651,
-0.061461467295885086,
-0.08069785684347153,
-0.17926694452762604,
0.1498131901025772,
-0.10182469338178635,
0.09951938688755035,
0.002245514187961817,
0.053813692182302475,
0.023157531395554543,
-0.05926167219877243,
0.11732221394777298,
0.002369206864386797,
0.08977871388196945,
-0.019735293462872505,
-0.11456586420536041,
0.14259503781795502,
-0.03135295212268829,
0.12414165586233139,
-0.10315001010894775,
0.08465199172496796,
-0.21248012781143188,
-0.04910813271999359,
-0.04138840734958649,
0.056402355432510376,
-0.011096634902060032,
-0.0630398616194725,
-0.056769803166389465,
0.014500231482088566,
0.01853727363049984,
0.0018763660918921232,
0.08137498795986176,
-0.050823044031858444,
-0.01033120695501566,
0.09886344522237778,
0.16052119433879852,
-0.026068922132253647,
-0.05682605877518654,
0.035537153482437134,
0.016954269260168076,
0.10972937941551208,
-0.21049970388412476,
0.07803498953580856,
0.12640529870986938,
0.03245534375309944,
0.10091803222894669,
0.09031806886196136,
-0.03334743157029152,
0.04445189982652664,
0.09646061807870865,
-0.13233335316181183,
-0.06333586573600769,
-0.07327014952898026,
-0.08649828284978867,
-0.0012754520867019892,
0.10454213619232178,
0.15172620117664337,
-0.04079530015587807,
0.004250272177159786,
-0.011058005504310131,
-0.026696765795350075,
-0.13649483025074005,
0.131781205534935,
0.03670816496014595,
0.07163078337907791,
-0.07875912636518478,
0.06660746037960052,
0.04079888015985489,
-0.14141057431697845,
-0.027048522606492043,
0.09472790360450745,
-0.11868826299905777,
-0.07787942886352539,
-0.01909167505800724,
0.27727010846138,
-0.12236838042736053,
-0.09023813903331757,
-0.14563828706741333,
-0.06439707428216934,
0.0029632439836859703,
0.23237180709838867,
0.09624150395393372,
0.08391645550727844,
-0.06334330141544342,
0.005605854094028473,
-0.10090510547161102,
0.05526295676827431,
0.089497409760952,
-0.0018138113664463162,
-0.11082585155963898,
0.09663373231887817,
0.0008401903905905783,
0.15462379157543182,
-0.06625483185052872,
-0.03463286533951759,
-0.17861266434192657,
0.09058396518230438,
-0.09779512882232666,
0.05200584605336189,
-0.06388288736343384,
0.03385213017463684,
0.006802236661314964,
0.0018453667871654034,
-0.043400220572948456,
0.04627086594700813,
-0.08558633178472519,
0.01854366436600685,
0.0035984686110168695,
0.06684158742427826,
-0.0903913602232933,
-0.006721809972077608,
0.08417186886072159,
-0.07188931107521057,
0.10581223666667938,
0.03490222245454788,
-0.06760949641466141,
0.10208511352539062,
-0.1714564561843872,
-0.015254287049174309,
0.03433467820286751,
0.02411489561200142,
0.04186641052365303,
-0.048637326806783676,
0.040535036474466324,
0.01888425648212433,
0.03270094469189644,
-0.0063417875207960606,
0.10854287445545197,
-0.12690222263336182,
-0.09172374755144119,
-0.020586732774972916,
-0.11860791593790054,
-0.0449027381837368,
0.028633510693907738,
0.0337211899459362,
0.07816356420516968,
0.08387017995119095,
-0.017397195100784302,
0.042411644011735916,
-0.08556482940912247,
-0.010010638274252415,
0.0542062483727932,
-0.07498496770858765,
-0.04791194573044777,
-0.10288385301828384,
0.04384240508079529,
-0.05413348600268364,
0.20575156807899475,
-0.012233654968440533,
0.14863009750843048,
-0.013651486486196518,
-0.014057681895792484,
0.021608971059322357,
0.045896511524915695,
0.23540231585502625,
-0.03382575511932373,
0.05727936699986458,
-0.06319757550954819,
0.07082857191562653,
0.0227362010627985,
0.04324840009212494,
0.10675454139709473,
0.08102831989526749,
-0.019004810601472855,
0.11029982566833496,
0.023747432976961136,
0.02667415514588356,
-0.07787494361400604,
-0.10760360956192017,
0.08576703071594238,
0.04591166600584984,
-0.03965363651514053,
0.08362548798322678,
0.12023937702178955,
-0.08953656256198883,
0.09944610297679901,
0.0031782451551407576,
-0.09538371860980988,
-0.038463544100522995,
-0.013542991131544113,
-0.03765561804175377,
-0.1341780126094818,
0.015187326818704605,
-0.11436082422733307,
-0.07403147220611572,
0.0570279099047184,
0.024748552590608597,
-0.05896846577525139,
0.2159520983695984,
-0.023185942322015762,
-0.06429194658994675,
0.0593782439827919,
-0.014010777696967125,
0.024839352816343307,
-0.005386857315897942,
0.06405293196439743,
-0.011257216334342957,
-0.018768545240163803,
0.008557773195207119,
0.03480442613363266,
-0.06807819753885269,
0.021891947835683823,
-0.05017554759979248,
-0.025192495435476303,
-0.04422607272863388,
0.04199495539069176,
-0.002738822251558304,
0.03793569281697273,
0.0059901452623307705,
-0.029413534328341484,
-0.01100857648998499,
0.21319767832756042,
-0.05996967852115631,
-0.08851315081119537,
-0.13671621680259705,
0.2142811119556427,
0.039944667369127274,
0.051314275711774826,
0.002939465921372175,
-0.06887925416231155,
-0.03677211329340935,
0.29467350244522095,
0.20846261084079742,
-0.07150891423225403,
0.006576078478246927,
0.021350642666220665,
0.020000934600830078,
0.015539680607616901,
0.12501271069049835,
0.03254389017820358,
0.2602800726890564,
-0.030431024730205536,
-0.0788365975022316,
-0.04663652926683426,
-0.04091299697756767,
0.040898751467466354,
0.13160887360572815,
0.03700241819024086,
-0.03306939825415611,
-0.05092494562268257,
0.1046488955616951,
-0.14504466950893402,
-0.12072880566120148,
0.04306644946336746,
-0.1356850266456604,
-0.07961083948612213,
-0.06618871539831161,
0.054762911051511765,
-0.028630923479795456,
0.06293455511331558,
-0.03909807279706001,
-0.02939603663980961,
0.048172492533922195,
0.028426427394151688,
-0.1502830684185028,
-0.07906326651573181,
0.04504679515957832,
-0.05108141526579857,
0.13439655303955078,
-0.033212028443813324,
0.1124752089381218,
0.10536039620637894,
0.038420356810092926,
-0.02703840471804142,
0.04294181987643242,
0.06368723511695862,
-0.0005124967428855598,
0.0679805651307106,
0.05040312558412552,
-0.03283046931028366,
0.11814983189105988,
-0.04571625217795372,
-0.11309029906988144,
0.05188390985131264,
0.018105775117874146,
-0.01619999296963215,
-0.10480298101902008,
-0.011791789904236794,
-0.10175760835409164,
0.09701035916805267,
0.1529921591281891,
-0.0444144532084465,
0.011524162255227566,
-0.07336949557065964,
0.13349846005439758,
0.013997439295053482,
-0.0163683220744133,
-0.07687301933765411,
-0.1465895175933838,
-0.021612761542201042,
0.040865007787942886,
-0.027012653648853302,
-0.2327060103416443,
-0.015713069587945938,
-0.03736817464232445,
-0.0021143495105206966,
-0.04073689505457878,
0.10547812283039093,
0.12835730612277985,
0.031024547293782234,
-0.027342911809682846,
-0.13634690642356873,
-0.0247044675052166,
0.06806901842355728,
-0.1338096708059311,
-0.15587221086025238
] |
null | null | transformers |
# CodeTrans model for code documentation generation go
Pretrained model on the go programming language using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized go code functions: it works best with tokenized go functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
```python
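# Same usage pattern as the single-task card: load the multitask checkpoint
# into a summarization pipeline (device=0 assumes a GPU; use device=-1 for CPU).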
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_go_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_go_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/go/base_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded from [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 340,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
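For intuition, an inverse square root schedule holds the learning rate at its peak during warm-up and then decays it proportionally to 1/√step. Below is a minimal sketch, assuming illustrative values for the peak rate and warm-up length (the actual hyperparameters are not stated in this card):
```python
def inverse_sqrt_lr(step: int, peak_lr: float = 0.01, warmup_steps: int = 10_000) -> float:
    # Constant at peak_lr for the first warmup_steps, then ~1/sqrt(step) decay.
    return peak_lr * min(1.0, (warmup_steps / max(step, 1)) ** 0.5)

print(inverse_sqrt_lr(5_000))   # 0.01  (still in warm-up)
print(inverse_sqrt_lr(40_000))  # 0.005 (sqrt(10000/40000) = 0.5)
```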
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
|   CodeTrans-ST-Small    |     17.31      |     16.65      |     16.89      |     23.05      |      9.19      |     13.70      |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_go_multitask | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation go
====================================================
Pretrained model on the go programming language using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized go code functions: it works best with tokenized go functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded from Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 340,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 340,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 340,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
60,
143
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 340,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.1540256142616272,
-0.032214198261499405,
-0.0001879863702924922,
0.12594641745090485,
0.11956120282411575,
0.0323236882686615,
0.06293468922376633,
0.06172460317611694,
-0.037407319992780685,
0.027376526966691017,
0.05089357867836952,
0.005650153383612633,
0.029152462258934975,
0.19737985730171204,
0.009095990099012852,
-0.10472337156534195,
-0.0274683628231287,
0.0493512824177742,
-0.06253276765346527,
0.13202255964279175,
0.08099769055843353,
-0.06660246104001999,
0.05812747776508331,
-0.06589803099632263,
-0.23742572963237762,
0.05766869708895683,
0.012659881263971329,
-0.05930839106440544,
0.09998320788145065,
0.058818891644477844,
0.1253940910100937,
-0.01105958130210638,
0.018441474065184593,
-0.12907327711582184,
0.012200181372463703,
0.004345435183495283,
0.023959102109074593,
0.014845972880721092,
0.03102404810488224,
0.036496590822935104,
0.16893038153648376,
0.00881879311054945,
0.03819600120186806,
0.06604119390249252,
-0.07428204268217087,
-0.11619344353675842,
-0.0009527758811600506,
0.02897518128156662,
0.034245215356349945,
0.11727835983037949,
-0.01765284314751625,
0.12401584535837173,
-0.14395476877689362,
0.1344253420829773,
0.08536320924758911,
-0.22775614261627197,
-0.01495254784822464,
0.11901668459177017,
0.08496467024087906,
0.08516401052474976,
-0.05004993453621864,
-0.052256882190704346,
0.10676081478595734,
0.0456048883497715,
0.02775752730667591,
-0.09029782563447952,
-0.06742274761199951,
0.010228736326098442,
-0.08146676421165466,
-0.06784268468618393,
0.2160416692495346,
0.0051714093424379826,
-0.08152154088020325,
-0.056401949375867844,
-0.03009343147277832,
-0.1439492106437683,
0.036384448409080505,
0.035843104124069214,
0.0018421841086819768,
-0.038691576570272446,
-0.0004472459258977324,
0.026711778715252876,
-0.07562191784381866,
-0.15290872752666473,
0.023136992007493973,
0.11061970889568329,
0.051881130784749985,
0.027488378807902336,
-0.09690597653388977,
0.10014117509126663,
0.03960074856877327,
-0.0537089966237545,
-0.02462160401046276,
-0.009680726565420628,
-0.10241131484508514,
0.029357576742768288,
-0.0607675239443779,
-0.16956886649131775,
0.013826100155711174,
0.032974936068058014,
-0.05617612972855568,
0.04946665093302727,
0.043292827904224396,
0.03962485492229462,
0.007270392961800098,
0.21862807869911194,
0.06373993307352066,
-0.12608759105205536,
0.0600590854883194,
0.04415522515773773,
-0.020025208592414856,
-0.005903212819248438,
-0.07696311920881271,
-0.09729692339897156,
0.09885665029287338,
0.10077682137489319,
-0.12971168756484985,
0.040195904672145844,
-0.07137428969144821,
-0.036181624978780746,
0.00928573403507471,
-0.15221719443798065,
0.002111664740368724,
0.03475695103406906,
-0.07385953515768051,
-0.05378144979476929,
0.10753122717142105,
-0.17733359336853027,
-0.14507097005844116,
-0.03602195158600807,
-0.06971680372953415,
-0.04014819860458374,
-0.16212350130081177,
-0.15998421609401703,
-0.011168375611305237,
-0.034678488969802856,
0.026712536811828613,
-0.10207010060548782,
-0.13637661933898926,
-0.03179735317826271,
0.02149682678282261,
0.014274937100708485,
-0.005434074904769659,
-0.08202168345451355,
-0.011066177859902382,
-0.021056486293673515,
-0.03559160605072975,
0.0032556166406720877,
-0.04686468839645386,
0.13178543746471405,
0.11152306199073792,
0.0516790896654129,
-0.03992877155542374,
0.05805797874927521,
-0.07876783609390259,
0.05635877698659897,
-0.11237282305955887,
0.09438851475715637,
-0.04974934458732605,
0.07794440537691116,
-0.02785794623196125,
-0.11130376160144806,
0.0709906592965126,
0.06921089440584183,
0.07756124436855316,
0.04751995578408241,
-0.11147208511829376,
-0.04229053482413292,
0.182246595621109,
-0.1258271187543869,
-0.14660412073135376,
0.10496773570775986,
-0.025451458990573883,
0.08402156084775925,
0.09304347634315491,
0.13388724625110626,
0.15352872014045715,
-0.04808979108929634,
0.008110983297228813,
0.044056959450244904,
0.03300093859434128,
-0.1452654004096985,
0.07892858237028122,
0.061821918934583664,
-0.0914095938205719,
0.05451708287000656,
-0.006999646779149771,
0.11511711031198502,
-0.013351279310882092,
-0.03081393800675869,
-0.04825429618358612,
-0.09053882956504822,
0.005981681868433952,
0.01896318607032299,
0.07538487762212753,
-0.08513117581605911,
-0.07820874452590942,
0.08634523302316666,
0.16549289226531982,
-0.12991198897361755,
0.0005762719083577394,
-0.08976816385984421,
0.06692077219486237,
-0.08030223101377487,
0.028651993721723557,
-0.16673225164413452,
0.02017677016556263,
0.06481222063302994,
-0.006419200915843248,
0.0764298290014267,
0.1246156394481659,
0.02265269309282303,
0.04286520183086395,
-0.012040642090141773,
-0.022097187116742134,
-0.11671161651611328,
-0.06275615096092224,
-0.07271528244018555,
-0.06818988174200058,
-0.08184905350208282,
-0.05531884729862213,
0.002977109747007489,
-0.20475627481937408,
0.014062625356018543,
0.008322142995893955,
-0.009854909032583237,
0.018485503271222115,
-0.013315940275788307,
0.021282590925693512,
0.07855092734098434,
-0.06024744361639023,
-0.03280080109834671,
0.04218919575214386,
0.024502677842974663,
-0.06524690240621567,
-0.0710848793387413,
-0.09417815506458282,
0.011107600294053555,
0.12476867437362671,
0.03439231589436531,
-0.09762220084667206,
0.018266893923282623,
-0.017932338640093803,
-0.04364347830414772,
0.022069621831178665,
-0.07000169157981873,
0.16233834624290466,
-0.013987069018185139,
0.19626174867153168,
-0.15309683978557587,
-0.036351658403873444,
-0.028203457593917847,
0.027691015973687172,
0.06236860901117325,
0.14630666375160217,
-0.008055274374783039,
-0.08688557893037796,
0.06464085727930069,
0.016459355130791664,
-0.11042861640453339,
0.2365443855524063,
-0.04738235101103783,
-0.09480147063732147,
0.034073356539011,
0.10311512649059296,
-0.004468541592359543,
0.1792811006307602,
-0.20760224759578705,
-0.029041685163974762,
0.0061819241382181644,
-0.0037731400225311518,
0.07004709541797638,
-0.13042742013931274,
0.007471146527677774,
0.013875605538487434,
-0.07119223475456238,
-0.08796179294586182,
-0.003039504401385784,
-0.015803134068846703,
0.04663388803601265,
-0.007744114380329847,
-0.03308694437146187,
0.017202986404299736,
-0.03278139606118202,
-0.12153766304254532,
0.2218315750360489,
-0.08489307761192322,
-0.20641237497329712,
-0.1995900720357895,
0.11346897482872009,
-0.06604303419589996,
-0.012692281976342201,
0.03689255937933922,
-0.08880725502967834,
-0.04228338226675987,
-0.05125149339437485,
0.18093295395374298,
-0.06942970305681229,
-0.005007337778806686,
-0.027389848604798317,
0.07418784499168396,
0.017432672902941704,
-0.20197685062885284,
0.03504105657339096,
-0.022847434505820274,
-0.015959370881319046,
0.015083367936313152,
-0.10777828097343445,
0.09912275522947311,
0.16654406487941742,
-0.07876572012901306,
0.019675040617585182,
-0.005846160929650068,
0.19951412081718445,
-0.048347774893045425,
-0.058114588260650635,
0.1427946537733078,
-0.0166912954300642,
-0.011960971169173717,
0.012045997194945812,
-0.014096644707024097,
-0.1031331717967987,
0.06656214594841003,
-0.01690533757209778,
-0.0326850451529026,
-0.27135998010635376,
-0.021264169365167618,
-0.07594167441129684,
0.04560016468167305,
0.04242460057139397,
0.03920074179768562,
-0.0926763191819191,
0.03048892505466938,
0.052359290421009064,
0.13697932660579681,
-0.010513074696063995,
0.045924652367830276,
0.060787513852119446,
0.0014715471770614386,
0.01513107679784298,
-0.10226675122976303,
0.011094614863395691,
0.07518361508846283,
0.09438539296388626,
0.26538509130477905,
-0.10074041783809662,
0.18580591678619385,
0.03722909092903137,
0.042869728058576584,
0.04549876227974892,
0.13933056592941284,
-0.12027692049741745,
0.03157025948166847,
0.014645694755017757,
-0.007030973210930824,
-0.1160222664475441,
0.02140911854803562,
-0.03290186822414398,
0.08659633994102478,
-0.12121521681547165,
-0.048901066184043884,
0.006067054811865091,
0.1371094137430191,
0.051921818405389786,
-0.23245972394943237,
-0.1430121213197708,
0.011555714532732964,
-0.07575438171625137,
-0.09504635632038116,
0.06342975795269012,
0.2233044058084488,
-0.06876775622367859,
-0.023844890296459198,
-0.005619549658149481,
0.13376101851463318,
-0.028200650587677956,
-0.03020665794610977,
-0.03740479797124863,
0.056869231164455414,
0.013851633295416832,
0.12969201803207397,
-0.29685869812965393,
0.13719432055950165,
-0.009059712290763855,
0.06699760258197784,
-0.034276556223630905,
0.04213954508304596,
-0.028691081330180168,
0.07588712126016617,
0.04107305780053139,
-0.00870584324002266,
0.03823278844356537,
-0.17520160973072052,
0.003997563850134611,
0.03888499364256859,
0.023572247475385666,
0.06174008920788765,
0.0692291185259819,
-0.000511963211465627,
0.055586077272892,
-0.013992885127663612,
-0.13834315538406372,
-0.06799693405628204,
-0.06438442319631577,
-0.027673929929733276,
-0.029430292546749115,
-0.021409913897514343,
-0.03974486142396927,
-0.0192234106361866,
0.06794055551290512,
0.1942620724439621,
-0.09243215620517731,
-0.08174409717321396,
-0.07681751996278763,
0.061704184859991074,
0.0904538705945015,
-0.09284968674182892,
0.04037024453282356,
-0.0029752282425761223,
0.02518133446574211,
-0.009317039512097836,
-0.08119694143533707,
0.06639143079519272,
-0.04028889909386635,
-0.06561556458473206,
-0.0075471485033631325,
0.07292697578668594,
0.004850344266742468,
0.040929511189460754,
0.008461621589958668,
-0.09537320584058762,
-0.04163419455289841,
-0.11803345382213593,
-0.11470459401607513,
-0.052187345921993256,
0.003549699205905199,
0.05625728890299797,
-0.1464327573776245,
-0.06256072223186493,
-0.005905210040509701,
-0.03548998758196831,
0.14173626899719238,
0.16366395354270935,
-0.060025766491889954,
0.013998922891914845,
0.1247708722949028,
-0.055493030697107315,
-0.20693616569042206,
0.039798904210329056,
0.051315583288669586,
0.12674549221992493,
-0.052033036947250366,
-0.16191056370735168,
0.04758608713746071,
0.0013274479424580932,
0.03701958805322647,
0.07529159635305405,
-0.29218441247940063,
-0.13423489034175873,
0.09187664836645126,
0.16364607214927673,
0.13884030282497406,
-0.1316526234149933,
-0.03607324883341789,
-0.06246640905737877,
-0.11515425145626068,
0.0762774720788002,
-0.05251266807317734,
0.13497719168663025,
-0.06809398531913757,
0.02291274257004261,
0.03303566575050354,
-0.04282321035861969,
0.07268360257148743,
0.02188560552895069,
0.10296232998371124,
-0.03807807341217995,
0.015728430822491646,
0.1384744793176651,
-0.03273722529411316,
0.17253535985946655,
-0.14871878921985626,
0.0928877517580986,
-0.23205359280109406,
-0.059849195182323456,
-0.07397395372390747,
0.015353829599916935,
-0.03436286002397537,
-0.03789125755429268,
-0.08135304600000381,
0.02776585891842842,
-0.008838454261422157,
-0.008977404795587063,
0.017858419567346573,
-0.043931007385253906,
-0.019689826294779778,
0.08826664835214615,
0.12061933428049088,
-0.005031541455537081,
-0.07251352816820145,
0.06296496093273163,
0.044661153107881546,
0.11640679091215134,
-0.18782275915145874,
0.022391296923160553,
0.11678048223257065,
0.020372958853840828,
0.11239144951105118,
0.04460097476840019,
-0.10371769964694977,
0.04926580190658569,
0.09341058135032654,
-0.061935532838106155,
-0.06619229167699814,
-0.028887512162327766,
-0.1115439385175705,
-0.07238198071718216,
0.05328289046883583,
0.0995761975646019,
-0.04491525515913963,
-0.009753013029694557,
-0.026808220893144608,
-0.028737150132656097,
-0.12088701874017715,
0.19305358827114105,
0.07726682722568512,
0.0804680809378624,
-0.06222658231854439,
0.05367036908864975,
0.07079773396253586,
-0.08382610231637955,
0.01595047302544117,
0.16861708462238312,
-0.09835480898618698,
-0.045248936861753464,
0.0673011764883995,
0.2151004672050476,
-0.04596107453107834,
-0.0631813108921051,
-0.14015725255012512,
-0.07925689965486526,
0.02731267921626568,
0.17006199061870575,
0.11064007133245468,
0.07788173109292984,
-0.027223920449614525,
0.00596409710124135,
-0.10736013203859329,
0.08568964898586273,
0.06963680684566498,
0.04029490426182747,
-0.1077895313501358,
0.1313437670469284,
0.04775827005505562,
0.1161198616027832,
-0.03067963570356369,
-0.009242075495421886,
-0.1504083275794983,
0.07443811744451523,
-0.09602056443691254,
0.030223486945033073,
-0.004296524450182915,
0.05204836651682854,
-0.028373386710882187,
-0.004120446275919676,
-0.03468878194689751,
0.06366674602031708,
-0.08450164645910263,
0.004503341391682625,
0.011028957553207874,
0.0418100506067276,
-0.056246403604745865,
-0.01801135763525963,
0.023202287033200264,
-0.09372702986001968,
0.12632843852043152,
-0.02341543324291706,
-0.031094374135136604,
0.08593974262475967,
-0.05309731885790825,
0.03771091252565384,
0.023343287408351898,
0.0540865957736969,
0.011657023802399635,
0.015925034880638123,
0.0800933986902237,
0.0396331287920475,
0.05883911997079849,
0.03623043745756149,
0.12796227633953094,
-0.12566600739955902,
-0.07665083557367325,
-0.056948576122522354,
-0.10730139166116714,
-0.05659601092338562,
0.1006523072719574,
0.031676217913627625,
0.10247287154197693,
0.09975536912679672,
-0.03854237496852875,
0.009453488513827324,
-0.13182826340198517,
-0.06031137704849243,
0.027126789093017578,
-0.022248148918151855,
-0.09423209726810455,
-0.05745648220181465,
0.05065683275461197,
-0.02675943449139595,
0.12152769416570663,
0.008101014420390129,
0.04279070347547531,
-0.019117679446935654,
-0.04141770675778389,
0.0012348816962912679,
0.012258092872798443,
0.21948759257793427,
-0.07798060029745102,
0.05110081657767296,
0.00626275734975934,
0.019954124465584755,
0.01688639260828495,
0.11897285282611847,
0.14083422720432281,
0.1528463065624237,
-0.03968426212668419,
0.11213504523038864,
0.008347532711923122,
0.0009981077164411545,
-0.0831117033958435,
-0.0025877179577946663,
0.008668582886457443,
0.05426590144634247,
-0.040536489337682724,
0.18950578570365906,
0.09291171282529831,
-0.11363133043050766,
0.10104702413082123,
0.02105806954205036,
-0.1329137086868286,
-0.03857262432575226,
0.029156064614653587,
-0.03645152971148491,
-0.14765042066574097,
0.02977205254137516,
-0.11772111803293228,
-0.04182472079992294,
0.036523714661598206,
0.05016341805458069,
-0.07996851950883865,
0.19107772409915924,
0.011817886494100094,
-0.0539548322558403,
0.053585417568683624,
-0.006590516772121191,
0.0209769569337368,
0.028599603101611137,
0.032086946070194244,
0.029297299683094025,
-0.04070666432380676,
0.043426308780908585,
0.026095617562532425,
-0.04945363476872444,
0.0014106739545240998,
-0.006421365309506655,
-0.000009101416253542993,
-0.022016795352101326,
0.03143308311700821,
0.0646945983171463,
0.1693812906742096,
0.030112607404589653,
-0.06889133155345917,
-0.02422025054693222,
0.14995644986629486,
-0.033071763813495636,
-0.10096436738967896,
-0.12275253236293793,
0.15867382287979126,
0.038449469953775406,
0.006560538429766893,
0.015755847096443176,
-0.09075740724802017,
-0.043422047048807144,
0.22703473269939423,
0.07754544913768768,
-0.03876867517828941,
-0.01866275444626808,
0.006038240622729063,
0.0007552957395091653,
-0.03389504551887512,
0.2064768522977829,
0.026778027415275574,
0.23554065823554993,
0.01788056455552578,
-0.030519362539052963,
-0.07643108069896698,
-0.03754303604364395,
0.015112725086510181,
0.11892116814851761,
-0.025020353496074677,
-0.04109348729252815,
-0.08633428066968918,
0.007450758945196867,
-0.006666332017630339,
-0.07691844552755356,
0.10453885048627853,
-0.14311319589614868,
-0.0948033556342125,
-0.04670325666666031,
0.0394432507455349,
-0.05199122801423073,
0.022578982636332512,
-0.028536539524793625,
0.03653869405388832,
0.05910472199320793,
-0.03627839684486389,
-0.12090139091014862,
-0.1609574258327484,
0.08464391529560089,
-0.054078903049230576,
0.1328168660402298,
-0.0197641309350729,
0.16358332335948944,
0.09448640048503876,
0.038422759622335434,
-0.04763440415263176,
0.11644714325666428,
0.032040610909461975,
0.03420040011405945,
0.05756818503141403,
0.11154663562774658,
-0.05273595079779625,
0.13625174760818481,
-0.049823980778455734,
-0.02066202275454998,
-0.01330261304974556,
-0.06711895018815994,
-0.022025909274816513,
-0.16394168138504028,
-0.013330278918147087,
-0.10708855837583542,
0.0988871157169342,
0.1961042433977127,
-0.04048774763941765,
-0.032762352377176285,
-0.09060381352901459,
0.10285738110542297,
-0.0026292181573808193,
0.06062573194503784,
-0.03405969962477684,
-0.18622826039791107,
-0.00030110159423202276,
-0.0018299914663657546,
0.0043863398022949696,
-0.2866016924381256,
-0.008790088817477226,
-0.04864583536982536,
-0.023577004671096802,
-0.09378628432750702,
0.16103602945804596,
0.07520197331905365,
0.04092162847518921,
-0.040234945714473724,
-0.13260455429553986,
-0.039277125149965286,
0.06595174968242645,
-0.15885905921459198,
-0.1488313376903534
] |
null | null | transformers |
# CodeTrans model for code documentation generation go
Pretrained model on the go programming language using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized go code functions: it works best with tokenized go functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the go function/method.
## Intended uses & limitations
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
```python
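# Load the multitask checkpoint that was additionally fine-tuned on go, and
# summarize one tokenized go function (device=0 assumes a GPU is available;
# pass device=-1 to run on CPU).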
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_go_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_go_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/go/base_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded from [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing go code.
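As a rough sense of scale, the batch and sequence sizes above bound the token budget of each phase. The back-of-the-envelope sketch below assumes every sequence is packed to the full 512 tokens, so the figures are upper bounds only:
```python
pretrain_tokens = 500_000 * 4096 * 512  # steps * batch size * sequence length
finetune_tokens = 2_000 * 256 * 512
print(f"pre-training upper bound: {pretrain_tokens:.2e} tokens")  # 1.05e+12
print(f"fine-tuning upper bound:  {finetune_tokens:.2e} tokens")  # 2.62e+08
```
The go-only fine-tuning pass is therefore three to four orders of magnitude smaller than pre-training, consistent with it acting as a light adaptation step.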
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
|   CodeTrans-ST-Small    |     17.31      |     16.65      |     16.89      |     23.05      |      9.19      |     13.70      |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_go_multitask_finetune | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation go
====================================================
Pretrained model on the go programming language using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized go code functions: it works best with tokenized go functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the go function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded from Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing go code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
60,
88,
107
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.1258632242679596,
0.05267159640789032,
-0.0008433153270743787,
0.09481534361839294,
0.03763256594538689,
0.029692726209759712,
0.052412137389183044,
0.09670394659042358,
-0.019575342535972595,
0.06786224246025085,
0.050830643624067307,
-0.07841450721025467,
0.058543648570775986,
0.19150269031524658,
0.018528016284108162,
-0.1232183575630188,
-0.025341257452964783,
0.046050626784563065,
-0.07628627866506577,
0.107062928378582,
0.07034357637166977,
-0.08811192959547043,
0.0693044438958168,
-0.03471162170171738,
-0.13650061190128326,
0.032723523676395416,
-0.026224929839372635,
-0.01062187273055315,
0.10063770413398743,
0.07019755989313126,
0.12193035334348679,
-0.014668040908873081,
0.05883945897221565,
-0.18412046134471893,
0.004600078798830509,
0.018477991223335266,
0.06381873041391373,
0.04258674755692482,
0.044765654951334,
0.08520428836345673,
0.10970284789800644,
-0.007884913124144077,
0.03536755591630936,
0.05124645307660103,
-0.06428711861371994,
-0.04057648405432701,
-0.08204372972249985,
0.08260439336299896,
0.0767662525177002,
0.09964385628700256,
-0.0037714915815740824,
0.03899044916033745,
-0.07705364376306534,
0.08893048763275146,
0.11767547577619553,
-0.22806352376937866,
-0.023617830127477646,
0.10934585332870483,
0.0990336686372757,
0.03256478160619736,
-0.0756261870265007,
-0.03860890492796898,
0.10372262448072433,
0.04012662544846535,
0.03945799916982651,
-0.0845118910074234,
-0.018917357549071312,
-0.011477450840175152,
-0.05049584433436394,
-0.043210018426179886,
0.1723494678735733,
0.04079290106892586,
-0.060601502656936646,
-0.10022469609975815,
-0.0415055938065052,
-0.19779907166957855,
0.03950837254524231,
0.012577647343277931,
0.0002962102589663118,
-0.0150861581787467,
0.0024073293898254633,
-0.009165318682789803,
-0.09699750691652298,
-0.12033740431070328,
0.015843311324715614,
0.028784919530153275,
0.05453238636255264,
0.03504171967506409,
-0.0424860380589962,
0.08308117091655731,
0.025742657482624054,
-0.04195340722799301,
-0.019005853682756424,
0.020521804690361023,
-0.12390772253274918,
-0.0009570811525918543,
-0.02003701776266098,
-0.07577507942914963,
-0.004709569737315178,
0.0663929209113121,
-0.10966091603040695,
0.08094283938407898,
0.08950957655906677,
0.02367754653096199,
0.018977424129843712,
0.20610447227954865,
0.044628653675317764,
-0.1493232548236847,
0.026391321793198586,
0.021367201581597328,
0.004939729813486338,
0.004796852357685566,
-0.047608889639377594,
-0.04196959733963013,
0.027843013405799866,
0.06054327264428139,
-0.12221713364124298,
0.022956645116209984,
-0.06135163828730583,
-0.015511282719671726,
0.09112262725830078,
-0.12587344646453857,
0.03531094267964363,
0.02324536070227623,
-0.04685060679912567,
-0.04110868647694588,
0.0902235358953476,
-0.13144069910049438,
-0.1133827343583107,
0.03815245255827904,
-0.043070171028375626,
-0.03419181704521179,
-0.11604493111371994,
-0.10275347530841827,
0.0022671115584671497,
-0.009219382889568806,
-0.006458901334553957,
-0.0845469981431961,
-0.09078848361968994,
-0.02184193953871727,
0.0463927760720253,
0.00007812811236362904,
-0.03481360152363777,
-0.03146624192595482,
0.006215497385710478,
-0.007307909429073334,
-0.022627195343375206,
0.019794559106230736,
-0.024879299104213715,
0.09533592313528061,
0.07697290182113647,
0.03943051025271416,
-0.020631704479455948,
0.025369741022586823,
-0.07648969441652298,
0.0837218165397644,
-0.10928592830896378,
0.05346040800213814,
-0.005534496624022722,
0.053397275507450104,
-0.08765605092048645,
-0.06989714503288269,
-0.004773137625306845,
0.05611380562186241,
0.0789414718747139,
0.029776381328701973,
-0.12852777540683746,
0.02185518853366375,
0.1506490409374237,
-0.11901719868183136,
-0.1448238044977188,
0.10350799560546875,
-0.010906267911195755,
0.03698477894067764,
0.06136361137032509,
0.11402996629476547,
0.14620241522789001,
-0.08646445721387863,
-0.03221094235777855,
0.06438503414392471,
0.04682697728276253,
-0.08073607832193375,
0.05431324988603592,
0.026027290150523186,
-0.011290219612419605,
0.023148661479353905,
0.0594317689538002,
0.05621035769581795,
-0.004953390918672085,
-0.04066609963774681,
-0.03194260969758034,
-0.09541864693164825,
-0.06550242751836777,
-0.012726415880024433,
0.03006543032824993,
-0.054888445883989334,
-0.05202163755893707,
0.005514012183994055,
0.16637833416461945,
-0.09368697553873062,
0.03263961896300316,
-0.08293084055185318,
-0.037400584667921066,
-0.07961221784353256,
0.02658647857606411,
-0.13524112105369568,
0.02214248664677143,
0.057274412363767624,
-0.04522417485713959,
0.05297660082578659,
0.08219194412231445,
-0.0009562612976878881,
0.025366276502609253,
-0.06197325885295868,
-0.027839601039886475,
-0.03911607712507248,
-0.07332374900579453,
-0.1086781695485115,
-0.04377193748950958,
-0.0897403135895729,
-0.030121758580207825,
-0.03272540122270584,
-0.17646872997283936,
-0.0012569986283779144,
0.013666615821421146,
0.021689441055059433,
0.0156873632222414,
-0.04409606382250786,
0.022252412512898445,
0.05648021772503853,
-0.05147657170891762,
-0.06832875311374664,
0.02295098826289177,
0.05528343468904495,
-0.09899510443210602,
-0.03933372348546982,
-0.09046754986047745,
-0.07784371823072433,
0.08257222920656204,
0.10600333660840988,
-0.13237668573856354,
-0.009410936385393143,
-0.02873552218079567,
-0.05090387538075447,
-0.048961810767650604,
-0.05662000924348831,
0.16443035006523132,
0.01787339709699154,
0.15455469489097595,
-0.1352207362651825,
-0.06724650412797928,
-0.02643345668911934,
0.016724631190299988,
0.02905498445034027,
0.1421971619129181,
0.03436288610100746,
-0.11846080422401428,
0.03170941397547722,
-0.044249627739191055,
-0.0603691004216671,
0.16277937591075897,
-0.017302481457591057,
-0.06130930408835411,
-0.0060619330033659935,
0.12168823927640915,
0.004242944996803999,
0.20776331424713135,
-0.0699700340628624,
0.0020242289174348116,
-0.012598216533660889,
0.010754763148725033,
0.04908331483602524,
-0.13084910809993744,
0.029446657747030258,
0.03073233738541603,
-0.06183687224984169,
-0.0335262157022953,
-0.02826022170484066,
-0.036790888756513596,
0.04399176687002182,
0.01967141404747963,
0.04224805906414986,
-0.009135127067565918,
-0.03616606071591377,
-0.11405816674232483,
0.17981240153312683,
-0.06465431302785873,
-0.20874139666557312,
-0.16357268393039703,
0.09480661153793335,
-0.03955584019422531,
-0.01617981307208538,
0.030658256262540817,
-0.08006350696086884,
-0.048707976937294006,
-0.0914362445473671,
0.12192486226558685,
-0.1001761183142662,
0.003525781212374568,
-0.00022142712259665132,
0.059748098254203796,
0.062002621591091156,
-0.15925829112529755,
0.03206038475036621,
-0.025889785960316658,
0.021091334521770477,
-0.007391439285129309,
-0.05322455242276192,
0.08127862215042114,
0.11520782113075256,
-0.058261971920728683,
0.018865272402763367,
0.0023341539781540632,
0.16345690190792084,
-0.06254011392593384,
0.04565488174557686,
0.16862313449382782,
-0.0007879838231019676,
0.027779661118984222,
0.0520441010594368,
0.014987699687480927,
-0.09604576975107193,
0.0562441311776638,
0.04239898920059204,
-0.043529532849788666,
-0.21204672753810883,
-0.029778392985463142,
-0.08547943085432053,
0.05488777905702591,
0.10580933094024658,
0.051445282995700836,
-0.15104541182518005,
0.02591775916516781,
-0.00832541473209858,
0.160191610455513,
-0.02785501442849636,
0.055705949664115906,
0.0012855600798502564,
0.01593233086168766,
0.002005633432418108,
-0.1061990037560463,
0.00831871572881937,
0.07724668830633163,
0.10669069737195969,
0.20351818203926086,
-0.08586785942316055,
0.15759433805942535,
0.015043037943542004,
0.0984482690691948,
0.04557683691382408,
0.1087878867983818,
-0.1311432272195816,
0.007511183153837919,
0.009537671692669392,
-0.017056703567504883,
-0.059480566531419754,
0.047213874757289886,
-0.039271119982004166,
0.06864911317825317,
-0.06106865778565407,
0.0061726318672299385,
0.017048420384526253,
0.19502602517604828,
0.07427825033664703,
-0.161990225315094,
-0.14142082631587982,
0.00923637393862009,
-0.07321732491254807,
-0.10697705298662186,
0.061484143137931824,
0.21586699783802032,
-0.05305825173854828,
0.02791197970509529,
-0.01373195368796587,
0.1310252547264099,
-0.09544051438570023,
-0.01944963075220585,
0.03472970798611641,
0.05810336023569107,
0.007778710685670376,
0.11038707941770554,
-0.24066980183124542,
0.08589199930429459,
0.014010191895067692,
0.08460980653762817,
-0.03127435967326164,
0.05847522243857384,
-0.044107891619205475,
-0.006129154469817877,
0.073470838367939,
0.014185969717800617,
-0.04949823394417763,
-0.18310075998306274,
-0.04170528054237366,
0.020984452217817307,
0.05034476891160011,
0.0028119771741330624,
0.08875104784965515,
-0.007041803561151028,
0.04825736954808235,
-0.028079858049750328,
-0.11510825157165527,
-0.059119850397109985,
-0.13096417486667633,
-0.0255681611597538,
0.0076820384711027145,
-0.07506278902292252,
-0.025587068870663643,
0.03889644891023636,
0.040638528764247894,
0.2540448307991028,
-0.1577322632074356,
-0.062125008553266525,
-0.09508121013641357,
0.06270492076873779,
0.1295587569475174,
-0.08743306249380112,
0.012125657871365547,
0.013548347167670727,
0.06624579429626465,
-0.051644787192344666,
-0.06878960132598877,
0.03083035722374916,
-0.059698887169361115,
-0.09183172136545181,
-0.04093369096517563,
0.11489029973745346,
-0.008882480673491955,
0.04211292788386345,
0.003535793861374259,
-0.08351122587919235,
-0.02959204837679863,
-0.1346648633480072,
-0.07022924721240997,
-0.030164800584316254,
0.029737623408436775,
-0.01337111834436655,
-0.13702769577503204,
0.07368596643209457,
0.007906997576355934,
-0.0957985669374466,
0.07167059183120728,
0.18056832253932953,
-0.07232499122619629,
0.03458641096949577,
0.07622506469488144,
-0.05335216596722603,
-0.19617918133735657,
-0.03140422701835632,
0.04963603988289833,
0.08789247274398804,
-0.025539323687553406,
-0.14283430576324463,
0.07317661494016647,
-0.00028710553306154907,
0.010078562423586845,
0.022521421313285828,
-0.2315693199634552,
-0.12761253118515015,
0.005714651197195053,
0.07471546530723572,
0.05366038903594017,
-0.09952808171510696,
-0.050857435911893845,
-0.06523182988166809,
-0.04117949679493904,
0.06262659281492233,
0.06770120561122894,
0.10950260609388351,
-0.03045392967760563,
0.02663368172943592,
0.039001621305942535,
-0.03584303706884384,
0.06505254656076431,
-0.014126627705991268,
0.09600112587213516,
-0.016580138355493546,
0.003734394209459424,
0.06555015593767166,
-0.061336856335401535,
0.17896737158298492,
-0.15890845656394958,
0.09665320068597794,
-0.15809467434883118,
-0.03605024516582489,
-0.030104180797934532,
0.0020534174982458353,
-0.043745506554841995,
-0.03178303316235542,
-0.11828166246414185,
0.04069721698760986,
0.0485948771238327,
-0.03264697268605232,
0.039161887019872665,
-0.012034161016345024,
-0.05042067542672157,
0.06796707957983017,
0.07925336807966232,
-0.004026638809591532,
-0.11835384368896484,
0.0343439057469368,
0.015363937243819237,
0.09126738458871841,
-0.17091140151023865,
0.02944396808743477,
0.10372553020715714,
0.019652534276247025,
0.0883757695555687,
0.016980629414319992,
-0.09547006338834763,
0.022981567308306694,
0.07763569802045822,
-0.07474654912948608,
-0.06152312830090523,
-0.01529654860496521,
-0.026862116530537605,
-0.0914686843752861,
0.0455707423388958,
0.09048335999250412,
-0.04462691396474838,
-0.0022217517253011465,
-0.005651490762829781,
0.010969751514494419,
-0.08084623515605927,
0.16868238151073456,
0.005207060370594263,
0.08472899347543716,
-0.06239712983369827,
0.07060170918703079,
0.09744171053171158,
-0.09908303618431091,
0.02911144308745861,
0.1459353119134903,
-0.08425331115722656,
-0.0196896530687809,
0.08321525156497955,
0.13253706693649292,
-0.018867356702685356,
-0.05571238696575165,
-0.10050371289253235,
-0.08889231830835342,
0.019399922341108322,
0.06307581812143326,
0.06899888813495636,
0.09111341834068298,
-0.020874017849564552,
-0.0007177648949436843,
-0.1251470297574997,
0.09360389411449432,
0.07433392107486725,
0.04815712198615074,
-0.12764324247837067,
0.12728500366210938,
0.037649281322956085,
0.08263423293828964,
-0.0006545347278006375,
0.030109036713838577,
-0.12243998795747757,
0.034464724361896515,
-0.026337170973420143,
0.02966686338186264,
-0.009749261662364006,
0.041414398699998856,
-0.04001938924193382,
0.0368322990834713,
-0.03800026327371597,
0.04456619545817375,
-0.04144176468253136,
-0.022570917382836342,
-0.04370669275522232,
0.01922663114964962,
-0.05636431649327278,
-0.01502912025898695,
0.013313088566064835,
-0.09566673636436462,
0.09076616168022156,
-0.05442797392606735,
-0.007851531729102135,
-0.004073019605129957,
0.027866628021001816,
0.047620296478271484,
0.0050254520028829575,
0.05659732222557068,
-0.01221408974379301,
-0.01358070783317089,
0.019279155880212784,
0.032874319702386856,
-0.006356494966894388,
0.0012073135003447533,
0.09862454235553741,
-0.13321925699710846,
-0.08422495424747467,
-0.09124834835529327,
-0.07909856736660004,
-0.05799366533756256,
0.07247722148895264,
0.0890907347202301,
0.08399763703346252,
0.08128274977207184,
-0.033553171902894974,
0.005233678035438061,
-0.16765888035297394,
-0.03938912972807884,
0.05350740626454353,
-0.00039767942507751286,
-0.12167526036500931,
-0.039797332137823105,
0.06512412428855896,
-0.031137580052018166,
0.13214413821697235,
-0.036753423511981964,
0.03255723416805267,
-0.00940768513828516,
-0.05933674797415733,
-0.050511207431554794,
0.005767017137259245,
0.17473295331001282,
-0.1052529439330101,
0.0051237186416983604,
-0.004901978652924299,
0.007601122837513685,
0.019596794620156288,
0.15272915363311768,
0.12618091702461243,
0.11997253447771072,
0.03808155655860901,
0.08459131419658661,
-0.03913727030158043,
-0.03450740501284599,
-0.10077618062496185,
0.07027456909418106,
-0.04393460601568222,
0.03018343634903431,
-0.03219069540500641,
0.14315474033355713,
0.08063483983278275,
-0.14017286896705627,
0.10590015351772308,
-0.0012730596354231238,
-0.09679071605205536,
-0.029370279982686043,
-0.08327268809080124,
-0.043519362807273865,
-0.0928436890244484,
0.004815689288079739,
-0.10513710975646973,
-0.012940444983541965,
0.052428651601076126,
0.03030533343553543,
-0.026464330032467842,
0.17059241235256195,
-0.048996489495038986,
-0.04364310950040817,
0.026035765185952187,
0.048990555107593536,
0.02464725822210312,
0.09293410181999207,
0.024740010499954224,
0.06400210410356522,
-0.048956386744976044,
0.07474400103092194,
0.04037081077694893,
0.0024358462542295456,
0.024186886847019196,
0.043852467089891434,
-0.010920674540102482,
-0.04336719214916229,
-0.023219885304570198,
0.09202089160680771,
0.13202954828739166,
0.031780801713466644,
-0.0347302220761776,
-0.05509330704808235,
0.16169984638690948,
-0.056562405079603195,
-0.05931331589818001,
-0.12445519119501114,
0.1679198443889618,
0.031815290451049805,
0.000751937972381711,
0.014858387410640717,
-0.07807409018278122,
-0.02076624147593975,
0.2542401850223541,
0.06049337610602379,
-0.05948079749941826,
-0.023274218663573265,
0.0007880500052124262,
-0.009314223192632198,
-0.038707248866558075,
0.13608363270759583,
0.0045743160881102085,
0.25438109040260315,
0.018870074301958084,
-0.017753170803189278,
-0.04879312589764595,
-0.04294969141483307,
0.003331327112391591,
0.2031099647283554,
-0.03376505896449089,
0.03000479005277157,
-0.11218743771314621,
-0.013314913958311081,
0.027606215327978134,
-0.15684843063354492,
0.13324709236621857,
-0.14351525902748108,
-0.0800204649567604,
0.021692171692848206,
0.06945549696683884,
-0.05851779505610466,
0.042876407504081726,
-0.023449573665857315,
0.06885997951030731,
0.037463583052158356,
-0.03055761381983757,
-0.09586386382579803,
-0.13526783883571625,
0.04942606762051582,
-0.01267449464648962,
0.13441777229309082,
0.014611613936722279,
0.09443474560976028,
0.08536328375339508,
0.011989499442279339,
-0.07775171846151352,
0.08446397632360458,
0.025059999898076057,
-0.020678378641605377,
0.044991299510002136,
0.12476222962141037,
-0.05001666024327278,
0.15404173731803894,
0.012370944023132324,
-0.02627786435186863,
-0.025868581607937813,
-0.02718539535999298,
-0.011915713548660278,
-0.153049036860466,
0.0009854402160272002,
-0.0680563896894455,
0.14624209702014923,
0.1959424465894699,
-0.04438295215368271,
-0.01973707601428032,
-0.04703110456466675,
0.09480766952037811,
-0.011312801390886307,
0.08994722366333008,
0.0023820113856345415,
-0.18741394579410553,
0.02487756311893463,
-0.04164612665772438,
0.009109544567763805,
-0.2039228081703186,
-0.06651464104652405,
-0.028087005019187927,
-0.034541357308626175,
-0.09457504749298096,
0.1340385377407074,
0.0670241117477417,
0.032902851700782776,
-0.048162464052438736,
-0.11328588426113129,
-0.015019621700048447,
0.042690519243478775,
-0.11659780889749527,
-0.12524275481700897
] |
null | null | transformers |
# CodeTrans model for code documentation generation go
Pretrained model on the Go programming language using the T5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized Go functions and works best with tokenized Go input.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain and was then fine-tuned on the code documentation generation task for Go functions/methods.
## Intended uses & limitations
The model can be used to generate a description for a Go function, or it can be fine-tuned on other Go code tasks. It works on unparsed and untokenized Go code, although performance should be better when the code is tokenized.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# Build a summarization pipeline from the fine-tuned checkpoint.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_go_transfer_learning_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_go_transfer_learning_finetune", skip_special_tokens=True),
    device=0,  # first GPU; use device=-1 to run on CPU
)

# A Go function in the space-separated, tokenized form the model expects.
tokenized_code = "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"
pipeline([tokenized_code])
```
Run this example in the [Colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/go/base_model.ipynb).
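As noted under "Intended uses & limitations", tokenized input tends to work best. Below is a minimal sketch of one way to pre-tokenize raw Go code before calling the pipeline; the regex split is an illustrative assumption (it is not the tokenizer used to build the training data), and the snippet reuses the `pipeline` object created above.
```python
import re

# Match identifiers/numbers first, then common multi-character Go operators,
# then any single punctuation character, so tokens like ">=" and "&&" stay intact.
_TOKEN_RE = re.compile(r"\w+|>=|<=|==|!=|&&|\|\||[^\w\s]")

def naive_go_tokenize(code: str) -> str:
    """Space-separate tokens to roughly match the style of the training examples."""
    return " ".join(_TOKEN_RE.findall(code))

raw_code = ("func (pr *Progress) needSnapshotAbort() bool "
            "{ return pr.State == ProgressStateSnapshot && pr.Match >= pr.PendingSnapshot }")
pipeline([naive_go_tokenize(raw_code)])
```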
## Training data
The supervised training task datasets can be downloaded from [this link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1).
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used for pre-training is AdaFactor with an inverse square root learning rate schedule.
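For reference, the inverse square root schedule is commonly implemented as in the sketch below; the 10,000-step warm-up is the usual T5 default and is an assumption here, since the card does not state the value used.
```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Constant at warmup_steps ** -0.5 during warm-up, then decays as step ** -0.5.
    return max(step, warmup_steps) ** -0.5
```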
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing Go code.
## Evaluation results
For the code documentation tasks, the different models achieve the following results on different programming languages (in BLEU score); a minimal sketch of computing such scores follows the table below.
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
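The sketch below shows one way such BLEU scores can be computed with the `sacrebleu` package; the package choice and the `test_functions` / `gold_docstrings` names are illustrative assumptions, as the card does not specify the evaluation script used.
```python
import sacrebleu

# test_functions: tokenized Go functions; gold_docstrings: their reference
# descriptions. Both are hypothetical placeholders for a held-out test set.
hypotheses = [out["summary_text"] for out in pipeline(test_functions)]
references = [gold_docstrings]  # sacrebleu expects a list of reference streams

print(sacrebleu.corpus_bleu(hypotheses, references).score)
```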
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_go_transfer_learning_finetune | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation go
====================================================
Pretrained model on the Go programming language using the T5 base model architecture. It was first released in
this repository. This model is trained on tokenized Go functions and works best with tokenized Go input.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain and was then fine-tuned on the code documentation generation task for Go functions/methods.
Intended uses & limitations
---------------------------
The model can be used to generate a description for a Go function, or it can be fine-tuned on other Go code tasks. It works on unparsed and untokenized Go code, although performance should be better when the code is tokenized.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
Run this example in the Colab notebook.
Training data
-------------
The supervised training task datasets can be downloaded from Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used for pre-training is AdaFactor with an inverse square root learning rate schedule.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing Go code.
Evaluation results
------------------
For the code documentation tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
| [
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
60,
87,
107
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.12570014595985413,
0.023297645151615143,
-0.0006885755574330688,
0.09114361554384232,
0.03857513517141342,
0.027795813977718353,
0.053068794310092926,
0.10099451243877411,
-0.020206207409501076,
0.06784006208181381,
0.033973824232816696,
-0.0687367394566536,
0.06602361053228378,
0.2078540176153183,
0.01828087493777275,
-0.12425848096609116,
-0.022189779207110405,
0.044891756027936935,
-0.0918470099568367,
0.10742461681365967,
0.07156406342983246,
-0.09251435101032257,
0.07445237785577774,
-0.03295157477259636,
-0.13284708559513092,
0.02587813511490822,
-0.023461923003196716,
-0.010174536146223545,
0.1015775054693222,
0.07454608380794525,
0.12192028760910034,
-0.01192662212997675,
0.06458315998315811,
-0.19436582922935486,
0.004567568656057119,
0.02068379521369934,
0.06463635712862015,
0.046754851937294006,
0.04468593746423721,
0.08445153385400772,
0.09783090651035309,
-0.01259192731231451,
0.03648710623383522,
0.05403623357415199,
-0.06511186808347702,
-0.05957464501261711,
-0.08260788768529892,
0.09401800483465195,
0.08065157383680344,
0.09934262186288834,
-0.006631313357502222,
0.043866802006959915,
-0.0785602256655693,
0.08516692370176315,
0.12118088454008102,
-0.23356305062770844,
-0.023296548053622246,
0.12128732353448868,
0.0992652028799057,
0.040105681866407394,
-0.07079761475324631,
-0.03226786479353905,
0.1070353165268898,
0.028771504759788513,
0.043504130095243454,
-0.07274061441421509,
0.02638331614434719,
-0.010830538347363472,
-0.05998538061976433,
-0.04160919412970543,
0.16888630390167236,
0.04223606735467911,
-0.05989727005362511,
-0.10351281613111496,
-0.03681958466768265,
-0.19528117775917053,
0.03855838254094124,
-0.0007563576218672097,
-0.005837328266352415,
-0.009429492056369781,
-0.008215622045099735,
-0.0007918885676190257,
-0.08969590812921524,
-0.11920224130153656,
0.023686546832323074,
0.03511790186166763,
0.05900946259498596,
0.03253184258937836,
-0.05694509297609329,
0.0854811891913414,
0.051089804619550705,
-0.04445469379425049,
-0.020895391702651978,
0.015555350109934807,
-0.12447445839643478,
-0.008902361616492271,
-0.017426058650016785,
-0.06951245665550232,
-0.013894581235945225,
0.07787049561738968,
-0.08467450737953186,
0.07880682498216629,
0.08622739464044571,
0.024303752928972244,
0.01910999044775963,
0.21067695319652557,
0.046227023005485535,
-0.14930714666843414,
0.027855686843395233,
0.021104300394654274,
0.005300446879118681,
0.006666332017630339,
-0.04817542806267738,
-0.04386560991406441,
0.02707425132393837,
0.06500180065631866,
-0.1159437820315361,
0.03021889552474022,
-0.06746183335781097,
-0.02472343109548092,
0.08050050586462021,
-0.12884320318698883,
0.03274675831198692,
0.021440938115119934,
-0.04750095680356026,
-0.03514675796031952,
0.09799106419086456,
-0.1338077336549759,
-0.10883084684610367,
0.037802666425704956,
-0.0439131073653698,
-0.03860766068100929,
-0.11806999891996384,
-0.10325297713279724,
-0.000642686034552753,
-0.008345912210643291,
-0.006499012466520071,
-0.09247918426990509,
-0.07925833761692047,
-0.013113044202327728,
0.046552252024412155,
0.003110966645181179,
-0.03544667735695839,
-0.036091070622205734,
0.00691327266395092,
-0.007902695797383785,
-0.017755260691046715,
0.014167669229209423,
-0.023952268064022064,
0.09733515232801437,
0.07617243379354477,
0.04631776735186577,
-0.007231603376567364,
0.026806654408574104,
-0.07712776958942413,
0.08224474638700485,
-0.1318359524011612,
0.06841222941875458,
-0.012559865601360798,
0.04324454814195633,
-0.09703425318002701,
-0.06851263344287872,
-0.0008934384677559137,
0.04971007630228996,
0.0838271826505661,
0.03645031526684761,
-0.1263534277677536,
0.015173542313277721,
0.15134094655513763,
-0.12928327918052673,
-0.14889314770698547,
0.10801317542791367,
-0.012742731720209122,
0.05479966849088669,
0.06326363980770111,
0.11010254919528961,
0.14958520233631134,
-0.10149305313825607,
-0.0389510840177536,
0.061465099453926086,
0.04646308347582817,
-0.07144347578287125,
0.05151139572262764,
0.03010513260960579,
-0.030326738953590393,
0.01442589983344078,
0.05155180022120476,
0.05791294574737549,
-0.011623797006905079,
-0.046619780361652374,
-0.03048122487962246,
-0.09437470138072968,
-0.060543693602085114,
-0.014785378240048885,
0.027864066883921623,
-0.04181162267923355,
-0.05549208074808121,
-0.0007418070454150438,
0.1618446707725525,
-0.0993368923664093,
0.03587659075856209,
-0.09541255980730057,
-0.026941880583763123,
-0.0713806301355362,
0.030654683709144592,
-0.1303960084915161,
0.015898127108812332,
0.058867573738098145,
-0.05298744514584541,
0.051579125225543976,
0.07204317301511765,
-0.0017605498433113098,
0.021357174962759018,
-0.0612853541970253,
-0.032384805381298065,
-0.046211518347263336,
-0.07278521358966827,
-0.11168009042739868,
-0.048791613429784775,
-0.09774541109800339,
-0.029748711735010147,
-0.04370551556348801,
-0.17325885593891144,
-0.0010596257634460926,
0.017371581867337227,
0.02724648080766201,
0.026481611654162407,
-0.04743357375264168,
0.013211586512625217,
0.053964462131261826,
-0.05036868900060654,
-0.06817150115966797,
0.019046463072299957,
0.04920768365263939,
-0.10633603483438492,
-0.03192462772130966,
-0.09053237736225128,
-0.08916187286376953,
0.0881325975060463,
0.09730928391218185,
-0.13337844610214233,
-0.021087758243083954,
-0.02977503463625908,
-0.05187995359301567,
-0.05480535700917244,
-0.05708212032914162,
0.16125047206878662,
0.018219515681266785,
0.14354661107063293,
-0.1324903666973114,
-0.06526708602905273,
-0.02606535516679287,
0.023633413016796112,
0.031247297301888466,
0.13875004649162292,
0.03682373836636543,
-0.10586686432361603,
0.030406367033720016,
-0.032757289707660675,
-0.04554641619324684,
0.15178771317005157,
-0.023054422810673714,
-0.061380185186862946,
-0.0060569969937205315,
0.11388351768255234,
0.011322497390210629,
0.21795065701007843,
-0.05825325846672058,
0.004515432752668858,
-0.009270106442272663,
0.010507429018616676,
0.04476273059844971,
-0.1281786412000656,
0.027802404016256332,
0.033726949244737625,
-0.06351425498723984,
-0.04108218848705292,
-0.026928482577204704,
-0.04092559590935707,
0.04040159285068512,
0.01896940916776657,
0.049551039934158325,
-0.0077761150896549225,
-0.034328900277614594,
-0.11450324207544327,
0.17542433738708496,
-0.058658208698034286,
-0.20369921624660492,
-0.1657722145318985,
0.10093239694833755,
-0.019640345126390457,
-0.019308973103761673,
0.028079291805624962,
-0.08620012551546097,
-0.055453281849622726,
-0.08592861890792847,
0.1411411017179489,
-0.09535940736532211,
0.008522934280335903,
-0.0004026455571874976,
0.054660260677337646,
0.05785836651921272,
-0.16145099699497223,
0.032499801367521286,
-0.026105130091309547,
0.020140457898378372,
-0.004685079213231802,
-0.05986160784959793,
0.0782167837023735,
0.11298669874668121,
-0.05682311952114105,
0.022087810561060905,
-0.006123469676822424,
0.16688843071460724,
-0.07129213958978653,
0.049526043236255646,
0.16432930529117584,
-0.0056761931627988815,
0.02626330778002739,
0.05912663787603378,
0.007785039022564888,
-0.10230713337659836,
0.06331949681043625,
0.041520047932863235,
-0.04354042559862137,
-0.21613115072250366,
-0.03251210227608681,
-0.08161794394254684,
0.060349155217409134,
0.105542853474617,
0.04076828435063362,
-0.1574031561613083,
0.03042764775454998,
-0.009822499006986618,
0.16078390181064606,
-0.01701013371348381,
0.05716291815042496,
0.005986104719340801,
0.022277791053056717,
0.008505648002028465,
-0.10126793384552002,
0.012022197246551514,
0.07180969417095184,
0.09713921695947647,
0.21226832270622253,
-0.08539732545614243,
0.1569855809211731,
0.007881389930844307,
0.12342324107885361,
0.05063767358660698,
0.11006845533847809,
-0.11964020878076553,
0.013678435236215591,
0.008754640817642212,
-0.019869426265358925,
-0.06094885990023613,
0.048687644302845,
-0.042135100811719894,
0.07249100506305695,
-0.06795520335435867,
0.0273990947753191,
0.01699867844581604,
0.2017800360918045,
0.0714147612452507,
-0.1635221242904663,
-0.14589671790599823,
0.003628678619861603,
-0.0684565007686615,
-0.09068696200847626,
0.060676876455545425,
0.20809641480445862,
-0.055758122354745865,
0.0264588575810194,
-0.018870534375309944,
0.13048619031906128,
-0.09545867890119553,
-0.020413300022482872,
0.03526194766163826,
0.05981956794857979,
0.0043940297327935696,
0.1034533679485321,
-0.2570081353187561,
0.08760299533605576,
0.012264463119208813,
0.08451841026544571,
-0.026377227157354355,
0.055985480546951294,
-0.04187210649251938,
-0.00040886978968046606,
0.07585091143846512,
0.01781470514833927,
-0.05414023622870445,
-0.19654108583927155,
-0.05179501324892044,
0.026000777259469032,
0.05664263293147087,
-0.013108034618198872,
0.08422847837209702,
-0.01522531546652317,
0.048445478081703186,
-0.022655000910162926,
-0.10297121852636337,
-0.06789418309926987,
-0.12198488414287567,
-0.03440576419234276,
-0.005216774996370077,
-0.04580439627170563,
-0.027171721681952477,
0.03308014199137688,
0.03762782737612724,
0.24909131228923798,
-0.1568084955215454,
-0.05774940177798271,
-0.0959431603550911,
0.06403715908527374,
0.12006383389234543,
-0.09357606619596481,
0.014235327951610088,
0.021792815998196602,
0.06304506957530975,
-0.04943143576383591,
-0.0817866399884224,
0.04182705283164978,
-0.0645851269364357,
-0.09621388465166092,
-0.03752704709768295,
0.1041102483868599,
0.009006441570818424,
0.04455077275633812,
0.010210515931248665,
-0.08649270981550217,
-0.018836399540305138,
-0.13160625100135803,
-0.06371978670358658,
-0.016577448695898056,
0.022912662476301193,
-0.005037431605160236,
-0.1384403556585312,
0.05316279083490372,
-0.009075186215341091,
-0.09179501980543137,
0.05654367804527283,
0.17951704561710358,
-0.07467164099216461,
0.032021839171648026,
0.07505226135253906,
-0.054896436631679535,
-0.20094263553619385,
-0.02540808729827404,
0.048458896577358246,
0.08706426620483398,
-0.02084268443286419,
-0.14485812187194824,
0.08357951045036316,
-0.01037683617323637,
0.009525843895971775,
0.014328046701848507,
-0.22835080325603485,
-0.13488031923770905,
0.010122866369783878,
0.07384498417377472,
0.05697430297732353,
-0.08937577903270721,
-0.048618923872709274,
-0.05150653049349785,
-0.04397239536046982,
0.07310114055871964,
0.08457719534635544,
0.10232943296432495,
-0.025737829506397247,
0.022393345832824707,
0.04188867285847664,
-0.030340004712343216,
0.053671129047870636,
-0.005966612603515387,
0.09891603887081146,
-0.01719927042722702,
0.008416324853897095,
0.05045892670750618,
-0.06004800274968147,
0.16828016936779022,
-0.14753064513206482,
0.0972415879368782,
-0.15807946026325226,
-0.0336211696267128,
-0.02969004213809967,
0.0033804415725171566,
-0.04232991486787796,
-0.0413038432598114,
-0.13235057890415192,
0.04838331416249275,
0.05543600022792816,
-0.030749989673495293,
0.047058653086423874,
-0.00328086712397635,
-0.04924384877085686,
0.06224939227104187,
0.083519347012043,
-0.0026617730036377907,
-0.11707495152950287,
0.03314971923828125,
0.01223448384553194,
0.10510998964309692,
-0.15458473563194275,
0.034018535166978836,
0.10441606491804123,
0.01782197877764702,
0.09068788588047028,
0.02065982297062874,
-0.10144516825675964,
0.02527572028338909,
0.07111184298992157,
-0.07080928981304169,
-0.07048394531011581,
-0.016289619728922844,
-0.028597164899110794,
-0.08699396252632141,
0.04538453742861748,
0.0921717956662178,
-0.050150077790021896,
-0.00007911510328995064,
-0.004407679196447134,
0.012536554597318172,
-0.07957903295755386,
0.16814640164375305,
0.006067619193345308,
0.08450807631015778,
-0.057147253304719925,
0.07477925717830658,
0.09448486566543579,
-0.11204579472541809,
0.032034166157245636,
0.12729768455028534,
-0.09121352434158325,
-0.011117838323116302,
0.09574940800666809,
0.12581323087215424,
-0.02758811227977276,
-0.04795460402965546,
-0.09588290750980377,
-0.09219622611999512,
0.017136195674538612,
0.07329368591308594,
0.07242756336927414,
0.08984718471765518,
-0.017837295308709145,
0.008457351475954056,
-0.12723296880722046,
0.09580473601818085,
0.07871198654174805,
0.049537647515535355,
-0.12603159248828888,
0.13761234283447266,
0.0384829007089138,
0.07754445821046829,
-0.00643963785842061,
0.0286290030926466,
-0.13163666427135468,
0.03311247006058693,
-0.02289886772632599,
0.03491179272532463,
-0.012881291098892689,
0.03897371515631676,
-0.04512248933315277,
0.031109005212783813,
-0.02873460203409195,
0.045737601816654205,
-0.039426978677511215,
-0.02193157561123371,
-0.035963673144578934,
0.018005430698394775,
-0.06214923784136772,
-0.014253707602620125,
0.01755751296877861,
-0.09731610864400864,
0.09528407454490662,
-0.05108684301376343,
-0.0045665232464671135,
-0.005089580547064543,
0.03274267911911011,
0.048531752079725266,
0.0042256442829966545,
0.054148200899362564,
-0.009895769879221916,
-0.02926625870168209,
0.017270272597670555,
0.024969615042209625,
-0.015163743868470192,
0.004188677296042442,
0.10920248180627823,
-0.13084594905376434,
-0.07654653489589691,
-0.09398427605628967,
-0.06492024660110474,
-0.05594870448112488,
0.07478483766317368,
0.07773608714342117,
0.0810377225279808,
0.0838666707277298,
-0.03686503320932388,
0.005068640224635601,
-0.17228716611862183,
-0.0419037789106369,
0.05417352169752121,
0.006879512686282396,
-0.12066897004842758,
-0.042135074734687805,
0.06738526374101639,
-0.03533897176384926,
0.11398535221815109,
-0.03796479105949402,
0.022846201434731483,
-0.007007373031228781,
-0.058994077146053314,
-0.06463214010000229,
0.006625378970056772,
0.17723317444324493,
-0.0984129011631012,
0.01347300224006176,
0.0030631658155471087,
0.00712934322655201,
0.01929810829460621,
0.15295076370239258,
0.1399673968553543,
0.1192636713385582,
0.019512144848704338,
0.08261680603027344,
-0.03732607513666153,
-0.03250456228852272,
-0.10841316729784012,
0.06280586123466492,
-0.056980013847351074,
0.027069687843322754,
-0.02384425513446331,
0.1424751728773117,
0.07102405279874802,
-0.14525963366031647,
0.10694262385368347,
-0.0016350055811926723,
-0.09533172100782394,
-0.03669887036085129,
-0.0961361974477768,
-0.03812716528773308,
-0.10527011752128601,
0.0038980948738753796,
-0.10276616364717484,
-0.020700572058558464,
0.048556793481111526,
0.0303468257188797,
-0.025082863867282867,
0.16857469081878662,
-0.06518919765949249,
-0.04172980785369873,
0.02793905697762966,
0.050455715507268906,
0.009118343703448772,
0.08576113730669022,
0.02076122537255287,
0.05977568030357361,
-0.044920664280653,
0.07033723592758179,
0.03630270063877106,
-0.004580911248922348,
0.02975565567612648,
0.050413839519023895,
-0.01117019634693861,
-0.042211465537548065,
-0.023640690371394157,
0.0895427018404007,
0.13143792748451233,
0.03234775736927986,
-0.02447773516178131,
-0.05608067288994789,
0.16760039329528809,
-0.054876990616321564,
-0.053174588829278946,
-0.1285189390182495,
0.1760130077600479,
0.020879097282886505,
0.002646405017003417,
0.018335917964577675,
-0.07764153927564621,
-0.02186083421111107,
0.2577205002307892,
0.05801311507821083,
-0.05547294393181801,
-0.021517254412174225,
-0.001437376718968153,
-0.007534054573625326,
-0.03704468160867691,
0.1443730592727661,
0.008998055942356586,
0.24279457330703735,
0.01999533921480179,
-0.011466351337730885,
-0.050163377076387405,
-0.05362953990697861,
0.012759355828166008,
0.19094473123550415,
-0.035582076758146286,
0.02245558798313141,
-0.10981308668851852,
-0.007240002509206533,
0.012330939061939716,
-0.15624207258224487,
0.13555045425891876,
-0.1431177854537964,
-0.07317416369915009,
0.009683214128017426,
0.06644438207149506,
-0.05775786563754082,
0.04320564121007919,
-0.02234184555709362,
0.07197079807519913,
0.05061761662364006,
-0.02333490364253521,
-0.09058194607496262,
-0.14052121341228485,
0.05290709808468819,
-0.009667651727795601,
0.12336958199739456,
0.017016706988215446,
0.09010908752679825,
0.07969614863395691,
0.020404115319252014,
-0.07509525865316391,
0.0682322308421135,
0.027184896171092987,
-0.015018651261925697,
0.03522085025906563,
0.12172319740056992,
-0.05204414576292038,
0.16648370027542114,
0.017693225294351578,
-0.029397841542959213,
-0.02430609054863453,
-0.04467213153839111,
-0.01319895964115858,
-0.14966732263565063,
-0.002367348177358508,
-0.06585433334112167,
0.1449221819639206,
0.19627079367637634,
-0.048312246799468994,
-0.017815392464399338,
-0.04775235801935196,
0.09371544420719147,
-0.009849989786744118,
0.08485991507768631,
0.004816785454750061,
-0.18520817160606384,
0.017850110307335854,
-0.04390615597367287,
0.009088069200515747,
-0.1856856495141983,
-0.06732716411352158,
-0.02923526242375374,
-0.037270329892635345,
-0.09666009247303009,
0.13832826912403107,
0.07407690584659576,
0.03227897733449936,
-0.04834919050335884,
-0.1128372773528099,
-0.012274645268917084,
0.04570266976952553,
-0.11977818608283997,
-0.12178845703601837
] |
null | null | transformers |
# CodeTrans model for code documentation generation java
Pretrained model on the Java programming language using the T5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized Java functions and works best with tokenized Java input.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model and used single-task training on the CodeSearchNet Corpus Java dataset.
## Intended uses & limitations
The model can be used to generate a description for a Java function, or it can be fine-tuned on other Java code tasks. It works on unparsed and untokenized Java code, although performance should be better when the code is tokenized.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# Build a summarization pipeline from the fine-tuned checkpoint.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_java"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_java", skip_special_tokens=True),
    device=0,  # first GPU; use device=-1 to run on CPU
)

# A Java method in the space-separated, tokenized form the model expects.
tokenized_code = "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"
pipeline([tokenized_code])
```
Run this example in the [Colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/function%20documentation%20generation/java/base_model.ipynb).
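Because the pipeline accepts a list of inputs, several tokenized Java methods can be documented in a single call. A minimal sketch, reusing the `pipeline` object from above (the second method is a hypothetical extra input):
```python
tokenized_methods = [
    tokenized_code,  # the castFunction example from above
    "public boolean isEmpty ( ) { return size ( ) == 0 ; }",  # hypothetical input
]
for result in pipeline(tokenized_methods):
    print(result["summary_text"])
```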
## Training data
The supervised training task datasets can be downloaded from [this link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1).
## Evaluation results
For the code documentation tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_java | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation java
======================================================
Pretrained model on the Java programming language using the T5 base model architecture. It was first released in
this repository. This model is trained on tokenized Java functions and works best with tokenized Java input.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model and used single-task training on the CodeSearchNet Corpus Java dataset.
Intended uses & limitations
---------------------------
The model can be used to generate a description for a Java function, or it can be fine-tuned on other Java code tasks. It works on unparsed and untokenized Java code, although performance should be better when the code is tokenized.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in the Colab notebook.
Training data
-------------
The supervised training task datasets can be downloaded from Link
Evaluation results
------------------
For the code documentation tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
| [
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
112
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.0875348374247551,
0.013697436079382896,
-0.0003455282421782613,
0.06394645571708679,
0.12497641891241074,
-0.003056386485695839,
0.07015896588563919,
0.0622900165617466,
0.008274346590042114,
-0.0482400581240654,
0.08769276738166809,
0.1497582495212555,
0.02762714959681034,
0.1312154233455658,
-0.03489409759640694,
-0.20416918396949768,
-0.0008053651545196772,
0.06260603666305542,
-0.1364232748746872,
0.12527789175510406,
0.13377933204174042,
-0.05561942979693413,
0.1017858013510704,
-0.009229477494955063,
-0.22977487742900848,
0.06711380928754807,
-0.024753574281930923,
-0.08570939302444458,
0.1276760995388031,
0.08113759756088257,
0.10691884160041809,
0.037618692964315414,
0.002452101558446884,
-0.22514183819293976,
0.0342220775783062,
-0.03248739242553711,
0.015348796732723713,
0.05422268807888031,
0.04324764385819435,
-0.03354261815547943,
0.19429150223731995,
-0.003324622754007578,
0.008941693231463432,
0.05411025509238243,
-0.10770773887634277,
-0.09157682210206985,
-0.017219066619873047,
-0.012689750641584396,
0.09100747853517532,
0.07052944600582123,
0.021522730588912964,
0.12098428606987,
-0.13321104645729065,
0.13232064247131348,
0.09553149342536926,
-0.15953649580478668,
-0.022312473505735397,
0.12433511018753052,
0.09789133816957474,
-0.04756445810198784,
-0.058550458401441574,
0.006124221254140139,
0.07550051063299179,
0.024089157581329346,
0.044558025896549225,
-0.14467084407806396,
-0.21428675949573517,
0.07047666609287262,
-0.05522088706493378,
-0.054558660835027695,
0.28957539796829224,
-0.001740014529787004,
-0.03579780086874962,
-0.04674946889281273,
-0.02384253218770027,
0.037143558263778687,
0.001583861536346376,
-0.01219947636127472,
0.013403519056737423,
-0.006495804525911808,
0.0001787974761100486,
-0.020048044621944427,
-0.1046537458896637,
-0.12809570133686066,
0.012318591587245464,
0.0632692351937294,
-0.008053308352828026,
0.028492100536823273,
-0.1689462959766388,
0.09569334238767624,
0.08020750433206558,
-0.09261907637119293,
0.02069508098065853,
-0.06649632006883621,
-0.01483930740505457,
-0.015727058053016663,
-0.04498734325170517,
-0.16703221201896667,
0.090079665184021,
0.0329529233276844,
-0.06404729187488556,
0.0523817278444767,
0.008363490924239159,
0.07629864662885666,
0.05736444145441055,
0.17173059284687042,
-0.005652987863868475,
-0.07473986595869064,
0.048203229904174805,
-0.026784855872392654,
-0.06084217131137848,
0.012999911792576313,
-0.07063441723585129,
-0.03990170732140541,
0.013609337620437145,
0.12503035366535187,
-0.10864166170358658,
0.07088617980480194,
-0.06570059806108475,
-0.034918107092380524,
0.013724357821047306,
-0.13763315975666046,
-0.028480611741542816,
0.003789276583120227,
-0.06344790011644363,
-0.04812151566147804,
0.1105416938662529,
-0.056825656443834305,
-0.11266995966434479,
-0.03793490305542946,
-0.07759636640548706,
-0.002286880975589156,
-0.10701865702867508,
-0.07657937705516815,
0.017111066728830338,
0.04218428209424019,
0.06876492500305176,
-0.11328937113285065,
-0.18311932682991028,
-0.00378196919336915,
0.0859009325504303,
-0.009058515541255474,
0.043713636696338654,
-0.09632625430822372,
-0.02627558447420597,
-0.03624427691102028,
-0.023713121190667152,
0.06400034576654434,
-0.06666994839906693,
0.07964087277650833,
0.08765355497598648,
0.05573558062314987,
-0.06110719218850136,
0.05496655032038689,
-0.1368434876203537,
0.06887786090373993,
-0.17572492361068726,
0.09287060797214508,
-0.04748747497797012,
0.12308106571435928,
-0.10658205300569534,
-0.056166231632232666,
0.04451017454266548,
0.06461703032255173,
0.051738351583480835,
0.1248263269662857,
-0.14438219368457794,
-0.03378137946128845,
0.1368870586156845,
-0.10836703330278397,
-0.21830464899539948,
0.06135636940598488,
-0.07434312254190445,
0.21410374343395233,
0.048365335911512375,
0.19819733500480652,
0.14343023300170898,
-0.02868485078215599,
0.07107708603143692,
0.09276288002729416,
-0.04263358190655708,
-0.08118170499801636,
0.061137605458498,
0.06848236173391342,
-0.13355247676372528,
0.06377530097961426,
-0.028956551104784012,
0.10540744662284851,
-0.03231040760874748,
-0.04164525493979454,
-0.010478834621608257,
-0.06200810521841049,
0.016047755256295204,
-0.004955985117703676,
0.08650654554367065,
-0.006605913396924734,
0.01244751363992691,
0.0616019144654274,
0.1054123044013977,
-0.12497183680534363,
-0.006952561903744936,
-0.093503437936306,
0.028618594631552696,
-0.11639437079429626,
0.032513875514268875,
-0.21166153252124786,
0.027220679447054863,
0.01945713721215725,
0.011345173232257366,
0.026673752814531326,
0.04674104228615761,
0.002225014613941312,
0.008788947947323322,
0.012042907066643238,
-0.00012668300769291818,
0.012038164772093296,
-0.01339554600417614,
-0.028828123584389687,
-0.10568856447935104,
-0.048531509935855865,
-0.05472125485539436,
-0.018339067697525024,
-0.1854812502861023,
-0.007206879090517759,
0.03074919991195202,
0.06931772083044052,
0.030629904940724373,
0.037145815789699554,
0.050374578684568405,
0.06326886266469955,
-0.047165270894765854,
-0.020603148266673088,
0.06363126635551453,
0.022122951224446297,
-0.09089092165231705,
0.08014687150716782,
-0.05041682347655296,
0.0392286479473114,
0.09161380678415298,
-0.16121985018253326,
-0.0484076589345932,
-0.04355262964963913,
-0.03548724204301834,
-0.03213665261864662,
0.005714211147278547,
-0.02016386389732361,
0.19718383252620697,
-0.00277286721393466,
0.17384465038776398,
-0.1252789944410324,
-0.056693870574235916,
-0.03044029138982296,
-0.018076809123158455,
0.02996285818517208,
0.1407017558813095,
0.08248013257980347,
-0.2236068695783615,
0.0575043261051178,
0.08426006138324738,
-0.021634353324770927,
0.214022696018219,
-0.041451066732406616,
-0.02952142432332039,
-0.030555350705981255,
0.06863059848546982,
-0.04190429672598839,
0.14710868895053864,
-0.22171849012374878,
-0.03171179071068764,
0.019685082137584686,
-0.007631672080606222,
0.1122758537530899,
-0.11735276877880096,
-0.002150582382455468,
0.01657246984541416,
-0.03956224396824837,
-0.09183894842863083,
0.04511590301990509,
0.003761471714824438,
0.029469814151525497,
-0.006956758908927441,
-0.016358038410544395,
0.034719642251729965,
-0.03919856995344162,
-0.11604341864585876,
0.23168928921222687,
-0.08178829401731491,
-0.26040002703666687,
-0.19569189846515656,
0.07312697917222977,
-0.013156517408788204,
-0.009771275334060192,
0.05569472163915634,
-0.03973342850804329,
-0.05276770517230034,
-0.043325118720531464,
0.11012377589941025,
-0.028776198625564575,
-0.048147059977054596,
-0.010282251052558422,
0.08146045356988907,
-0.00270624621771276,
-0.19431070983409882,
-0.013962035067379475,
0.02327197976410389,
0.07291260361671448,
0.0076109846122562885,
-0.13866771757602692,
0.10433061420917511,
0.07624371349811554,
-0.05397311970591545,
0.04713505133986473,
-0.02445952594280243,
0.20842666923999786,
-0.06126997619867325,
-0.06025610491633415,
0.16236791014671326,
-0.09642329066991806,
-0.003352442290633917,
0.028680074959993362,
0.003236632328480482,
-0.1170538067817688,
0.03786719590425491,
-0.039697349071502686,
-0.05638670548796654,
-0.25277459621429443,
-0.08142433315515518,
-0.08600790798664093,
0.09921654313802719,
0.02161657251417637,
0.026387520134449005,
-0.07102641463279724,
0.053963709622621536,
0.08214443922042847,
0.14172638952732086,
-0.0023885478731244802,
0.0621887668967247,
0.0461786612868309,
0.00000787666613177862,
-0.005724714137613773,
-0.11177843809127808,
-0.04886482656002045,
0.029043223708868027,
0.0967540293931961,
0.1904471069574356,
-0.0007648671162314713,
0.1683170348405838,
0.07690630853176117,
0.04392387717962265,
0.03241889551281929,
0.17099055647850037,
-0.12204890698194504,
0.019687142223119736,
-0.017568619921803474,
-0.047731269150972366,
-0.13728395104408264,
0.02280402183532715,
-0.07212063670158386,
0.061882730573415756,
-0.1281474381685257,
-0.057739004492759705,
0.06905921548604965,
0.09702833741903305,
-0.014093056321144104,
-0.25188183784484863,
-0.11165375262498856,
0.03986385837197304,
-0.07736600190401077,
-0.07006209343671799,
0.05375116318464279,
0.17215187847614288,
-0.12860530614852905,
-0.015176679007709026,
-0.04420911520719528,
0.16267751157283783,
-0.0767742246389389,
0.03270037844777107,
-0.04839164763689041,
-0.03533410280942917,
0.01931367628276348,
0.1675824224948883,
-0.21182399988174438,
0.23328521847724915,
0.005609455052763224,
0.029242709279060364,
-0.06598269194364548,
0.03204849734902382,
0.0026204015593975782,
0.09018553048372269,
0.1278182566165924,
-0.017440814524888992,
-0.037498705089092255,
-0.1433279663324356,
0.042860161513090134,
0.088055320084095,
0.0523323230445385,
-0.026728661730885506,
0.057409390807151794,
-0.022015240043401718,
0.022249603644013405,
-0.01924106292426586,
-0.07525847852230072,
-0.10074827075004578,
-0.09780948609113693,
-0.003032986307516694,
-0.03312069922685623,
0.05907544866204262,
-0.026102658361196518,
0.026925021782517433,
0.10447284579277039,
0.17686131596565247,
-0.08071305602788925,
-0.05718423053622246,
-0.10008548200130463,
0.02711847424507141,
0.1261013001203537,
-0.07769280672073364,
-0.010893245227634907,
0.0021051305811852217,
0.03895604982972145,
0.0037654642947018147,
-0.13072633743286133,
0.05351848527789116,
-0.06955447793006897,
0.00780505733564496,
-0.030631467700004578,
0.09126220643520355,
-0.018012814223766327,
-0.01911618560552597,
0.06623081117868423,
-0.0735074058175087,
-0.055885396897792816,
-0.14300747215747833,
-0.10915983468294144,
-0.04150675609707832,
0.06663099676370621,
0.024734467267990112,
-0.14429457485675812,
0.024876385927200317,
-0.004918987862765789,
-0.03161003440618515,
0.20293238759040833,
0.08884996175765991,
-0.02574251964688301,
0.02489391341805458,
0.1639450043439865,
-0.11602018773555756,
-0.2334480732679367,
0.005280734039843082,
-0.03230864554643631,
0.07774461060762405,
0.017152797430753708,
-0.1297965943813324,
0.0966779887676239,
-0.0212701428681612,
0.038408175110816956,
-0.0028145096730440855,
-0.2850101590156555,
-0.0927276760339737,
0.09940031915903091,
0.13139492273330688,
0.07567469775676727,
-0.1134088858962059,
-0.07127122581005096,
-0.08849947154521942,
-0.24174600839614868,
0.16580738127231598,
-0.10594948381185532,
0.0877404436469078,
-0.01802125759422779,
0.05255144089460373,
0.027449941262602806,
-0.056967593729496,
0.10794489830732346,
0.016404593363404274,
0.11107928305864334,
-0.028078187257051468,
-0.1109815165400505,
0.09532010555267334,
-0.03439773619174957,
0.1620626449584961,
-0.11545030772686005,
0.08731027692556381,
-0.22446514666080475,
-0.03686108812689781,
-0.047640107572078705,
0.04783904552459717,
-0.010010837577283382,
-0.0743076354265213,
-0.04631466791033745,
0.02392154559493065,
0.0339743047952652,
0.016516052186489105,
0.12039679288864136,
-0.04230856895446777,
0.0035285227932035923,
0.12763555347919464,
0.14849890768527985,
-0.043803393840789795,
-0.017025446519255638,
0.040696121752262115,
0.03064838983118534,
0.1105346530675888,
-0.2375105619430542,
0.08407941460609436,
0.11292080581188202,
0.015424934215843678,
0.12785789370536804,
0.06999306380748749,
-0.034878864884376526,
0.02407427504658699,
0.08550633490085602,
-0.13883844017982483,
-0.04545171931385994,
-0.06494591385126114,
-0.032525431364774704,
0.011911292560398579,
0.06969189643859863,
0.1323765367269516,
-0.07027289271354675,
-0.012021319009363651,
-0.0032027200795710087,
-0.019216537475585938,
-0.13435979187488556,
0.12141181528568268,
0.04016401618719101,
0.07558362931013107,
-0.085504911839962,
0.08729752153158188,
0.05737169831991196,
-0.14994265139102936,
-0.027014533057808876,
0.13614708185195923,
-0.13091707229614258,
-0.08283495903015137,
-0.0009928954532369971,
0.30476242303848267,
-0.08785919100046158,
-0.09488599002361298,
-0.13568812608718872,
-0.0589756965637207,
-0.00480257673189044,
0.21958544850349426,
0.09407008439302444,
0.09977125376462936,
-0.0452021062374115,
-0.017672209069132805,
-0.1107776090502739,
0.06677959859371185,
0.07870634645223618,
0.010086342692375183,
-0.09087855368852615,
0.08646351844072342,
-0.004493309184908867,
0.14587192237377167,
-0.05127015709877014,
-0.03204577788710594,
-0.16943508386611938,
0.07053176313638687,
-0.1295023113489151,
0.06592816114425659,
-0.07465513795614243,
0.0265691876411438,
0.01240604929625988,
0.008961480110883713,
-0.03397351875901222,
0.055781904608011246,
-0.0832022875547409,
0.012593800202012062,
-0.0011513333301991224,
0.08368167281150818,
-0.07915686815977097,
-0.018968697637319565,
0.08212091028690338,
-0.05834730342030525,
0.0918813943862915,
0.017350368201732635,
-0.06442798674106598,
0.1012454703450203,
-0.18230204284191132,
-0.015219482593238354,
0.04310361668467522,
0.013158856891095638,
0.06185366213321686,
-0.05620328709483147,
0.03308556228876114,
0.03108653984963894,
0.035992447286844254,
-0.01840687356889248,
0.10512319952249527,
-0.13565194606781006,
-0.09874997287988663,
-0.027431296184659004,
-0.11547094583511353,
-0.039944760501384735,
0.04154624044895172,
0.05203212797641754,
0.07814259082078934,
0.08607788383960724,
-0.02199070155620575,
0.03141273930668831,
-0.07487697154283524,
-0.015214871615171432,
0.048350680619478226,
-0.08761196583509445,
-0.0652468279004097,
-0.093665212392807,
0.019157055765390396,
-0.07295798510313034,
0.17917779088020325,
0.010520649142563343,
0.13619844615459442,
-0.011928456835448742,
-0.058892980217933655,
0.0081118643283844,
0.05523201823234558,
0.21566550433635712,
-0.04356808587908745,
0.048623524606227875,
-0.06566096842288971,
0.07222473621368408,
0.01657106727361679,
0.0591760091483593,
0.08517037332057953,
0.13076718151569366,
-0.016010455787181854,
0.11139610409736633,
0.034444570541381836,
0.041151013225317,
-0.04293122515082359,
-0.0702202096581459,
0.09094005823135376,
0.06136952340602875,
-0.04172206297516823,
0.09653545916080475,
0.12303636968135834,
-0.11214438825845718,
0.09893506020307541,
0.0013752031372860074,
-0.10303182899951935,
-0.04028548300266266,
-0.020668353885412216,
-0.04859589412808418,
-0.12325625866651535,
-0.0005443849368020892,
-0.1319676786661148,
-0.03752988949418068,
0.051451023668050766,
0.025116777047514915,
-0.0677337571978569,
0.18952828645706177,
0.007759132422506809,
-0.07555653154850006,
0.06747636198997498,
-0.006039418745785952,
0.01735023222863674,
-0.028778282925486565,
0.08784612268209457,
-0.0061198752373456955,
-0.020769067108631134,
-0.011700782924890518,
0.04509396851062775,
-0.037726566195487976,
-0.011415199376642704,
-0.07268580049276352,
-0.03392927721142769,
-0.03647568076848984,
0.039575133472681046,
-0.004547376651316881,
0.022212669253349304,
0.027247052639722824,
-0.04432417079806328,
0.0019617003854364157,
0.24082204699516296,
-0.043419938534498215,
-0.07224733382463455,
-0.13918447494506836,
0.17177271842956543,
0.05537329241633415,
0.059337515383958817,
0.015530291944742203,
-0.0547466054558754,
-0.035400863736867905,
0.2699294984340668,
0.18216174840927124,
-0.04876147210597992,
-0.0007213094504550099,
0.00996269192546606,
0.0165316890925169,
-0.005329003091901541,
0.12688973546028137,
0.03906630724668503,
0.21224625408649445,
-0.02446792647242546,
-0.05357332527637482,
-0.04532114416360855,
-0.051101215183734894,
0.0354292057454586,
0.11981749534606934,
0.02639816515147686,
-0.04894211143255234,
-0.03498129919171333,
0.09090722352266312,
-0.14609721302986145,
-0.11333595961332321,
0.03274228423833847,
-0.1482396125793457,
-0.08157680928707123,
-0.07629746198654175,
0.050488971173763275,
-0.032418906688690186,
0.04592205956578255,
-0.03576705977320671,
-0.011583673767745495,
0.06371704488992691,
0.0388018861413002,
-0.13238593935966492,
-0.10421805828809738,
0.049209050834178925,
-0.055664222687482834,
0.12114697694778442,
-0.026003092527389526,
0.10079795867204666,
0.09326222538948059,
0.024225132539868355,
-0.05229032039642334,
0.04322533681988716,
0.06295718997716904,
0.0478549487888813,
0.06272850185632706,
0.06944398581981659,
-0.024032380431890488,
0.14079567790031433,
-0.04788777977228165,
-0.119390107691288,
0.03177432343363762,
-0.01158355176448822,
-0.008139731362462044,
-0.11275683343410492,
-0.027326466515660286,
-0.08268268406391144,
0.09143614768981934,
0.1697247177362442,
-0.046309929341077805,
0.017417678609490395,
-0.07789275795221329,
0.1435597538948059,
0.004004329442977905,
-0.015456177294254303,
-0.07638128846883774,
-0.1352270543575287,
-0.019851481541991234,
0.03283907100558281,
-0.016769496724009514,
-0.22979772090911865,
-0.00013190227036830038,
-0.047559067606925964,
-0.015641456469893456,
-0.04228579252958298,
0.10924817621707916,
0.13639593124389648,
0.05022285133600235,
-0.026618963107466698,
-0.17581267654895782,
-0.010601256042718887,
0.06815240532159805,
-0.10393248498439789,
-0.15014556050300598
] |
null | null | transformers |
# CodeTrans model for code documentation generation java
Pretrained model on programming language java using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# Note: AutoModelWithLMHead is deprecated in recent transformers releases;
# AutoModelForSeq2SeqLM is the modern equivalent for this T5 checkpoint.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_java_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_java_multitask", skip_special_tokens=True),
    device=0,  # first GPU; use device=-1 to run on CPU
)

# Input is a whitespace-tokenized java function, as the card recommends.
tokenized_code = "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/java/base_model.ipynb).
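The example above feeds a pre-tokenized function, with identifiers and punctuation separated by single spaces. The exact tokenizer used to prepare the CodeTrans training data is not specified in this card, so the regex splitter below is only an illustrative assumption for turning raw java into that format:
```python
import re

def rough_java_tokenize(code: str) -> str:
    # Split identifiers/keywords, integer literals, and single punctuation
    # characters apart, then re-join them with single spaces.
    tokens = re.findall(r"[A-Za-z_$][A-Za-z0-9_$]*|\d+|\S", code)
    return " ".join(tokens)

raw_code = "public int add(int a, int b) { return a + b; }"
print(rough_java_tokenize(raw_code))
# public int add ( int a , int b ) { return a + b ; }
```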
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 480,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
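For intuition, the inverse square root schedule keeps the learning rate flat through warmup and then decays it as 1/√step. Below is a minimal sketch assuming the common T5-style form; the warmup length is an assumption, not a documented CodeTrans value:
```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # lr(step) = 1 / sqrt(max(step, warmup_steps)): constant during warmup,
    # then decaying proportionally to step ** -0.5.
    return 1.0 / max(step, warmup_steps) ** 0.5

for s in (1, 10_000, 40_000, 480_000):
    print(s, round(inverse_sqrt_lr(s), 6))  # 0.01, 0.01, 0.005, ~0.001443
```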
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_java_multitask | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation java
======================================================
Pretrained model on programming language java using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 480,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 480,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 480,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
61,
143
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 480,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.12625908851623535,
-0.025136111304163933,
-0.0004800467868335545,
0.1329772174358368,
0.10545311868190765,
0.024742096662521362,
0.058886218816041946,
0.06731297075748444,
-0.026749394834041595,
0.019010478630661964,
0.04325202479958534,
0.010033298283815384,
0.03317829221487045,
0.19203782081604004,
0.006131714675575495,
-0.1168055534362793,
-0.014580067247152328,
0.04498186707496643,
-0.03464187681674957,
0.12769803404808044,
0.09435366094112396,
-0.07435121387243271,
0.05373098701238632,
-0.07012023776769638,
-0.24475759267807007,
0.05959472432732582,
-0.004692383110523224,
-0.06324644386768341,
0.09960926324129105,
0.046476397663354874,
0.12625066936016083,
-0.004150141030550003,
0.021399596706032753,
-0.14214487373828888,
0.010713351890444756,
0.011391602456569672,
0.03332711011171341,
0.017729731276631355,
0.04585370793938637,
0.05521157756447792,
0.13669563829898834,
0.011043330654501915,
0.04263927415013313,
0.06180543825030327,
-0.07538974285125732,
-0.11759409308433533,
-0.007635287009179592,
0.02486858144402504,
0.0520443357527256,
0.09958334267139435,
-0.011990239843726158,
0.12245158851146698,
-0.15136699378490448,
0.12841391563415527,
0.10119526088237762,
-0.21889309585094452,
-0.012167822569608688,
0.1254192441701889,
0.09036928415298462,
0.09796377271413803,
-0.059524621814489365,
-0.0666365697979927,
0.10341694205999374,
0.05216624587774277,
0.043303947895765305,
-0.10108000040054321,
-0.10903080552816391,
0.023778226226568222,
-0.07454194873571396,
-0.06427785754203796,
0.22123192250728607,
0.0016500762430951,
-0.07734663039445877,
-0.0543687604367733,
-0.02575811930000782,
-0.1350039839744568,
0.036519333720207214,
0.028614100068807602,
0.0072712767869234085,
-0.03348904848098755,
0.02006683498620987,
0.031045187264680862,
-0.07353420555591583,
-0.15549957752227783,
0.028417186811566353,
0.08940237760543823,
0.056426819413900375,
0.02560744807124138,
-0.09735077619552612,
0.10504303872585297,
0.035071052610874176,
-0.05971665307879448,
-0.026749540120363235,
-0.017192937433719635,
-0.10403761267662048,
0.03320100158452988,
-0.05094814673066139,
-0.18411622941493988,
0.016048984602093697,
0.012252748012542725,
-0.04753049090504646,
0.051711976528167725,
0.027455635368824005,
0.038371436297893524,
0.0214474368840456,
0.1982230842113495,
0.056603893637657166,
-0.12113699316978455,
0.05426353961229324,
0.04412221163511276,
-0.036232445389032364,
-0.005486293695867062,
-0.06959706544876099,
-0.09860879182815552,
0.09425779432058334,
0.10326213389635086,
-0.1372150480747223,
0.03612247109413147,
-0.07137293368577957,
-0.043702855706214905,
0.0006391413044184446,
-0.1586642563343048,
0.003460993990302086,
0.02713293395936489,
-0.06663749366998672,
-0.05411553010344505,
0.09398844093084335,
-0.16951905190944672,
-0.14994613826274872,
-0.04377244785428047,
-0.07949867844581604,
-0.04021994024515152,
-0.16802386939525604,
-0.15642879903316498,
-0.009002278558909893,
-0.03950823098421097,
0.019370034337043762,
-0.08647678047418594,
-0.15814363956451416,
-0.02637692354619503,
0.019731635227799416,
0.004242847673594952,
-0.0022873147390782833,
-0.07823646068572998,
-0.00895591638982296,
-0.029205137863755226,
-0.039027079939842224,
0.013911060988903046,
-0.04753046855330467,
0.12152726948261261,
0.10256288945674896,
0.05487087368965149,
-0.023972300812602043,
0.06101096421480179,
-0.0784740149974823,
0.06522636860609055,
-0.11563316732645035,
0.09513531625270844,
-0.05878845602273941,
0.07968038320541382,
-0.033173080533742905,
-0.10541017353534698,
0.07924834638834,
0.06171056255698204,
0.0665198490023613,
0.035125620663166046,
-0.13915644586086273,
-0.023720769211649895,
0.18277910351753235,
-0.12425358593463898,
-0.13792434334754944,
0.1026422381401062,
-0.03828984126448631,
0.08322964608669281,
0.08237291872501373,
0.14272457361221313,
0.15135981142520905,
-0.024633683264255524,
0.024323945865035057,
0.04935282841324806,
0.04493451863527298,
-0.13403579592704773,
0.07835646718740463,
0.06658630818128586,
-0.08988947421312332,
0.061683110892772675,
-0.016475586220622063,
0.09860582649707794,
-0.011068286374211311,
-0.024621274322271347,
-0.05133240669965744,
-0.07973187416791916,
-0.006075061392039061,
0.008265397511422634,
0.06568735092878342,
-0.08334081619977951,
-0.05890496447682381,
0.09140487015247345,
0.17443835735321045,
-0.1312178373336792,
-0.0020120618864893913,
-0.08077605068683624,
0.03768399730324745,
-0.07685738056898117,
0.02810927852988243,
-0.1614477038383484,
0.03584885969758034,
0.07772760093212128,
-0.02758423052728176,
0.05252065509557724,
0.1316397339105606,
0.013144449330866337,
0.0445614755153656,
0.0015263946261256933,
-0.01477144192904234,
-0.12083344161510468,
-0.056117694824934006,
-0.06358900666236877,
-0.062494877725839615,
-0.08990240097045898,
-0.059530504047870636,
-0.03827532380819321,
-0.1925862729549408,
0.011884447187185287,
0.0024190042167901993,
0.0031542012002319098,
0.027945170179009438,
-0.012897592969238758,
0.029232852160930634,
0.07650481164455414,
-0.060558490455150604,
-0.036501195281744,
0.03281865268945694,
0.023301010951399803,
-0.04166096821427345,
-0.05874509736895561,
-0.08002304285764694,
0.0073761651292443275,
0.10691281408071518,
0.041634876281023026,
-0.07911510020494461,
0.02274022251367569,
-0.020209692418575287,
-0.04926152527332306,
0.009686917066574097,
-0.0644453838467598,
0.14636459946632385,
-0.005943335592746735,
0.19872868061065674,
-0.16429489850997925,
-0.03763729706406593,
-0.024244293570518494,
0.024895068258047104,
0.06249409541487694,
0.13870808482170105,
-0.013018758036196232,
-0.08311334997415543,
0.06515622138977051,
0.01792309805750847,
-0.10079235583543777,
0.23167425394058228,
-0.04728681966662407,
-0.09324251115322113,
0.02266015112400055,
0.10230632871389389,
-0.016481619328260422,
0.1676291972398758,
-0.2037937343120575,
-0.02768973633646965,
0.017479820176959038,
0.007793051190674305,
0.06668803840875626,
-0.12735971808433533,
0.002860683249309659,
0.009320100769400597,
-0.07261746376752853,
-0.06986508518457413,
-0.008713899180293083,
-0.006273619830608368,
0.03821185603737831,
-0.008122164756059647,
-0.0322146974503994,
0.018087485805153847,
-0.039589665830135345,
-0.1064692884683609,
0.2187061905860901,
-0.09676741063594818,
-0.22032079100608826,
-0.20629753172397614,
0.11387703567743301,
-0.06204480305314064,
-0.01345346961170435,
0.035978689789772034,
-0.07900229096412659,
-0.05561254918575287,
-0.05646835267543793,
0.17071256041526794,
-0.06138366460800171,
-0.010941104963421822,
-0.014899303205311298,
0.07609423995018005,
0.010870282538235188,
-0.2093340903520584,
0.03537052124738693,
-0.004445345606654882,
-0.013732175342738628,
0.007151376456022263,
-0.10107721388339996,
0.09153109043836594,
0.15381589531898499,
-0.08218727260828018,
0.020357921719551086,
0.007590852677822113,
0.18883848190307617,
-0.038922347128391266,
-0.05431274697184563,
0.1411723494529724,
-0.018717432394623756,
-0.010226434096693993,
0.01697077415883541,
-0.013188211247324944,
-0.09892197698354721,
0.06373556703329086,
-0.010327504016458988,
-0.02436898462474346,
-0.273200660943985,
-0.007625492289662361,
-0.07957296818494797,
0.05736179277300835,
0.037889886647462845,
0.04134003072977066,
-0.08891265094280243,
0.028384415432810783,
0.06132684648036957,
0.15100082755088806,
-0.005078750196844339,
0.05332179367542267,
0.0576799102127552,
-0.0020898180082440376,
0.007942639291286469,
-0.09900354593992233,
0.01247893925756216,
0.0729178786277771,
0.1119341179728508,
0.2698698937892914,
-0.09904804080724716,
0.19915561378002167,
0.04793638363480568,
0.050123173743486404,
0.04986700043082237,
0.13534978032112122,
-0.13308493793010712,
0.032467570155858994,
0.003389652818441391,
-0.00833326019346714,
-0.11047773063182831,
0.008734997361898422,
-0.0655079185962677,
0.09173490852117538,
-0.10699441283941269,
-0.05839195474982262,
0.010236099362373352,
0.14752601087093353,
0.04282683506608009,
-0.2238348126411438,
-0.1291303187608719,
0.020738327875733376,
-0.09503161162137985,
-0.10642465949058533,
0.06614281237125397,
0.24456770718097687,
-0.07616620510816574,
-0.04042268171906471,
-0.0045422762632369995,
0.13448503613471985,
-0.03776288032531738,
-0.021724838763475418,
-0.0360279306769371,
0.06315318495035172,
0.01623629592359066,
0.13509008288383484,
-0.2973770201206207,
0.1289745569229126,
-0.008337444625794888,
0.06280817836523056,
-0.029777146875858307,
0.049201373010873795,
-0.039392758160829544,
0.07780234515666962,
0.03738342970609665,
-0.00983166228979826,
0.0360453836619854,
-0.16025382280349731,
0.012785029597580433,
0.04177134111523628,
0.016608787700533867,
0.056575145572423935,
0.06296137720346451,
-0.0029487418942153454,
0.057892344892024994,
-0.01911986619234085,
-0.12559062242507935,
-0.07012661546468735,
-0.06543029099702835,
-0.01747133769094944,
-0.02999645285308361,
-0.014812281355261803,
-0.04514969885349274,
-0.010249008424580097,
0.07800386846065521,
0.18375885486602783,
-0.09748639911413193,
-0.07777184993028641,
-0.07483507692813873,
0.050175800919532776,
0.10807822644710541,
-0.08130381256341934,
0.02943793497979641,
-0.002517771441489458,
0.04283895343542099,
-0.00994328036904335,
-0.07467690110206604,
0.052063118666410446,
-0.03838823363184929,
-0.06930925697088242,
-0.012029891833662987,
0.06367725878953934,
-0.0015737962676212192,
0.027555987238883972,
0.01201924029737711,
-0.09572993218898773,
-0.04427060857415199,
-0.11996587365865707,
-0.12668971717357635,
-0.041079387068748474,
0.016394203528761864,
0.043061263859272,
-0.14564920961856842,
-0.057114433497190475,
0.0033239650074392557,
-0.03917403891682625,
0.13064199686050415,
0.15838494896888733,
-0.054691728204488754,
0.031105345115065575,
0.14648541808128357,
-0.06110350042581558,
-0.190225750207901,
0.033707525581121445,
0.045037850737571716,
0.11958883702754974,
-0.043351318687200546,
-0.1637726128101349,
0.04784906655550003,
0.02036207914352417,
0.036562107503414154,
0.053392939269542694,
-0.31137293577194214,
-0.12486202269792557,
0.08150741457939148,
0.16016852855682373,
0.12306094914674759,
-0.12302039563655853,
-0.03880351781845093,
-0.06309760361909866,
-0.1616823673248291,
0.09242071956396103,
-0.048797886818647385,
0.13325923681259155,
-0.07605315744876862,
0.028213316574692726,
0.03566569462418556,
-0.046251073479652405,
0.07341630011796951,
0.03253644332289696,
0.12173793464899063,
-0.043224576860666275,
0.01767292059957981,
0.12320835143327713,
-0.03387003764510155,
0.1836152821779251,
-0.14616040885448456,
0.09712337702512741,
-0.23492474853992462,
-0.05811136215925217,
-0.07503480464220047,
0.0032570173498243093,
-0.034672811627388,
-0.04619951546192169,
-0.07654773443937302,
0.03104906529188156,
-0.0038396758027374744,
-0.007020205724984407,
0.0446375235915184,
-0.030545098707079887,
-0.019755009561777115,
0.10683070123195648,
0.1047181487083435,
-0.020937321707606316,
-0.07052071392536163,
0.05468161031603813,
0.0504007413983345,
0.11517459154129028,
-0.1937192976474762,
0.030548814684152603,
0.10488912463188171,
0.015329780988395214,
0.12618933618068695,
0.044284239411354065,
-0.1041698157787323,
0.04282774403691292,
0.08763515204191208,
-0.07617872953414917,
-0.06464122235774994,
-0.02143372781574726,
-0.07972071319818497,
-0.0665665790438652,
0.05097291246056557,
0.09479986131191254,
-0.05139497295022011,
-0.019002649933099747,
-0.024534905329346657,
-0.019079482182860374,
-0.11301043629646301,
0.186431884765625,
0.07612547278404236,
0.08564707636833191,
-0.06610220670700073,
0.06400159746408463,
0.08457006514072418,
-0.0836351215839386,
0.0077377003617584705,
0.1873185783624649,
-0.10277801007032394,
-0.04796072095632553,
0.07278216630220413,
0.2222600281238556,
-0.027436457574367523,
-0.05928397923707962,
-0.14080747961997986,
-0.077125184237957,
0.031896136701107025,
0.16348321735858917,
0.10175041109323502,
0.09479958564043045,
-0.027121607214212418,
-0.0019160170340910554,
-0.1083836480975151,
0.09249646216630936,
0.06310120970010757,
0.049986161291599274,
-0.10631294548511505,
0.13186612725257874,
0.03911430016160011,
0.12142670899629593,
-0.02676241844892502,
-0.010907446965575218,
-0.13867062330245972,
0.06454332172870636,
-0.11255887895822525,
0.03467875346541405,
-0.008179934695363045,
0.05200931057333946,
-0.02393261156976223,
0.00259783654473722,
-0.031190337613224983,
0.06822595745325089,
-0.08178474754095078,
0.0014275535941123962,
0.004406132735311985,
0.05446489900350571,
-0.05095134675502777,
-0.01957034133374691,
0.032641202211380005,
-0.09246931225061417,
0.12326323240995407,
-0.03852270543575287,
-0.02878660522401333,
0.07961875945329666,
-0.04757547751069069,
0.04012811556458473,
0.014844408258795738,
0.04923771694302559,
0.02046947181224823,
0.014890575781464577,
0.07726211845874786,
0.03664602339267731,
0.05310554802417755,
0.025031356140971184,
0.11973720788955688,
-0.13935789465904236,
-0.08503083884716034,
-0.05483159050345421,
-0.1136004775762558,
-0.05680237337946892,
0.10047619044780731,
0.04742374271154404,
0.10458119958639145,
0.09102780371904373,
-0.03180244565010071,
0.011215021833777428,
-0.12622979283332825,
-0.06625088304281235,
0.02848641388118267,
-0.03015611506998539,
-0.08343155682086945,
-0.056023214012384415,
0.03754067420959473,
-0.03255271539092064,
0.12081488221883774,
0.018242867663502693,
0.03589305281639099,
-0.02045118808746338,
-0.060983989387750626,
-0.016121841967105865,
0.021368321031332016,
0.21326811611652374,
-0.08523987233638763,
0.042259518057107925,
0.00004779921437148005,
0.015978939831256866,
0.00810838583856821,
0.1184166893362999,
0.11822952330112457,
0.16525684297084808,
-0.0364285372197628,
0.10044536739587784,
0.01759527623653412,
-0.0011636296985670924,
-0.07532881200313568,
0.019435659050941467,
0.022024676203727722,
0.06312316656112671,
-0.04806511476635933,
0.1874152272939682,
0.09181362390518188,
-0.12268581241369247,
0.10978932678699493,
0.025414496660232544,
-0.13293461501598358,
-0.03313758969306946,
0.02335263416171074,
-0.036113761365413666,
-0.14742600917816162,
0.023418499156832695,
-0.12919175624847412,
-0.017013156786561012,
0.050024013966321945,
0.051550574600696564,
-0.07902666926383972,
0.1715245544910431,
0.0358104407787323,
-0.058131203055381775,
0.0549473911523819,
-0.0016451344126835465,
0.025785960257053375,
0.02187211625277996,
0.03513849526643753,
0.03612928465008736,
-0.03846859559416771,
0.03609820082783699,
0.02445223182439804,
-0.02415608800947666,
-0.01861872337758541,
-0.019948795437812805,
-0.003442120272666216,
-0.016198989003896713,
0.018808145076036453,
0.056662604212760925,
0.159557044506073,
0.036582738161087036,
-0.07379817217588425,
-0.01809350959956646,
0.17072468996047974,
-0.027121584862470627,
-0.09498025476932526,
-0.12651784718036652,
0.13201965391635895,
0.05172007530927658,
0.010870086960494518,
0.027035700157284737,
-0.08314243704080582,
-0.05344308540225029,
0.2082219123840332,
0.05308171361684799,
-0.03196471929550171,
-0.02392873726785183,
0.007169210817664862,
-0.0017653320683166385,
-0.04092777892947197,
0.20356537401676178,
0.022728078067302704,
0.22332549095153809,
0.023723866790533066,
-0.007037608418613672,
-0.06924150139093399,
-0.041004542261362076,
0.0032071529421955347,
0.11842309683561325,
-0.038379501551389694,
-0.03888537734746933,
-0.08348595350980759,
-0.003012163797393441,
-0.0008342273067682981,
-0.07991590350866318,
0.10142245143651962,
-0.1372438222169876,
-0.09882570058107376,
-0.04886016994714737,
0.05063706263899803,
-0.05985022336244583,
0.016563264653086662,
-0.025470277294516563,
0.0446193553507328,
0.07069329917430878,
-0.03286869823932648,
-0.10157738626003265,
-0.170670747756958,
0.09470922499895096,
-0.051429200917482376,
0.13228902220726013,
-0.0145228561013937,
0.1532781422138214,
0.08480212837457657,
0.026082023978233337,
-0.0644020214676857,
0.11496184021234512,
0.031456366181373596,
0.05794672295451164,
0.04817148670554161,
0.12280849367380142,
-0.050776466727256775,
0.13643553853034973,
-0.050452087074518204,
-0.030395952984690666,
-0.02921920269727707,
-0.07698126882314682,
-0.01874164491891861,
-0.16283760964870453,
-0.020148780196905136,
-0.09487061947584152,
0.09322638809680939,
0.19440878927707672,
-0.043825991451740265,
-0.030734295025467873,
-0.0931408628821373,
0.10946353524923325,
-0.012221835553646088,
0.06429854035377502,
-0.032959215342998505,
-0.17358072102069855,
0.0005498933023773134,
0.010584557428956032,
0.01421813853085041,
-0.27498123049736023,
-0.006278401706367731,
-0.040417157113552094,
-0.02869451604783535,
-0.08789606392383575,
0.15982159972190857,
0.0885312408208847,
0.049194421619176865,
-0.04073819890618324,
-0.161548912525177,
-0.03694755956530571,
0.058777566999197006,
-0.13822123408317566,
-0.14524413645267487
] |
null | null | transformers |
# CodeTrans model for code documentation generation java
Pretrained model on programming language java using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the java function/method.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# Note: AutoModelWithLMHead is deprecated in recent transformers releases;
# AutoModelForSeq2SeqLM is the modern equivalent for this T5 checkpoint.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_java_multitask_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_java_multitask_finetune", skip_special_tokens=True),
    device=0,  # first GPU; use device=-1 to run on CPU
)

tokenized_code = "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/java/base_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
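For readers without TPU access, a rough single-GPU analogue of one fine-tuning step is sketched below. The original run used the T5 training stack on a TPU Pod; the code/documentation pair and all hyperparameters other than the sequence length are stand-ins, not the authors' configuration:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
from transformers.optimization import Adafactor

name = "SEBIS/code_trans_t5_base_code_documentation_generation_java_multitask_finetune"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSeq2SeqLM.from_pretrained(name)
# Adafactor, mirroring the optimizer named above (these are default settings,
# not the reported pre-training configuration).
optimizer = Adafactor(model.parameters(), lr=None, relative_step=True, warmup_init=True)

code = "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"
doc = "Returns a function that casts its argument to the given class ."  # illustrative reference

inputs = tokenizer(code, max_length=512, truncation=True, return_tensors="pt")
labels = tokenizer(doc, max_length=512, truncation=True, return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss  # one supervised seq2seq step
loss.backward()
optimizer.step()
optimizer.zero_grad()
```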
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
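As a sketch of how BLEU scores like those in the table above can be computed, the snippet below runs sacrebleu on a toy hypothesis/reference pair. The exact BLEU settings (tokenization, smoothing) behind the table are not stated in this card, so this is illustrative only:
```python
import sacrebleu  # pip install sacrebleu

hypotheses = ["casts the given function to the target class ."]
references = [["cast function to a target class ."]]  # one reference stream, parallel to hypotheses

score = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {score.score:.2f}")
```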
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_java_multitask_finetune | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation java
======================================================
Pretrained model on programming language java using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the java function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
61,
88,
108
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.0875435620546341,
0.08141651004552841,
-0.0008901564870029688,
0.09888631850481033,
0.03720638155937195,
0.02414027228951454,
0.022098807618021965,
0.10695653408765793,
-0.028096502646803856,
0.058668091893196106,
0.06738431006669998,
-0.08258060365915298,
0.051521457731723785,
0.18190276622772217,
0.022874590009450912,
-0.1577356904745102,
-0.022099176421761513,
0.029071301221847534,
-0.049675244837999344,
0.1053246557712555,
0.08190523087978363,
-0.07999526709318161,
0.06949998438358307,
-0.043209247291088104,
-0.134763702750206,
0.05310754105448723,
-0.039225414395332336,
-0.023805255070328712,
0.09334231168031693,
0.06378041207790375,
0.11332995444536209,
-0.02075537107884884,
0.05692916736006737,
-0.21083268523216248,
0.003084182506427169,
0.024424687027931213,
0.06600337475538254,
0.034945059567689896,
0.04940299317240715,
0.07715380191802979,
0.12907548248767853,
-0.001225484418682754,
0.04537020996212959,
0.05477036163210869,
-0.06505519896745682,
-0.07806854695081711,
-0.06726040691137314,
0.06711797416210175,
0.07795772701501846,
0.09587807208299637,
-0.004583684727549553,
0.01215131115168333,
-0.07114855945110321,
0.09228669106960297,
0.12265842407941818,
-0.20454245805740356,
-0.02234375663101673,
0.1256343126296997,
0.09452889114618301,
0.04465333744883537,
-0.07980823516845703,
-0.03898776322603226,
0.10248562693595886,
0.04445900395512581,
0.06112457811832428,
-0.09619782865047455,
-0.07265875488519669,
-0.0015396008966490626,
-0.042331352829933167,
-0.04837382212281227,
0.16459688544273376,
0.03460408374667168,
-0.048416126519441605,
-0.10732114315032959,
-0.046197738498449326,
-0.19301585853099823,
0.04080532118678093,
0.01177568081766367,
0.017051221802830696,
-0.004199421498924494,
0.01993384212255478,
-0.02163650281727314,
-0.0931726023554802,
-0.1127549558877945,
0.023286236450076103,
0.023179052397608757,
0.05632863938808441,
0.03249315544962883,
-0.026058437302708626,
0.08353812992572784,
-0.008819404989480972,
-0.05196259915828705,
-0.019669247791171074,
0.01751522906124592,
-0.11176574975252151,
0.017150169238448143,
-0.012802299112081528,
-0.07030322402715683,
0.005862453021109104,
0.05847178027033806,
-0.11838806420564651,
0.0814395397901535,
0.0912976786494255,
0.014684596098959446,
0.01878594234585762,
0.21866044402122498,
0.038442213088274,
-0.1544256955385208,
0.0205767210572958,
0.01811967045068741,
-0.0019456925801932812,
0.006223043892532587,
-0.05099482089281082,
-0.03949819505214691,
0.020022153854370117,
0.06884521245956421,
-0.1232624351978302,
0.015706507489085197,
-0.04640747234225273,
-0.005778265651315451,
0.08300811052322388,
-0.12880872189998627,
0.03843969851732254,
0.009572255425155163,
-0.04406577721238136,
-0.037616487592458725,
0.08492802828550339,
-0.12085956335067749,
-0.11596537381410599,
0.03858189284801483,
-0.04360503703355789,
-0.03627791628241539,
-0.12044235318899155,
-0.10479488968849182,
0.001532590831629932,
-0.024438323453068733,
-0.003276645904406905,
-0.08735395222902298,
-0.10139426589012146,
-0.025628861039876938,
0.04284386709332466,
-0.007093669380992651,
-0.02666372060775757,
-0.036121007055044174,
0.0024376516230404377,
-0.00555493775755167,
-0.023646505549550056,
0.04444899782538414,
-0.03049100935459137,
0.08961410075426102,
0.07032596319913864,
0.03890892490744591,
-0.00644752848893404,
0.03346337378025055,
-0.0891975611448288,
0.08448366820812225,
-0.09323849529027939,
0.053558725863695145,
-0.014486797153949738,
0.06340601295232773,
-0.09900689870119095,
-0.07545578479766846,
0.009804797358810902,
0.05438623204827309,
0.0606626532971859,
0.018148144707083702,
-0.12888145446777344,
0.030904250219464302,
0.14772403240203857,
-0.11729560047388077,
-0.13451553881168365,
0.09697473049163818,
-0.006935774814337492,
0.029653236269950867,
0.05870987847447395,
0.1328798085451126,
0.1366286724805832,
-0.07175835967063904,
-0.01754371076822281,
0.07223718613386154,
0.06371985375881195,
-0.06531377136707306,
0.06643512845039368,
0.012615647166967392,
0.011131791397929192,
0.038611602038145065,
0.04247952625155449,
0.054086048156023026,
0.005914498120546341,
-0.03267943114042282,
-0.0410030223429203,
-0.08474276214838028,
-0.07448557764291763,
-0.006223094649612904,
… (remaining values of this 768-dimensional embedding vector elided) …
] |
null | null | transformers |
# CodeTrans model for code documentation generation java
Pretrained model on programming language java using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the java function/method.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_java_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_java_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/java/base_model.ipynb).
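Because performance should be better on tokenized input, raw Java source may benefit from being split into whitespace-separated lexical tokens first, matching the format of the `tokenized_code` example above. Below is a minimal sketch of such a pre-tokenizer using a hand-rolled regex lexer; this is not the tokenizer used by the CodeTrans authors (a full Java lexer such as the `javalang` package would be more faithful), and the helper name `rough_tokenize` is our own:
```python
import re

# Rough lexer: identifiers/keywords, numeric and string literals,
# common multi-character operators, then any remaining single symbol.
_TOKEN_RE = re.compile(
    r"[A-Za-z_$][A-Za-z_$0-9]*"         # identifiers and keywords
    r"|\d+(?:\.\d+)?[fFlLdD]?"          # numeric literals
    r'|"(?:\\.|[^"\\])*"'               # string literals
    r"|'(?:\\.|[^'\\])*'"               # char literals
    r"|==|!=|<=|>=|&&|\|\||\+\+|--|->"  # common multi-char operators
    r"|\S"                              # any other single symbol
)

def rough_tokenize(java_source: str) -> str:
    """Space-join lexical tokens so raw Java matches the tokenized format."""
    return " ".join(_TOKEN_RE.findall(java_source))

raw_code = "public int add(int a, int b) { return a + b; }"
pipeline([rough_tokenize(raw_code)])
```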
## Training data
The datasets for the supervised training tasks can be downloaded from [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
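For illustration, an inverse square root schedule holds the learning rate near a peak value during warm-up and then decays it proportionally to 1/√step. A minimal sketch follows; the warm-up length and peak rate here are assumptions for illustration, not the exact values used for CodeTrans:
```python
import math

def inverse_sqrt_lr(step: int, peak_lr: float = 0.01, warmup_steps: int = 10_000) -> float:
    """Hold peak_lr for warmup_steps, then decay as peak_lr * sqrt(warmup/step)."""
    if step < warmup_steps:
        return peak_lr
    return peak_lr * math.sqrt(warmup_steps / step)

# Learning rate at a few points across the half-million pre-training steps
for s in (1_000, 10_000, 100_000, 500_000):
    print(s, round(inverse_sqrt_lr(s), 5))
```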
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
## Evaluation results
For the code documentation tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
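For readers who want a comparable sentence-level score on their own outputs, a smoothed BLEU can be computed with NLTK; note this is only illustrative, since the exact BLEU variant and smoothing used to produce the table above are not specified in this card:
```python
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = "casts the given class to a function".split()
hypothesis = "cast the class to a function".split()

smooth = SmoothingFunction().method4  # avoids zero scores on short sentences
score = sentence_bleu([reference], hypothesis, smoothing_function=smooth)
print(f"BLEU: {100 * score:.2f}")
```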
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_java_transfer_learning_finetune | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation java
======================================================
Pretrained model on programming language java using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the java function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The datasets for the supervised training tasks can be downloaded from Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
Evaluation results
------------------
For the code documentation tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
61,
87,
108
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
… (768-dimensional embedding vector values elided) …
] |
null | null | transformers |
# CodeTrans model for code documentation generation javascript
Pretrained model on programming language javascript using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used single-task training on the CodeSearchNet Corpus javascript dataset.
## Intended uses & limitations
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_javascript"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_javascript", skip_special_tokens=True),
device=0
)
tokenized_code = "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/function%20documentation%20generation/javascript/base_model.ipynb).
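On recent transformers releases `AutoModelWithLMHead` is deprecated; the same generation can be run explicitly with `AutoModelForSeq2SeqLM`. A sketch follows, where the generation parameters (`max_length`, `num_beams`) are illustrative defaults rather than the settings used to produce the scores below:
```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

name = "SEBIS/code_trans_t5_base_code_documentation_generation_javascript"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSeq2SeqLM.from_pretrained(name)

inputs = tokenizer(tokenized_code, return_tensors="pt")  # tokenized_code from above
with torch.no_grad():
    output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```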
## Training data
The datasets for the supervised training tasks can be downloaded from [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the code documentation tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_javascript | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation javascript
============================================================
Pretrained model on programming language javascript using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used single-task training on the CodeSearchNet Corpus javascript dataset.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The datasets for the supervised training tasks can be downloaded from Link
Evaluation results
------------------
For the code documentation tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
111
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
… (768-dimensional embedding vector values elided) …
-0.20237988233566284,
0.0051552350632846355,
-0.04932582005858421,
-0.011834883131086826,
-0.04681479558348656,
0.1109357699751854,
0.11702036112546921,
0.034412093460559845,
-0.01893928274512291,
-0.14149180054664612,
-0.0019234437495470047,
0.08458065986633301,
-0.13082818686962128,
-0.15198442339897156
] |
null | null | transformers |
# CodeTrans model for code documentation generation javascript
Pretrained model on programming language javascript using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model can be used to generate descriptions for javascript functions, or it can be fine-tuned on other javascript code tasks. It works on unparsed and untokenized javascript code, but performance should be better if the javascript code is tokenized.
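Because tokenized input tends to score better, it can help to pre-tokenize raw source before feeding it to the model. The helper below is a minimal heuristic sketch (a simple regex splitter, not the actual tokenizer used to build the CodeTrans training data), shown only to illustrate the expected space-separated format:

```python
import re

_TOKEN_RE = re.compile(r"""[A-Za-z_$][A-Za-z0-9_$]*|\d+|'[^']*'|"[^"]*"|[^\sA-Za-z0-9_$]""")

def pretokenize_js(source: str) -> str:
    # Split identifiers, numbers, string literals, and single punctuation
    # characters into space-separated tokens, roughly matching the format of
    # the tokenized functions in the examples below. Multi-character operators
    # such as `!==` are split into single characters, so this is only an
    # approximation of the original tokenization.
    return " ".join(_TOKEN_RE.findall(source))

print(pretokenize_js("function add(a, b) { return a + b; }"))
# function add ( a , b ) { return a + b ; }
```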
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_javascript_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_javascript_multitask", skip_special_tokens=True),
    device=0
)
tokenized_code = "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/javascript/base_model.ipynb).
## Training data
The datasets for the supervised training tasks can be downloaded from [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 440,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
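As a reference for the schedule named above, here is a compact sketch of the T5-style inverse square root rule; the 10,000-step warmup constant is an illustrative assumption, since the exact CodeTrans hyperparameters are not given in this card:

```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # T5-style rule: the rate is held at 1/sqrt(warmup_steps) during warmup,
    # then decays proportionally to 1/sqrt(step).
    return 1.0 / math.sqrt(max(step, warmup_steps))

print(inverse_sqrt_lr(1_000))    # 0.01   (still in warmup)
print(inverse_sqrt_lr(440_000))  # ~0.0015 (near the end of pre-training)
```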
## Evaluation results
For the code documentation tasks, the different models achieve the following results on the different programming languages (BLEU scores):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
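To evaluate your own outputs with the same family of metric, a corpus-level BLEU can be computed as sketched below; the example strings are made up, and the exact BLEU configuration behind the published numbers is not specified in this card, so this reproduces the metric, not the table:

```python
# pip install sacrebleu
import sacrebleu

# Hypothetical generated docstrings and their references.
hypotheses = ["returns true if running in a standard browser environment"]
references = [["determine if we're running in a standard browser env"]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")
```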
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_javascript_multitask | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation javascript
============================================================
Pretrained model on programming language javascript using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model can be used to generate descriptions for javascript functions, or it can be fine-tuned on other javascript code tasks. It works on unparsed and untokenized javascript code, but performance should be better if the javascript code is tokenized.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The datasets for the supervised training tasks can be downloaded from Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 440,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation tasks, the different models achieve the following results on the different programming languages (BLEU scores):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 440,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 440,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
60,
143
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 440,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.1461721956729889,
-0.03475102782249451,
-0.00047154398635029793,
0.12854890525341034,
0.1260223686695099,
0.027768950909376144,
0.0506826750934124,
0.06339214742183685,
-0.030992712825536728,
0.025279752910137177,
0.05387420207262039,
-0.0011188369244337082,
0.029781730845570564,
0.19196517765522003,
0.013504191301763058,
-0.09143732488155365,
-0.04160836338996887,
0.0451350212097168,
-0.04259480535984039,
0.13597270846366882,
0.07617902010679245,
-0.06642153859138489,
0.05805712938308716,
-0.05708429589867592,
-0.2503727078437805,
0.06198602914810181,
0.01819002442061901,
-0.06109537556767464,
0.09072558581829071,
0.050750233232975006,
0.12316081672906876,
-0.003764547873288393,
0.015296739526093006,
-0.14178267121315002,
0.01300688274204731,
0.009925874881446362,
0.02652878500521183,
0.014486592262983322,
0.04110051319003105,
0.020611125975847244,
0.14765632152557373,
-0.0022159034851938486,
0.02922898717224598,
0.07659286260604858,
-0.0762651264667511,
-0.10712222009897232,
-0.002414089860394597,
0.00044055827311240137,
0.04888107627630234,
0.11320357769727707,
-0.020669911056756973,
0.12190251797437668,
-0.15262873470783234,
0.13061252236366272,
0.09673438221216202,
-0.22728726267814636,
-0.018183834850788116,
0.09707921743392944,
0.07695125043392181,
0.06825920939445496,
-0.05231228470802307,
-0.05360547453165054,
0.10786276310682297,
0.04434598609805107,
0.035304490476846695,
-0.09704745560884476,
-0.07729808241128922,
0.010888863354921341,
-0.07083643227815628,
-0.07024460285902023,
0.22698844969272614,
0.021687570959329605,
-0.08122671395540237,
-0.04941164329648018,
-0.02989129163324833,
-0.12059588730335236,
0.026451140642166138,
0.04715574532747269,
-0.0012235726462677121,
-0.03437728062272072,
-0.02826443687081337,
0.031182629987597466,
-0.0742793157696724,
-0.14581169188022614,
0.02298690751194954,
0.10000603646039963,
0.04549986869096756,
0.01945890672504902,
-0.11004927009344101,
0.10638551414012909,
0.048894383013248444,
-0.06949573755264282,
-0.025436555966734886,
-0.014061801135540009,
-0.09354002773761749,
0.021316856145858765,
-0.04997580125927925,
-0.16796673834323883,
0.030347494408488274,
0.03431946039199829,
-0.02987097203731537,
0.05733228102326393,
0.02687677927315235,
0.035814203321933746,
0.006318363826721907,
0.2266044318675995,
0.08229370415210724,
-0.14574748277664185,
0.06458503752946854,
0.06152109429240227,
-0.02314034104347229,
-0.0017054487252607942,
-0.07897701859474182,
-0.09802708774805069,
0.0991801768541336,
0.10719866305589676,
-0.11731161177158356,
0.039361633360385895,
-0.0708775520324707,
-0.028008397668600082,
0.014534499496221542,
-0.15598797798156738,
0.00631177332252264,
0.042635366320610046,
-0.08035300672054291,
-0.043599750846624374,
0.100400410592556,
-0.17660081386566162,
-0.14507701992988586,
-0.0665535107254982,
-0.07267989218235016,
-0.02705519273877144,
-0.16487182676792145,
-0.15240263938903809,
-0.007163296919316053,
-0.01949448697268963,
0.02908111736178398,
-0.11218826472759247,
-0.1284104883670807,
-0.037363450974226,
0.012246674858033657,
0.024244172498583794,
0.005793984048068523,
-0.10045980662107468,
-0.01774347946047783,
-0.016754215583205223,
-0.037659790366888046,
0.00203526159748435,
-0.04513508453965187,
0.12981878221035004,
0.12131238728761673,
0.044319964945316315,
-0.012499315664172173,
0.05633886530995369,
-0.09365914762020111,
0.06407555192708969,
-0.10961198806762695,
0.10685990750789642,
-0.030440568923950195,
0.08257568627595901,
-0.03646183758974075,
-0.10912083089351654,
0.05937125161290169,
0.05927133187651634,
0.07651155441999435,
0.06635575741529465,
-0.11367384344339371,
-0.05078251287341118,
0.18451404571533203,
-0.10265099257230759,
-0.15023626387119293,
0.10430142283439636,
-0.03403635323047638,
0.06968611478805542,
0.10080492496490479,
0.14222240447998047,
0.15562303364276886,
-0.04240711033344269,
0.001454632030799985,
0.05603957176208496,
0.03758680075407028,
-0.1287815123796463,
0.07985710352659225,
0.047491349279880524,
-0.09209182113409042,
0.05597257614135742,
-0.01391853578388691,
0.13492590188980103,
-0.019778508692979813,
-0.02151540108025074,
-0.04885626584291458,
-0.0835537314414978,
0.025465084239840508,
0.02483857236802578,
0.06103501841425896,
-0.08463243395090103,
-0.08414696902036667,
0.0847615972161293,
0.16260962188243866,
-0.1347140073776245,
-0.002241812413558364,
-0.09284760057926178,
0.05766257271170616,
-0.08737632632255554,
0.02904312126338482,
-0.16447022557258606,
0.0048340787179768085,
0.06458953022956848,
-0.01763400435447693,
0.06829731911420822,
0.10398454964160919,
0.027923358604311943,
0.03659652918577194,
0.0018681121291592717,
-0.03362446278333664,
-0.12527982890605927,
-0.06804122775793076,
-0.05722467973828316,
-0.06354771554470062,
-0.09254476428031921,
-0.05975981056690216,
0.007640114985406399,
-0.1914229542016983,
0.013680566102266312,
-0.004905582405626774,
-0.012898257933557034,
0.02510015480220318,
-0.010456928983330727,
0.02523038722574711,
0.0714564174413681,
-0.06395553052425385,
-0.03862437233328819,
0.03820809721946716,
0.02142629772424698,
-0.05787035450339317,
-0.085011325776577,
-0.09441391378641129,
0.0024167164228856564,
0.12332287430763245,
0.020737146958708763,
-0.08279620110988617,
0.017573116347193718,
-0.01763784885406494,
-0.03473915904760361,
0.018869096413254738,
-0.0770917758345604,
0.16805502772331238,
-0.01060513686388731,
0.20341475307941437,
-0.15619629621505737,
-0.03601137921214104,
-0.027754679322242737,
0.020065587013959885,
0.05941677838563919,
0.13571226596832275,
-0.013358104974031448,
-0.07697484642267227,
0.05275005102157593,
0.03782493993639946,
-0.09380803257226944,
0.2486809492111206,
-0.05453982576727867,
-0.08816292881965637,
0.03485263139009476,
0.10450764745473862,
-0.01712009310722351,
0.15371331572532654,
-0.224626824259758,
-0.04581107571721077,
0.0013170383172109723,
-0.008410261012613773,
0.06326591223478317,
-0.12328179180622101,
0.007946568541228771,
0.011381460353732109,
-0.06927451491355896,
-0.09449739754199982,
-0.004135899245738983,
-0.007759936153888702,
0.04141071066260338,
-0.0026145828887820244,
-0.04100359231233597,
0.021065637469291687,
-0.03870460018515587,
-0.12518049776554108,
0.21975122392177582,
-0.08371220529079437,
-0.1898857057094574,
-0.18854664266109467,
0.11951571702957153,
-0.0684504434466362,
-0.016683561727404594,
0.03311201557517052,
-0.09301640838384628,
-0.03013494983315468,
-0.05304236710071564,
0.18811434507369995,
-0.06871245801448822,
-0.005169150419533253,
-0.017092356458306313,
0.0692482441663742,
0.011858063749969006,
-0.2053203284740448,
0.027780290693044662,
-0.01678917184472084,
-0.03214547783136368,
-0.00427668821066618,
-0.11477140337228775,
0.11347932368516922,
0.1753545105457306,
-0.07189479470252991,
0.025221215561032295,
-0.005150096956640482,
0.19542470574378967,
-0.04970683157444,
-0.05208437144756317,
0.15419402718544006,
-0.0050301202572882175,
-0.010157288052141666,
0.002972444985061884,
-0.017180627211928368,
-0.08861254900693893,
0.06991234421730042,
-0.019378507509827614,
-0.0319322906434536,
-0.2641160190105438,
-0.015987742692232132,
-0.07825858891010284,
0.0404512882232666,
0.043231215327978134,
0.026459679007530212,
-0.1071675568819046,
0.027425246313214302,
0.04718481004238129,
0.13834410905838013,
-0.0002022436383413151,
0.051536671817302704,
0.05376395583152771,
0.015689406543970108,
0.013645052909851074,
-0.09999915212392807,
-0.002092595212161541,
0.05968663468956947,
0.08192137628793716,
0.2682431638240814,
-0.0995749980211258,
0.18846212327480316,
0.04180744290351868,
0.043803825974464417,
0.035080697387456894,
0.12846940755844116,
-0.10717278718948364,
0.03306670859456062,
0.016190484166145325,
-0.008245101198554039,
-0.11926855891942978,
0.00842899177223444,
-0.04022552818059921,
0.09363511204719543,
-0.1359400451183319,
-0.03737657889723778,
0.02357105165719986,
0.12869228422641754,
0.06111076846718788,
-0.24311046302318573,
-0.13488468527793884,
0.014989957213401794,
-0.08724577724933624,
-0.09635018557310104,
0.07105748355388641,
0.2163223922252655,
-0.07447795569896698,
-0.022545546293258667,
-0.005091178230941296,
0.12990044057369232,
-0.021669482812285423,
-0.023558706045150757,
-0.023666242137551308,
0.06294839829206467,
0.014106827788054943,
0.12831832468509674,
-0.2943068742752075,
0.12557020783424377,
-0.0010069707641378045,
0.08272699266672134,
-0.029920663684606552,
0.04580293595790863,
-0.016707953065633774,
0.0737747922539711,
0.039577990770339966,
-0.014651410281658173,
0.051512181758880615,
-0.17193834483623505,
0.021003518253564835,
0.04676037281751633,
0.030894380062818527,
0.05432292819023132,
0.07784704118967056,
-0.0018430688651278615,
0.04640679806470871,
-0.01791948825120926,
-0.13604119420051575,
-0.09021242707967758,
-0.0604281984269619,
-0.03930053487420082,
-0.040624212473630905,
-0.01533300057053566,
-0.03621219843626022,
-0.00509948655962944,
0.06777592748403549,
0.17882457375526428,
-0.08558493852615356,
-0.084000363945961,
-0.06278518587350845,
0.06329833716154099,
0.08576558530330658,
-0.09956555813550949,
0.03771122917532921,
-0.006986661348491907,
0.02770116925239563,
-0.0061891586519777775,
-0.09383438527584076,
0.056560542434453964,
-0.04193832725286484,
-0.05859258398413658,
-0.006801696959882975,
0.08626345545053482,
0.0010684074368327856,
0.035073503851890564,
0.02023569494485855,
-0.10053718835115433,
-0.042172715067863464,
-0.11993855237960815,
-0.1080523207783699,
-0.06014971807599068,
0.007263874169439077,
0.05045229569077492,
-0.12736958265304565,
-0.07520075887441635,
-0.015096450224518776,
-0.008100497536361217,
0.1346835345029831,
0.15475858747959137,
-0.05883351340889931,
-0.007692589890211821,
0.1315516084432602,
-0.052157092839479446,
-0.19252964854240417,
0.050248097628355026,
0.04674241691827774,
0.11717895418405533,
-0.05488608404994011,
-0.17135971784591675,
0.04511537030339241,
-0.010175751522183418,
0.03907515108585358,
0.0687677189707756,
-0.32317519187927246,
-0.12171182781457901,
0.08847954124212265,
0.14539819955825806,
0.1355283558368683,
-0.1267443597316742,
-0.03461812064051628,
-0.06110536307096481,
-0.12922275066375732,
0.07389985024929047,
-0.08751939982175827,
0.12588311731815338,
-0.06760523468255997,
0.024300506338477135,
0.03730512037873268,
-0.044608522206544876,
0.0629410669207573,
0.04583493247628212,
0.11715315282344818,
-0.03167051449418068,
0.0018279412761330605,
0.13990488648414612,
-0.04168831184506416,
0.1764182597398758,
-0.14260590076446533,
0.09491229802370071,
-0.22757549583911896,
-0.0536988265812397,
-0.08741962164640427,
0.02225222997367382,
-0.029666472226381302,
-0.033249545842409134,
-0.08229239284992218,
0.01575634442269802,
-0.011391396634280682,
0.006482141092419624,
0.03194412216544151,
-0.037172894924879074,
-0.02429010346531868,
0.09085388481616974,
0.13317254185676575,
-0.010680422186851501,
-0.0664016604423523,
0.06568486988544464,
0.04469122737646103,
0.11017193645238876,
-0.19958534836769104,
0.027400517836213112,
0.1116400808095932,
0.026790568605065346,
0.11540678888559341,
0.047575488686561584,
-0.10426714271306992,
0.06396394222974777,
0.0880155861377716,
-0.0601700097322464,
-0.06501705199480057,
-0.04160083830356598,
-0.121307872235775,
-0.08600084483623505,
0.050911109894514084,
0.09831006079912186,
-0.02943728119134903,
-0.013731295242905617,
-0.033411432057619095,
-0.03204610198736191,
-0.11591845750808716,
0.17605966329574585,
0.08175894618034363,
0.07720911502838135,
-0.06303757429122925,
0.050691016018390656,
0.06300922483205795,
-0.07761278748512268,
0.006824926473200321,
0.16915598511695862,
-0.10085216164588928,
-0.048615917563438416,
0.05898603796958923,
0.23736345767974854,
-0.03445018082857132,
-0.05480821430683136,
-0.13977983593940735,
-0.07401962578296661,
0.019025210291147232,
0.15718534588813782,
0.1035449430346489,
0.08167567104101181,
-0.02016509510576725,
0.006620385218411684,
-0.1145528107881546,
0.09054671972990036,
0.0758499801158905,
0.031668007373809814,
-0.09724721312522888,
0.14709220826625824,
0.04518691822886467,
0.12541674077510834,
-0.027523517608642578,
-0.014888289384543896,
-0.13639315962791443,
0.07555737346410751,
-0.09539835155010223,
0.028552722185850143,
-0.01593981496989727,
0.052397362887859344,
-0.033383242785930634,
0.0030537762213498354,
-0.04413077235221863,
0.06324443966150284,
-0.08257748186588287,
0.0006420640856958926,
0.018123578280210495,
0.05301346629858017,
-0.060759007930755615,
-0.012950584292411804,
0.02038763463497162,
-0.0988578051328659,
0.12576837837696075,
-0.02318238653242588,
-0.021909482777118683,
0.0973418578505516,
-0.06437066197395325,
0.022774139419198036,
0.017331425100564957,
0.04731133207678795,
0.007391284219920635,
0.033058684319257736,
0.09050800651311874,
0.03949273005127907,
0.06081200763583183,
0.024320611730217934,
0.11167654395103455,
-0.1267164647579193,
-0.06799270957708359,
-0.04809236153960228,
-0.1064857617020607,
-0.06341984122991562,
0.11016752570867538,
0.032430294901132584,
0.09771550446748734,
0.10592526197433472,
-0.039769336581230164,
0.018875161185860634,
-0.1518322080373764,
-0.06490090489387512,
0.027459463104605675,
-0.020381610840559006,
-0.09467986226081848,
-0.06051938608288765,
0.05281451344490051,
-0.034233227372169495,
0.10619176179170609,
0.021056344732642174,
0.0604327954351902,
-0.024219216778874397,
-0.0362495556473732,
0.008202091790735722,
0.011963078752160072,
0.18992485105991364,
-0.08226322382688522,
0.04230941832065582,
0.004338833969086409,
0.020265482366085052,
0.03167346492409706,
0.11845024675130844,
0.1454741358757019,
0.14729617536067963,
-0.03490286320447922,
0.1193094253540039,
-0.001331716077402234,
-0.003936673514544964,
-0.08902934193611145,
0.0018201654311269522,
0.020140046253800392,
0.059248365461826324,
-0.041998110711574554,
0.17366187274456024,
0.10824865102767944,
-0.1065104529261589,
0.08903609216213226,
0.025890175253152847,
-0.12816202640533447,
-0.04488939419388771,
0.03014037385582924,
-0.03813565894961357,
-0.14818175137043,
0.03651631623506546,
-0.11352873593568802,
-0.03453434631228447,
0.03378213196992874,
0.03824051097035408,
-0.08323030918836594,
0.19255822896957397,
0.04083004593849182,
-0.07128029316663742,
0.06839360296726227,
-0.015887031331658363,
0.022806372493505478,
0.04220728948712349,
0.03338240832090378,
0.03669949620962143,
-0.058421894907951355,
0.04510090872645378,
0.0284119900316,
-0.04773577302694321,
-0.0028581528458744287,
-0.025532463565468788,
-0.015066913329064846,
-0.02513285167515278,
0.040829237550497055,
0.06397855281829834,
0.16147540509700775,
0.03745589777827263,
-0.06925810128450394,
-0.022347623482346535,
0.14732596278190613,
-0.023353932425379753,
-0.08305123448371887,
-0.12412385642528534,
0.13083511590957642,
0.04699990898370743,
0.004787911660969257,
0.012320185080170631,
-0.09510859102010727,
-0.03378831595182419,
0.21380867063999176,
0.06538640707731247,
-0.02588542364537716,
-0.019459731876850128,
-0.009759264998137951,
-0.0018138287123292685,
-0.029431229457259178,
0.2100636512041092,
0.01900200918316841,
0.21420204639434814,
0.01967708393931389,
-0.029943648725748062,
-0.07353822886943817,
-0.0384088009595871,
0.02410438098013401,
0.12824037671089172,
-0.03159660845994949,
-0.035410527139902115,
-0.08195573836565018,
0.012466924265027046,
-0.0072531988844275475,
-0.06224536523222923,
0.10113722085952759,
-0.13427378237247467,
-0.08142920583486557,
-0.03785472363233566,
0.037746530026197433,
-0.038410138338804245,
0.03927929699420929,
-0.033587321639060974,
0.030235664919018745,
0.06625133752822876,
-0.03770500421524048,
-0.12252343446016312,
-0.15989287197589874,
0.08275885134935379,
-0.06367228925228119,
0.1375512182712555,
-0.017415110021829605,
0.1607178896665573,
0.09390483051538467,
0.03678538650274277,
-0.051585983484983444,
0.11324840784072876,
0.03764181584119797,
0.06889589875936508,
0.06553928554058075,
0.10704399645328522,
-0.04446743428707123,
0.1439579874277115,
-0.04450453072786331,
-0.011563403531908989,
-0.011793947778642178,
-0.08030911535024643,
-0.02745916321873665,
-0.18472938239574432,
-0.01899777725338936,
-0.09651563316583633,
0.09246350079774857,
0.1841532588005066,
-0.04596111178398132,
-0.027706412598490715,
-0.08190211653709412,
0.1011669710278511,
0.0033532509114593267,
0.0764496847987175,
-0.048698924481868744,
-0.16741864383220673,
-0.000676899217069149,
0.005436685401946306,
0.0016830224776640534,
-0.2719140648841858,
0.0016514805611222982,
-0.0527130588889122,
-0.027883380651474,
-0.09458748251199722,
0.16331905126571655,
0.07042437046766281,
0.04042552039027214,
-0.03673728555440903,
-0.13312307000160217,
-0.029116082936525345,
0.0750172957777977,
-0.16127954423427582,
-0.14774511754512787
] |
null | null | transformers |
# CodeTrans model for code documentation generation javascript
Pretrained model on programming language javascript using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the javascript function/method.
## Intended uses & limitations
The model can be used to generate descriptions for javascript functions, or it can be fine-tuned on other javascript code tasks. It works on unparsed and untokenized javascript code, but performance should be better if the javascript code is tokenized.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_javascript_multitask_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_javascript_multitask_finetune", skip_special_tokens=True),
    device=0
)
tokenized_code = "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/javascript/base_model.ipynb).
## Training data
The datasets for the supervised training tasks can be downloaded from [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing javascript code.
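For readers who want to continue fine-tuning this checkpoint on their own javascript data, the loop below is a minimal single-device sketch; the example pair, sequence lengths, and Adafactor settings are illustrative assumptions and do not reproduce the TPU Pod setup described above:

```python
from transformers import Adafactor, AutoModelWithLMHead, AutoTokenizer

name = "SEBIS/code_trans_t5_base_code_documentation_generation_javascript_multitask_finetune"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelWithLMHead.from_pretrained(name)
model.train()

# relative_step/warmup_init let Adafactor manage its own learning rate.
optimizer = Adafactor(model.parameters(), lr=None, relative_step=True, warmup_init=True)

code = "function add ( a , b ) { return a + b ; }"  # tokenized javascript source
doc = "Add two numbers."                            # target documentation
inputs = tokenizer(code, max_length=512, truncation=True, return_tensors="pt")
labels = tokenizer(doc, max_length=512, truncation=True, return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss  # seq2seq cross-entropy loss
loss.backward()
optimizer.step()
optimizer.zero_grad()
```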
## Evaluation results
For the code documentation tasks, the different models achieve the following results on the different programming languages (BLEU scores):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_javascript_multitask_finetune | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation javascript
============================================================
Pretrained model on programming language javascript using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the javascript function/method.
Intended uses & limitations
---------------------------
The model can be used to generate descriptions for javascript functions, or it can be fine-tuned on other javascript code tasks. It works on unparsed and untokenized javascript code, but performance should be better if the javascript code is tokenized.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The datasets for the supervised training tasks can be downloaded from Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing javascript code.
Evaluation results
------------------
For the code documentation tasks, the different models achieve the following results on the different programming languages (BLEU scores):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
60,
88,
107
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.11578662693500519,
0.03386032581329346,
-0.0014835594920441508,
0.10240959376096725,
0.04707440361380577,
0.01928182877600193,
0.05507291108369827,
0.09434618055820465,
0.0033777952194213867,
0.07039493322372437,
0.04608161374926567,
-0.06705866754055023,
0.058717332780361176,
0.19460755586624146,
0.02356255054473877,
-0.11412044614553452,
-0.04277028515934944,
0.037433501332998276,
-0.052231092005968094,
0.10859408229589462,
0.06625117361545563,
-0.09123820811510086,
0.07310743629932404,
-0.03171733021736145,
-0.13579556345939636,
0.03268890082836151,
-0.01445557177066803,
-0.01334783062338829,
0.0914013683795929,
0.051454946398735046,
0.11504905670881271,
-0.009753153659403324,
0.05763337016105652,
-0.1872175633907318,
0.0041305748745799065,
0.02467978186905384,
0.06941615790128708,
0.040961071848869324,
0.05578415095806122,
0.07147307693958282,
0.0871814712882042,
-0.02694629691541195,
0.0292253028601408,
0.06306194514036179,
-0.0665016919374466,
-0.02665698528289795,
-0.08110632747411728,
0.0504632331430912,
0.09296819567680359,
0.0960640013217926,
-0.010721421800553799,
0.04235215112566948,
-0.09753874689340591,
0.08262433856725693,
0.12253039330244064,
-0.23030808568000793,
-0.021467076614499092,
0.08464183658361435,
0.09898976981639862,
0.02318493276834488,
-0.08058016747236252,
-0.03825630247592926,
0.10545860230922699,
0.0415860153734684,
0.05530451238155365,
-0.09495986253023148,
-0.006463299039751291,
-0.014443622902035713,
-0.04561534523963928,
-0.04295516759157181,
0.17268683016300201,
0.06286168843507767,
-0.062142714858055115,
-0.10251875221729279,
-0.040489837527275085,
-0.16517606377601624,
0.03227746859192848,
0.02633778005838394,
-0.0065965778194367886,
-0.016320964321494102,
-0.017844853922724724,
-0.003961081616580486,
-0.09133758395910263,
-0.11183976382017136,
0.03142183646559715,
0.010081617161631584,
0.053164366632699966,
0.028003711253404617,
-0.05338237062096596,
0.0894988402724266,
0.04000437632203102,
-0.059053465723991394,
-0.012652185745537281,
0.01026084739714861,
-0.11751744896173477,
-0.006209040526300669,
0.0024899241980165243,
-0.06128188222646713,
0.007947283796966076,
0.07275222986936569,
-0.07086526602506638,
0.08583149313926697,
0.07922635227441788,
0.020282121375203133,
0.01221687626093626,
0.21886053681373596,
0.07097224146127701,
-0.16216284036636353,
0.029367245733737946,
0.04705854132771492,
0.0050178831443190575,
0.013930968940258026,
-0.055152639746665955,
-0.0503680557012558,
0.019846461713314056,
0.07040037959814072,
-0.11816046386957169,
0.02331988885998726,
-0.06834588944911957,
-0.011908259242773056,
0.10375208407640457,
-0.1269547939300537,
0.036198802292346954,
0.034191302955150604,
-0.06309664249420166,
-0.030633587390184402,
0.08065468072891235,
-0.1317140758037567,
-0.11423664540052414,
0.004754324443638325,
-0.04775170236825943,
-0.020834460854530334,
-0.11772218346595764,
-0.10325739532709122,
0.001965645933523774,
-0.026504265144467354,
0.0005156481638550758,
-0.09818762540817261,
-0.08509121090173721,
-0.032047562301158905,
0.03082314506173134,
0.007782320957630873,
-0.02436850219964981,
-0.056949276477098465,
0.0010639865649864078,
-0.008672317489981651,
-0.026422223076224327,
0.010809551924467087,
-0.026272360235452652,
0.09376125782728195,
0.08533963561058044,
0.04306051880121231,
0.006822218652814627,
0.023828282952308655,
-0.09198682010173798,
0.09309668093919754,
-0.12346059828996658,
0.0716816708445549,
0.012222354300320148,
0.06363306939601898,
-0.0973769873380661,
-0.06787633895874023,
-0.010043838061392307,
0.04291369020938873,
0.07723473757505417,
0.05549163371324539,
-0.1233283206820488,
0.013235699385404587,
0.15273213386535645,
-0.08702194690704346,
-0.14777223765850067,
0.11303676664829254,
-0.014582531526684761,
0.03128456324338913,
0.0772988572716713,
0.13350528478622437,
0.1562114804983139,
-0.09304621070623398,
-0.043694619089365005,
0.08060210943222046,
0.046876706182956696,
-0.06207852065563202,
0.05621960386633873,
0.010712854564189911,
-0.0029332085978239775,
0.015821199864149094,
0.0729711651802063,
0.07590052485466003,
-0.009473351761698723,
-0.03167136758565903,
-0.030346861109137535,
-0.09319641441106796,
-0.0287276990711689,
-0.009759528562426567,
0.015001869760453701,
-0.05584835633635521,
-0.063080795109272,
-0.0018459153361618519,
0.16403673589229584,
-0.09817706048488617,
0.023808090016245842,
-0.08994460850954056,
-0.040684156119823456,
-0.08382010459899902,
0.025590060278773308,
-0.1266443133354187,
0.008115030825138092,
0.06294136494398117,
-0.04750443622469902,
0.05266202613711357,
0.07280611246824265,
0.00784866139292717,
0.023494167253375053,
-0.04257037490606308,
-0.045740868896245956,
-0.049984514713287354,
-0.07930736988782883,
-0.10369347780942917,
-0.03432381525635719,
-0.10322915762662888,
-0.0314568392932415,
-0.02861960232257843,
-0.17145420610904694,
0.0013968584826216102,
-0.013885403983294964,
0.0116491774097085,
0.027431651949882507,
-0.03644579276442528,
0.029977496713399887,
0.0447952039539814,
-0.04956051707267761,
-0.07896092534065247,
0.0200723335146904,
0.04115115851163864,
-0.08605950325727463,
-0.05400494486093521,
-0.10166727006435394,
-0.09259584546089172,
0.08075898885726929,
0.0898004099726677,
-0.11011950671672821,
-0.005130391102284193,
-0.02855559252202511,
-0.04272136092185974,
-0.05050426721572876,
-0.06983932852745056,
0.16327325999736786,
0.021793462336063385,
0.15774020552635193,
-0.13908825814723969,
-0.06338586658239365,
-0.02271435409784317,
0.008568823337554932,
0.03653848171234131,
0.14106056094169617,
0.024927275255322456,
-0.10064869374036789,
0.017929041758179665,
-0.01011203695088625,
-0.036540113389492035,
0.1693446934223175,
-0.02446407452225685,
-0.06114910542964935,
-0.00018149014795199037,
0.11876485496759415,
-0.00682012690231204,
0.18295152485370636,
-0.0861521065235138,
-0.013695748522877693,
-0.017969351261854172,
0.00990935880690813,
0.03469707444310188,
-0.12127210944890976,
0.030465049669146538,
0.027770763263106346,
-0.06423398107290268,
-0.04784829542040825,
-0.026524201035499573,
-0.027953390032052994,
0.03861893713474274,
0.027079978957772255,
0.023367658257484436,
-0.006079420447349548,
-0.03962930291891098,
-0.11788710951805115,
0.17126406729221344,
-0.05721009895205498,
-0.18153466284275055,
-0.15251287817955017,
0.1092800572514534,
-0.029177609831094742,
-0.012555750086903572,
0.02418217435479164,
-0.08894432336091995,
-0.0368189737200737,
-0.08764132857322693,
0.1451103836297989,
-0.08812719583511353,
-0.00035553364432416856,
0.0020200065337121487,
0.059520795941352844,
0.056019823998212814,
-0.1622302234172821,
0.023443201556801796,
-0.018065467476844788,
0.0010146615095436573,
-0.03196581080555916,
-0.06417681276798248,
0.08988526463508606,
0.12698578834533691,
-0.05408281460404396,
0.021297665312886238,
-0.0007007684907875955,
0.16409415006637573,
-0.06352565437555313,
0.054785434156656265,
0.19160400331020355,
0.014329088851809502,
0.027330657467246056,
0.0419454388320446,
0.007382769137620926,
-0.08376915752887726,
0.06488005816936493,
0.044456854462623596,
-0.039512183517217636,
-0.21190454065799713,
-0.02150559611618519,
-0.08085305988788605,
0.0660681426525116,
0.1080484390258789,
0.03276390582323074,
-0.16698943078517914,
0.031040675938129425,
-0.012417715974152088,
0.16488000750541687,
-0.021089455112814903,
0.05624367669224739,
-0.0022841321770101786,
0.030276648700237274,
-0.0048602488823235035,
-0.10660642385482788,
-0.0033924065064638853,
0.05968580022454262,
0.1012597307562828,
0.2035234570503235,
-0.08724670857191086,
0.1705380380153656,
0.020805202424526215,
0.10122749954462051,
0.025726284831762314,
0.09936931729316711,
-0.11058765649795532,
0.008079842664301395,
0.012224199250340462,
-0.018646106123924255,
-0.06622371822595596,
0.03373677283525467,
-0.03231880068778992,
0.07940040528774261,
-0.08372540026903152,
0.027062274515628815,
0.035937316715717316,
0.17878791689872742,
0.10370692610740662,
-0.17876207828521729,
-0.13809743523597717,
0.012378825806081295,
-0.08751805126667023,
-0.10502360016107559,
0.07136479020118713,
0.21940436959266663,
-0.0655655711889267,
0.023401817306876183,
-0.017905673012137413,
0.1282544881105423,
-0.09423017501831055,
-0.010976769030094147,
0.04631288722157478,
0.06888140738010406,
0.008964155800640583,
0.11107620596885681,
-0.25234630703926086,
0.06441175192594528,
0.023871365934610367,
0.10673046112060547,
-0.027447029948234558,
0.05642753094434738,
-0.034385040402412415,
-0.01115897111594677,
0.07534486055374146,
0.0031860137823969126,
-0.020190509036183357,
-0.19545118510723114,
-0.03885252773761749,
0.026776796206831932,
0.05894063040614128,
-0.009694686159491539,
0.10008213669061661,
-0.008173314854502678,
0.03456282988190651,
-0.034046951681375504,
-0.12386515736579895,
-0.09175774455070496,
-0.11881440132856369,
-0.04991266876459122,
-0.007274380419403315,
-0.07227278500795364,
-0.016255775466561317,
0.05404529720544815,
0.033891599625349045,
0.21921178698539734,
-0.14192238450050354,
-0.07106685638427734,
-0.07293804734945297,
0.05398375913500786,
0.12210579216480255,
-0.0936792865395546,
0.01196806225925684,
0.016336994245648384,
0.05347178131341934,
-0.04273645207285881,
-0.0784396305680275,
0.02714863047003746,
-0.06037406250834465,
-0.08285035192966461,
-0.040654879063367844,
0.13062897324562073,
-0.0038979914970695972,
0.04523361474275589,
0.017590446397662163,
-0.09611230343580246,
-0.026121608912944794,
-0.13189002871513367,
-0.06289731711149216,
-0.0474320724606514,
0.041920583695173264,
-0.011479124426841736,
-0.1145441010594368,
0.06948237866163254,
-0.012904743663966656,
-0.059563834220170975,
0.06120042875409126,
0.1652238517999649,
-0.07055039703845978,
0.010328670963644981,
0.07984790951013565,
-0.044005606323480606,
-0.1808163970708847,
-0.01409278716892004,
0.043749723583459854,
0.0811648890376091,
-0.026057044044137,
-0.14877104759216309,
0.06488195806741714,
-0.018652552738785744,
0.01488145999610424,
0.03297348693013191,
-0.27453169226646423,
-0.11893928050994873,
-0.002134351758286357,
0.05517499893903732,
0.03667521849274635,
-0.09048449248075485,
-0.04866337403655052,
-0.06217193603515625,
-0.055530767887830734,
0.06677591800689697,
0.02950836718082428,
0.10421663522720337,
-0.029480786994099617,
0.030183857306838036,
0.044998254626989365,
-0.03471609205007553,
0.048656221479177475,
0.01605217717587948,
0.11051575839519501,
-0.012830287218093872,
-0.008131462149322033,
0.07150553166866302,
-0.07394139468669891,
0.18942590057849884,
-0.15519274771213531,
0.09421127289533615,
-0.16122320294380188,
-0.03237573057413101,
-0.04791552945971489,
0.004735428374260664,
-0.03697942569851875,
-0.029388852417469025,
-0.11648573726415634,
0.025166867300868034,
0.03915596008300781,
-0.012219357304275036,
0.04366130009293556,
-0.0029107467271387577,
-0.06138084456324577,
0.05888442322611809,
0.10503726452589035,
-0.000989633845165372,
-0.12494716048240662,
0.03789740428328514,
0.022301170974969864,
0.08846587687730789,
-0.1856478899717331,
0.03299552574753761,
0.1029345840215683,
0.017254281789064407,
0.09021937102079391,
0.017533952370285988,
-0.09710994362831116,
0.0445716567337513,
0.06627031415700912,
-0.06866192072629929,
-0.07271077483892441,
-0.02351728081703186,
-0.054857272654771805,
-0.10781010240316391,
0.04954128339886665,
0.08522866666316986,
-0.03036479838192463,
-0.0073308502323925495,
-0.01628505066037178,
0.0036241752095520496,
-0.08326069265604019,
0.168488547205925,
0.0184310469776392,
0.0824270099401474,
-0.057171162217855453,
0.07098713517189026,
0.08557812869548798,
-0.10418511927127838,
0.020636333152651787,
0.1535705029964447,
-0.08768367767333984,
-0.02570030279457569,
0.07486706227064133,
0.14648930728435516,
0.004505549557507038,
-0.04538823664188385,
-0.10573052614927292,
(remainder of a 768-dimensional embedding vector omitted for readability)
] |
null | null | transformers |
# CodeTrans model for code documentation generation javascript
Pretrained model on programming language javascript using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the code documentation generation task for the javascript function/method.
## Intended uses & limitations
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_javascript_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_javascript_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/javascript/base_model.ipynb).
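Since the model works best on tokenized input, untokenized code can be roughly pre-processed first. The exact CodeTrans tokenizer is not published in this card, so the helper below (`rough_tokenize_js`, a hypothetical name) is only a sketch that mimics the space-separated style of the example above:

```python
import re

def rough_tokenize_js(code: str) -> str:
    # Illustrative only: split identifiers, numbers, and string literals from
    # punctuation, keeping common multi-character operators intact, then
    # re-join with single spaces to mimic the spacing of the example above.
    pattern = r"'[^']*'|\"[^\"]*\"|===|!==|==|!=|&&|\|\||[A-Za-z_$][A-Za-z0-9_$]*|\d+|\S"
    return " ".join(re.findall(pattern, code))

print(rough_tokenize_js("function add(a, b) { return a + b; }"))
# function add ( a , b ) { return a + b ; }
```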
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
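For intuition, the inverse square root schedule has a small closed form. The `warmup_steps` and `base_lr` values below are illustrative assumptions, not the values used to train this model:

```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000, base_lr: float = 1e-2) -> float:
    # Constant-equivalent during warmup, then decays as 1/sqrt(step):
    # the shape commonly paired with AdaFactor in T5-style pre-training.
    return base_lr / max(step, warmup_steps) ** 0.5

for step in (1, 10_000, 40_000, 160_000):
    print(step, inverse_sqrt_lr(step))
```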
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V3-8 for 35,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing javascript code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
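Scores in this style can be computed with a library such as `sacrebleu`; the exact tokenization and BLEU settings used by CodeTrans are not stated here, so this sketch will not reproduce the published numbers exactly:

```python
# pip install sacrebleu
import sacrebleu

# Hypothetical model outputs and reference docstrings, for illustration only.
hypotheses = ["returns false when running under react native or nativescript"]
references = [["determine if we're running in a standard browser environment"]]

print(sacrebleu.corpus_bleu(hypotheses, references).score)
```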
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_javascript_transfer_learning_finetune | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation javascript
============================================================
Pretrained model on programming language javascript using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the code documentation generation task for the javascript function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V3-8 for 35,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing javascript code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 35,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 35,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
60,
87,
107
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 35,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
(768-dimensional embedding vector omitted for readability)
] |
null | null | transformers |
# CodeTrans model for code documentation generation php
Pretrained model on programming language php using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized php code functions: it works best with tokenized php functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used single-task training on CodeSearchNet Corpus php dataset.
## Intended uses & limitations
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_php"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_php", skip_special_tokens=True),
device=0
)
tokenized_code = "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/function%20documentation%20generation/php/base_model.ipynb).
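The pipeline also accepts a batch of inputs. A minimal sketch, reusing the `pipeline` object built above (the PHP snippets are illustrative, not taken from the training data):

```python
snippets = [
    "public function getName ( ) { return $ this -> name ; }",
    "public function setName ( $ name ) { $ this -> name = $ name ; }",
]
for result in pipeline(snippets):
    print(result["summary_text"])
```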
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_php | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation php
=====================================================
Pretrained model on programming language php using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized php code functions: it works best with tokenized php functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used single-task training on CodeSearchNet Corpus php dataset.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
112
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
(768-dimensional embedding vector omitted for readability)
] |
null | null | transformers |
# CodeTrans model for code documentation generation php
Pretrained model on programming language php using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized php code functions: it works best with tokenized php functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_php_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_php_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/php/base_model.ipynb).
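Generation can also be run without the pipeline wrapper. The sketch below reuses `tokenized_code` from the snippet above; the generation hyperparameters are assumptions rather than the authors' settings:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

name = "SEBIS/code_trans_t5_base_code_documentation_generation_php_multitask"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelWithLMHead.from_pretrained(name)

# Encode the tokenized PHP function and decode a beam-searched summary.
inputs = tokenizer(tokenized_code, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```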
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
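For further fine-tuning in the same spirit, the `transformers` Adafactor implementation supports a relative-step (inverse square root style) schedule. The arguments below are assumptions, not the exact CodeTrans configuration, and `model` is assumed to be loaded as in the usage snippet above:

```python
from transformers.optimization import Adafactor

# A sketch in the spirit of the setup described above.
optimizer = Adafactor(
    model.parameters(),
    lr=None,             # let Adafactor derive the step size internally
    relative_step=True,  # inverse-square-root style schedule
    warmup_init=True,
    scale_parameter=True,
)
```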
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_php_multitask | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation php
=====================================================
Pretrained model on programming language php using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized php code functions: it works best with tokenized php functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model can be used to generate a description for a php function, or be fine-tuned on other php code tasks. It works on unparsed and untokenized php code, but performance should be better if the php code is tokenized.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
Run this example in the colab notebook.
Training data
-------------
The datasets for the supervised training tasks can be downloaded from Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
61,
112
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.1426914930343628,
-0.024861058220267296,
0.0008379322825931013,
0.1220940575003624,
0.13360679149627686,
0.02282947301864624,
0.0856003537774086,
0.06775010377168655,
-0.009048443287611008,
0.01222450565546751,
0.0753864198923111,
0.0005701679037883878,
0.034246865659952164,
0.13265521824359894,
0.009882302954792976,
-0.15344083309173584,
-0.01984572783112526,
0.09146860241889954,
-0.13221429288387299,
0.11621282994747162,
0.07831188291311264,
-0.10153250396251678,
0.09418614208698273,
-0.011517741717398167,
-0.17599260807037354,
0.032554298639297485,
-0.04802257567644119,
-0.055452123284339905,
0.10524633526802063,
0.03961789980530739,
0.13034352660179138,
-0.006312401965260506,
0.048930663615465164,
-0.1103014424443245,
0.012338393367826939,
0.01971421018242836,
0.05302887037396431,
0.04937699809670448,
0.047578249126672745,
0.09387136250734329,
0.07646924257278442,
-0.019722850993275642,
0.03741491958498955,
0.04391735419631004,
-0.07098378986120224,
-0.013586138375103474,
-0.03612301126122475,
0.10047373920679092,
0.10913461446762085,
0.14372743666172028,
0.018682675436139107,
0.047655947506427765,
-0.09494605660438538,
0.06826413422822952,
0.08152131736278534,
-0.26989948749542236,
-0.0267596784979105,
0.06932032108306885,
0.035303618758916855,
-0.01352014672011137,
-0.07535732537508011,
-0.031570494174957275,
0.06627338379621506,
0.05872327461838722,
0.0943906381726265,
-0.09132914990186691,
-0.04735550284385681,
-0.029870694503188133,
-0.08256226032972336,
-0.024962231516838074,
0.23041929304599762,
0.05003104731440544,
-0.03571903333067894,
-0.04608064889907837,
-0.055541981011629105,
-0.09891974925994873,
0.015276191756129265,
-0.005558317992836237,
0.010324854403734207,
-0.002383204409852624,
-0.030192989856004715,
-0.029535239562392235,
-0.1023736298084259,
-0.10480058193206787,
-0.0325612798333168,
0.06869777292013168,
0.07527683675289154,
0.04598429426550865,
-0.04561343416571617,
0.0739990770816803,
0.04512136057019234,
-0.03759947419166565,
0.02835976704955101,
-0.002822952577844262,
-0.040738388895988464,
-0.004281493369489908,
-0.04492347314953804,
-0.21518078446388245,
0.0074129728600382805,
0.017314363270998,
-0.11749119311571121,
0.07549315690994263,
0.16126033663749695,
0.06601046770811081,
-0.02360067330300808,
0.19632835686206818,
0.003619155380874872,
-0.09818212687969208,
0.029260925948619843,
0.027980444952845573,
-0.03710968419909477,
0.013481682166457176,
-0.11121466010808945,
-0.04075835272669792,
0.0478946790099144,
-0.013802645727992058,
-0.12459349632263184,
0.08400830626487732,
-0.01787176914513111,
-0.042722396552562714,
0.0555061474442482,
-0.08312875032424927,
0.012340468354523182,
-0.002082081511616707,
-0.09625052660703659,
0.0038747701328247786,
0.11135009676218033,
-0.08940767496824265,
-0.14359596371650696,
0.0006200476782396436,
-0.05593611299991608,
-0.0029415064491331577,
-0.11113592982292175,
-0.10266654193401337,
-0.006141978781670332,
-0.026547618210315704,
0.002895807847380638,
-0.14803215861320496,
-0.13325689733028412,
-0.04264409840106964,
0.07907059043645859,
0.013808398507535458,
-0.04563693702220917,
-0.061955902725458145,
0.030986690893769264,
-0.001718458253890276,
-0.03919404745101929,
0.0293581523001194,
-0.04702957347035408,
0.0800183117389679,
0.06632204353809357,
0.06550158560276031,
-0.04741315916180611,
0.05882249400019646,
-0.09242524951696396,
0.028927002102136612,
-0.14656461775302887,
0.08082102239131927,
0.048447828739881516,
0.11578536778688431,
-0.08999689668416977,
-0.10771512240171432,
-0.047816161066293716,
0.048987481743097305,
0.07709480077028275,
0.07251672446727753,
-0.0360579714179039,
-0.009997553192079067,
0.10449974983930588,
-0.09159785509109497,
-0.16276615858078003,
0.10743238031864166,
-0.018554765731096268,
0.08493880182504654,
0.07738441973924637,
0.1513708233833313,
0.12757933139801025,
-0.05523703247308731,
0.00718408590182662,
0.0740555077791214,
-0.03296681120991707,
-0.20173488557338715,
0.055849529802799225,
0.05566886067390442,
-0.10030430555343628,
0.01972397230565548,
0.039690934121608734,
0.12099017202854156,
-0.0436614453792572,
-0.025922689586877823,
-0.024746807292103767,
-0.1106271967291832,
-0.006028368137776852,
0.009296216070652008,
0.0935259759426117,
-0.04660710319876671,
-0.06736944615840912,
0.043248001486063004,
0.08163941651582718,
-0.0810864269733429,
0.029007183387875557,
-0.07662715017795563,
-0.05012122914195061,
-0.10010390728712082,
0.01211516186594963,
-0.1503402590751648,
0.0071718511171638966,
0.0022549424320459366,
0.033966630697250366,
0.05974176153540611,
0.09659746289253235,
0.04596445709466934,
0.026561150327324867,
-0.027951380237936974,
-0.034578487277030945,
-0.03153803572058678,
-0.04453680291771889,
-0.12677954137325287,
-0.00819329172372818,
-0.06903475522994995,
-0.010836286470293999,
-0.00941165629774332,
-0.1424945592880249,
0.03418669104576111,
-0.0871693566441536,
-0.009232407435774803,
-0.012709507718682289,
0.006406143307685852,
0.04714105278253555,
0.07245542854070663,
-0.023994216695427895,
-0.061484187841415405,
0.08718893676996231,
0.0764928087592125,
-0.07569579035043716,
0.0015822192654013634,
-0.06243814155459404,
0.012024401687085629,
0.08599399775266647,
-0.07657377421855927,
-0.104427769780159,
-0.021822268143296242,
-0.037212520837783813,
-0.06060577556490898,
-0.003489095950499177,
-0.018515009433031082,
0.2830825448036194,
0.004267195705324411,
0.18484243750572205,
-0.08717547357082367,
-0.02224069833755493,
-0.018765537068247795,
-0.02446570061147213,
0.08266458660364151,
0.1028405949473381,
0.06350935995578766,
-0.11651264876127243,
0.05115222930908203,
-0.014985661022365093,
-0.06529868394136429,
0.07756532728672028,
-0.019461780786514282,
-0.05047403648495674,
0.0502985417842865,
0.05450226739048958,
-0.012036442756652832,
0.10432395339012146,
-0.14713172614574432,
-0.03267969563603401,
-0.0019316152902320027,
0.020524226129055023,
0.05527579411864281,
-0.16078677773475647,
0.028620369732379913,
0.03676023706793785,
-0.022182753309607506,
-0.03694970905780792,
-0.03496851772069931,
-0.031102273613214493,
0.03470935299992561,
0.05697684362530708,
-0.023778744041919708,
0.02528439648449421,
0.01780984364449978,
-0.09253239631652832,
0.2139953374862671,
-0.01704038865864277,
-0.24908189475536346,
-0.1327260583639145,
0.03552698716521263,
-0.010238264687359333,
-0.008398901671171188,
0.05518127605319023,
-0.09429455548524857,
-0.04488247632980347,
-0.053073372691869736,
0.12163349241018295,
-0.07266626507043839,
0.03577062115073204,
0.008075876161456108,
0.03152375668287277,
0.06593753397464752,
-0.11518710106611252,
0.0036092367954552174,
-0.03287553787231445,
-0.0015129317762330174,
0.009394371882081032,
-0.0833665281534195,
0.08646319806575775,
0.15974785387516022,
-0.06843883544206619,
0.03821931034326553,
-0.025743840262293816,
0.13865786790847778,
-0.043497178703546524,
0.007632168475538492,
0.1851397603750229,
0.016475660726428032,
0.02582887001335621,
0.024554546922445297,
0.02198372222483158,
-0.07657133787870407,
0.052639786154031754,
0.03539135679602623,
-0.05271933972835541,
-0.21002578735351562,
-0.053729791194200516,
-0.0703386515378952,
0.007920279167592525,
0.1527966558933258,
0.060651522129774094,
-0.06676216423511505,
0.07528003305196762,
0.03452982008457184,
0.14805595576763153,
-0.05374545976519585,
0.03989323228597641,
0.0753893032670021,
0.042519234120845795,
0.014585534110665321,
-0.09721878170967102,
-0.04191151633858681,
0.05351700633764267,
0.08897849917411804,
0.22205688059329987,
-0.07993271946907043,
0.11729099601507187,
0.0321563184261322,
0.03512168675661087,
0.015909280627965927,
0.1454206258058548,
-0.0931893140077591,
-0.00706065259873867,
0.006298788823187351,
0.020201649516820908,
-0.07488788664340973,
0.0183926522731781,
-0.0707487240433693,
0.056698739528656006,
-0.14649365842342377,
0.037225574254989624,
0.03608635440468788,
0.2156302034854889,
0.03444395586848259,
-0.2961425483226776,
-0.15340198576450348,
-0.059279412031173706,
-0.08782625943422318,
-0.07788720726966858,
0.06209319084882736,
0.15804819762706757,
-0.048673246055841446,
0.009721698239445686,
-0.042650945484638214,
0.15875820815563202,
-0.11857615411281586,
-0.000716153415851295,
0.059968482702970505,
0.05757376551628113,
0.02040725387632847,
0.10511671006679535,
-0.22830472886562347,
0.13174237310886383,
-0.008543930947780609,
0.0869022086262703,
-0.029077546671032906,
-0.0017056725919246674,
-0.012912111356854439,
0.08040929585695267,
0.05468565225601196,
0.0157448910176754,
0.009206980466842651,
-0.18319456279277802,
-0.07948761433362961,
0.021867305040359497,
-0.014804002828896046,
0.02174397185444832,
0.08216404169797897,
-0.013586243614554405,
0.046498630195856094,
-0.013053342700004578,
-0.08358154445886612,
-0.05708399787545204,
-0.09531997889280319,
-0.06318633258342743,
0.057357318699359894,
-0.038308870047330856,
-0.0157961156219244,
-0.008765138685703278,
0.04650597646832466,
0.19403931498527527,
-0.05906178429722786,
-0.09905768185853958,
-0.10137463361024857,
0.06094817444682121,
0.06519634276628494,
-0.07453442364931107,
0.038912802934646606,
0.02342953346669674,
0.0038207571487873793,
-0.0011495762737467885,
-0.07612147927284241,
0.06626004725694656,
-0.06122602894902229,
0.004085704684257507,
-0.010375084355473518,
0.08556041866540909,
0.004854518920183182,
0.02623981423676014,
0.010074295103549957,
-0.0925697460770607,
-0.07138875126838684,
-0.11458776891231537,
-0.06212630495429039,
-0.09678062051534653,
0.08565391600131989,
-0.03358408436179161,
-0.07236072421073914,
0.15407980978488922,
-0.0034537408500909805,
-0.008091159164905548,
0.1550813764333725,
-0.007365773897618055,
-0.03125530481338501,
-0.04966653510928154,
0.10719974339008331,
-0.03383747488260269,
-0.22369225323200226,
-0.026554251089692116,
0.0646960511803627,
0.05186111107468605,
-0.10955476015806198,
-0.1658753752708435,
0.11309373378753662,
0.014934763312339783,
0.0311115775257349,
0.03875531628727913,
-0.2989024817943573,
-0.10063737630844116,
0.05893772467970848,
0.09212467074394226,
0.2177785336971283,
-0.10964327305555344,
-0.018207356333732605,
-0.0408671572804451,
-0.07908491790294647,
0.08986511826515198,
-0.051019735634326935,
0.1373722106218338,
-0.05398052558302879,
0.07602241635322571,
0.011099725030362606,
-0.0476941242814064,
0.03056495636701584,
0.04106361046433449,
0.07396464049816132,
-0.04533009976148605,
0.025781171396374702,
0.04516424611210823,
-0.0879138931632042,
0.20423342287540436,
-0.1299697905778885,
0.05335488170385361,
-0.16132602095603943,
-0.07958269864320755,
-0.03730035945773125,
0.01265037152916193,
0.0383116714656353,
-0.034328486770391464,
-0.06577355414628983,
0.005310328211635351,
0.02836276963353157,
-0.005023357458412647,
0.052923865616321564,
0.021948348730802536,
-0.030914107337594032,
0.07038495689630508,
0.07935323566198349,
-0.10550499707460403,
-0.19929912686347961,
0.021328814327716827,
0.01665138266980648,
0.12843768298625946,
-0.2361893355846405,
0.013622275553643703,
0.10987747460603714,
-0.015202277339994907,
0.08800233900547028,
0.04790031537413597,
-0.02147631347179413,
0.020747890695929527,
0.06325335055589676,
-0.0837126225233078,
-0.06732688844203949,
-0.02701568603515625,
-0.04094793274998665,
-0.07585448026657104,
0.06285132467746735,
0.08114884048700333,
-0.10517401248216629,
0.012159821577370167,
-0.015813110396265984,
-0.036941759288311005,
-0.10716559737920761,
0.20438893139362335,
0.044262416660785675,
0.06697846949100494,
-0.062517911195755,
0.10187608748674393,
0.07945932447910309,
-0.09639202058315277,
0.02379431389272213,
0.17831039428710938,
-0.11682283878326416,
-0.0710151419043541,
0.07952090352773666,
0.18609923124313354,
-0.041983023285865784,
-0.1055152490735054,
-0.14467565715312958,
-0.09578090161085129,
0.04666874557733536,
0.04424960911273956,
0.06767510622739792,
0.040417566895484924,
-0.056020792573690414,
0.005586358718574047,
-0.1458420753479004,
0.0391867496073246,
0.07358413189649582,
0.04473754018545151,
-0.14187972247600555,
0.1298305094242096,
0.06763418018817902,
0.12174193561077118,
-0.027299750596284866,
0.014807107858359814,
-0.10952349752187729,
0.06341716647148132,
-0.03545833006501198,
0.03810231760144234,
-0.02275605872273445,
0.019972898066043854,
-0.04136284813284874,
0.0032895447220653296,
-0.04460132494568825,
0.06544779241085052,
-0.03240130469202995,
-0.01691223494708538,
-0.0007840048638172448,
0.030702393501996994,
-0.02317919209599495,
-0.02930280938744545,
-0.015357625670731068,
-0.0402083657681942,
0.07248778641223907,
-0.009962866082787514,
-0.06348498910665512,
-0.002098656026646495,
-0.044080689549446106,
0.0005325331585481763,
0.06447048485279083,
0.043920837342739105,
0.02285042405128479,
0.010881759226322174,
0.04472312331199646,
0.031085742637515068,
0.02884606271982193,
-0.01626599207520485,
0.1067342683672905,
-0.10927961766719818,
-0.06728298217058182,
-0.09803348034620285,
-0.0528983436524868,
-0.057269152253866196,
0.030709687620401382,
0.10421174764633179,
0.11254782974720001,
0.13637565076351166,
-0.08195865899324417,
0.019403288140892982,
-0.1582416445016861,
-0.016522187739610672,
0.06524298340082169,
-0.0344063863158226,
-0.0723816454410553,
-0.10089170932769775,
0.060129135847091675,
-0.016369622200727463,
0.1319558173418045,
0.009113474749028683,
0.047145504504442215,
-0.007332997862249613,
0.032882921397686005,
-0.043778978288173676,
-0.028626471757888794,
0.1588912457227707,
-0.08095260709524155,
-0.007603216916322708,
0.01287770364433527,
0.08194385468959808,
0.10546548664569855,
0.13815884292125702,
0.14550206065177917,
0.12101156264543533,
0.0336492545902729,
0.10216937959194183,
-0.03562586009502411,
0.023835770785808563,
-0.15641772747039795,
0.049581509083509445,
-0.03445390239357948,
0.043507788330316544,
-0.046442992985248566,
0.14281421899795532,
0.10875669866800308,
-0.06309712678194046,
0.07811979204416275,
0.01684418134391308,
-0.08700301498174667,
-0.021538859233260155,
-0.04775479435920715,
-0.06038505211472511,
-0.1526421755552292,
-0.010844516567885876,
-0.0609760619699955,
-0.09080538898706436,
0.12849809229373932,
0.01912548393011093,
-0.032810408622026443,
0.23194459080696106,
0.017847925424575806,
-0.03432822972536087,
0.04145950451493263,
-0.013309000059962273,
0.020021595060825348,
0.0028828540816903114,
-0.0037131395656615496,
0.027290277183055878,
-0.030880210921168327,
0.08525098860263824,
0.0039904252626001835,
-0.03176237642765045,
0.04825173318386078,
0.0268984567373991,
-0.03882381692528725,
-0.05235285311937332,
0.030151883140206337,
0.0682070180773735,
0.08852217346429825,
0.013569371774792671,
-0.04660102352499962,
-0.043732114136219025,
0.16267958283424377,
-0.0574503131210804,
-0.07267723977565765,
-0.10287469625473022,
0.12126503139734268,
0.07107637077569962,
-0.024930261075496674,
0.0045038750395178795,
-0.0554361529648304,
-0.045693520456552505,
0.26698219776153564,
0.11042044311761856,
-0.08479045331478119,
-0.03344711661338806,
-0.01966571807861328,
-0.009182038716971874,
-0.011121411807835102,
0.17954891920089722,
0.07676395773887634,
0.15849849581718445,
-0.024369899183511734,
-0.07599534094333649,
-0.07335416227579117,
-0.020179858431220055,
-0.049409035593271255,
0.10567871481180191,
0.02897651307284832,
0.04590830206871033,
-0.09411335736513138,
0.043646592646837234,
-0.01929292641580105,
-0.07623433321714401,
0.1157483384013176,
-0.10154477506875992,
-0.07638037949800491,
-0.01666489616036415,
0.029253141954541206,
-0.002416149480268359,
0.07616172730922699,
-0.02764926664531231,
0.0757443979382515,
0.015253493562340736,
-0.02149686962366104,
-0.12652531266212463,
-0.096641406416893,
0.040333136916160583,
0.0303134024143219,
0.15903277695178986,
-0.003422616282477975,
0.07854891568422318,
0.06763944029808044,
0.03770170733332634,
-0.08796028792858124,
0.09072939306497574,
-0.020152270793914795,
0.026779793202877045,
0.061775121837854385,
0.0017860574880614877,
-0.06366473436355591,
0.08399568498134613,
-0.018040090799331665,
-0.1087435781955719,
-0.05303991585969925,
-0.049715716391801834,
-0.004088951274752617,
-0.1310843676328659,
-0.019809911027550697,
-0.03998655825853348,
0.12269899994134903,
0.1693146675825119,
-0.030405845493078232,
-0.03462722897529602,
-0.06275392323732376,
0.04986055940389633,
0.01819697394967079,
0.0028158018831163645,
-0.05370081216096878,
-0.15981857478618622,
-0.02269524522125721,
-0.05638124421238899,
-0.0003842426813207567,
-0.21959468722343445,
-0.03277238458395004,
-0.07732845842838287,
-0.025376731529831886,
-0.08240664750337601,
0.09181114286184311,
0.0642891600728035,
0.04123024642467499,
-0.05350012704730034,
-0.05481576547026634,
-0.024418363347649574,
0.09309162199497223,
-0.15147548913955688,
-0.14304344356060028
] |
null | null | transformers |
# CodeTrans model for code documentation generation php
Pretrained model on programming language php using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized php code functions: it works best with tokenized php functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the code documentation generation task for the php function/method.
## Intended uses & limitations
The model can be used to generate a description for a php function, or be fine-tuned on other php code tasks. It works on unparsed and untokenized php code, but performance should be better if the php code is tokenized.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_php_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_php_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"
pipeline([tokenized_code])
```
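The same inference can also be run without the pipeline wrapper by calling `generate()` directly, as sketched below. The generation settings here (beam size, maximum length) and the sample input are illustrative defaults, not the settings used to produce the reported scores:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

name = "SEBIS/code_trans_t5_base_code_documentation_generation_php_multitask_finetune"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelWithLMHead.from_pretrained(name)

code = "public function getId ( ) { return $ this -> id ; }"
inputs = tokenizer(code, return_tensors="pt", truncation=True, max_length=512)
# Beam search with illustrative settings; adjust as needed.
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```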
Run this example in [this colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/php/base_model.ipynb).
## Training data
The datasets for the supervised training tasks can be downloaded from [this link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1).
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing php code.
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
|   CodeTrans-ST-Small    |     17.31      |     16.65      |     16.89      |     23.05      |      9.19      |     13.70      |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_php_multitask_finetune | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation php
=====================================================
Pretrained model on programming language php using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized php code functions: it works best with tokenized php functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the code documentation generation task for the php function/method.
Intended uses & limitations
---------------------------
The model can be used to generate a description for a php function, or be fine-tuned on other php code tasks. It works on unparsed and untokenized php code, but performance should be better if the php code is tokenized.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
Run this example in the colab notebook.
Training data
-------------
The datasets for the supervised training tasks can be downloaded from Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing php code.
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
61,
88,
77
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.11943921446800232,
0.05376221984624863,
-0.000278129824437201,
0.10055449604988098,
0.07867562025785446,
0.02346322126686573,
0.0201910138130188,
0.1045076921582222,
-0.081983283162117,
0.04855254292488098,
0.11123679578304291,
-0.0915076956152916,
0.022501077502965927,
0.1390979290008545,
0.034535426646471024,
-0.19692640006542206,
-0.017414633184671402,
0.06719698756933212,
-0.13287965953350067,
0.11706836521625519,
0.07052513211965561,
-0.1010003387928009,
0.0685507282614708,
-0.021586045622825623,
-0.11882414668798447,
0.04248633235692978,
-0.03939491882920265,
-0.011358607560396194,
0.09210198372602463,
0.03947240114212036,
0.10999684780836105,
-0.01708259992301464,
0.05419329181313515,
-0.1325913518667221,
0.007406284566968679,
0.061194781213998795,
0.05755263939499855,
0.055410079658031464,
0.06315700709819794,
0.1148485541343689,
0.05671849101781845,
-0.024340208619832993,
0.02669633738696575,
0.061081573367118835,
-0.050895582884550095,
-0.055987369269132614,
-0.06579902768135071,
0.09292528033256531,
0.09392748028039932,
0.11237914860248566,
0.01277191936969757,
-0.006392819341272116,
-0.0675646960735321,
0.05778612568974495,
0.1087842583656311,
-0.21472139656543732,
-0.04328605905175209,
0.05824475362896919,
0.05267203226685524,
0.004507672972977161,
-0.06320185959339142,
-0.03615772724151611,
0.06113610789179802,
0.04921703040599823,
0.087982676923275,
-0.09210731834173203,
-0.01876148395240307,
-0.031348299235105515,
-0.08340268582105637,
-0.04237813875079155,
0.20150607824325562,
0.07502344995737076,
-0.02733365260064602,
-0.08172492682933807,
-0.0967429056763649,
-0.18298117816448212,
0.0062233274802565575,
-0.005520579405128956,
0.027297427877783775,
0.0038134807255119085,
-0.03275596350431442,
-0.048812948167324066,
-0.10617557913064957,
-0.12173733115196228,
-0.016337906941771507,
-0.0002494356594979763,
0.07597731053829193,
0.0392841100692749,
0.00006934761040611193,
0.09347335249185562,
-0.014272327534854412,
-0.011018414050340652,
0.014190654270350933,
0.029819443821907043,
-0.08587656915187836,
0.022627992555499077,
-0.02964138425886631,
-0.16659682989120483,
0.013123682700097561,
0.06340344995260239,
-0.11928527802228928,
0.07229530066251755,
0.17540864646434784,
0.017927754670381546,
-0.04509390518069267,
0.18137496709823608,
-0.007687196135520935,
-0.09728731215000153,
0.005380994640290737,
0.010239578783512115,
-0.034664593636989594,
0.03441961482167244,
-0.05967395007610321,
-0.035157859325408936,
0.034455377608537674,
0.01763814315199852,
-0.06748351454734802,
0.04881618544459343,
0.008304127492010593,
-0.01453681942075491,
0.10999732464551926,
-0.09430104494094849,
0.03156093880534172,
0.004500925075262785,
-0.033063068985939026,
0.012092013843357563,
0.05909018963575363,
-0.126445472240448,
-0.15529797971248627,
0.0364130437374115,
-0.04526719078421593,
-0.028930706903338432,
-0.10562890022993088,
-0.09005652368068695,
0.017078518867492676,
-0.017457975074648857,
-0.0300417710095644,
-0.12733860313892365,
-0.08946801722049713,
-0.0408446341753006,
0.06274916976690292,
0.005630908068269491,
-0.03705766797065735,
-0.01226259395480156,
0.023082105442881584,
0.011371158994734287,
-0.017517996951937675,
0.09352201223373413,
-0.03699703887104988,
0.07751812040805817,
0.033012691885232925,
0.03232165053486824,
-0.04771848022937775,
0.03916743025183678,
-0.0747223049402237,
0.062242068350315094,
-0.05985300987958908,
0.034352805465459824,
0.05606827139854431,
0.05461420118808746,
-0.1266418993473053,
-0.10423275083303452,
-0.08009634912014008,
0.030493924394249916,
0.05821503326296806,
0.060410674661397934,
-0.06863347440958023,
0.03129793331027031,
0.16073910892009735,
-0.07667119801044464,
-0.11797518283128738,
0.11068222671747208,
0.013613058254122734,
0.03459075465798378,
0.05085131898522377,
0.121241495013237,
0.13403575122356415,
-0.07352245599031448,
0.022193627431988716,
0.09950663894414902,
0.025512972846627235,
-0.15123209357261658,
0.0753505527973175,
-0.02517852559685707,
-0.00358433835208416,
0.0273288581520319,
0.023492734879255295,
0.05951758101582527,
-0.012573972344398499,
-0.034576017409563065,
-0.036675743758678436,
-0.08357690274715424,
-0.05168035626411438,
0.009314613416790962,
0.04108104109764099,
-0.04106444865465164,
-0.06622505187988281,
0.03487301990389824,
0.12380217015743256,
-0.10126402229070663,
0.053070396184921265,
-0.0255864430218935,
-0.0468100942671299,
-0.10086695104837418,
0.02963392250239849,
-0.14360810816287994,
0.021654829382896423,
0.01787409745156765,
-0.027476288378238678,
0.023685777559876442,
0.08537403494119644,
0.03225123509764671,
-0.009942998178303242,
-0.07609207928180695,
-0.03778736665844917,
-0.009955252520740032,
-0.04212084785103798,
-0.12739317119121552,
0.004430369008332491,
-0.08956366032361984,
-0.00043039541924372315,
-0.011428985744714737,
-0.11824453622102737,
0.024627530947327614,
0.004455298185348511,
0.004344054963439703,
0.005982936359941959,
-0.030732885003089905,
0.025570258498191833,
0.05494861677289009,
-0.006211037747561932,
-0.08067040145397186,
0.06626465916633606,
0.08012701570987701,
-0.053386569023132324,
-0.04113777354359627,
-0.05118217691779137,
-0.014034108258783817,
0.07384596765041351,
0.010506785474717617,
-0.12575411796569824,
-0.029206978157162666,
-0.04398488998413086,
-0.04850512370467186,
-0.04541304334998131,
-0.05471177399158478,
0.15444698929786682,
0.017342692241072655,
0.18172362446784973,
-0.11070214956998825,
-0.056339241564273834,
0.009247837588191032,
0.004462783690541983,
0.05639278143644333,
0.13454154133796692,
0.06017295643687248,
-0.07690761983394623,
0.03589298576116562,
0.007750159595161676,
-0.07158905267715454,
0.08403200656175613,
-0.03621364012360573,
-0.08890922367572784,
0.018774431198835373,
0.09414125978946686,
-0.0045333923771977425,
0.11829408258199692,
-0.12744829058647156,
-0.016211893409490585,
-0.0050735557451844215,
0.03941340371966362,
0.034184303134679794,
-0.1576167792081833,
0.06025274097919464,
0.04603466019034386,
-0.04439016059041023,
0.004107976332306862,
-0.0392083004117012,
-0.060587868094444275,
0.040873195976018906,
0.0703352689743042,
0.030446665361523628,
0.010102592408657074,
0.004622190725058317,
-0.09197696298360825,
0.19913026690483093,
-0.03817082196474075,
-0.21443209052085876,
-0.12068382650613785,
0.06442661583423615,
-0.035744599997997284,
-0.025523755699396133,
0.04421137273311615,
-0.09130516648292542,
-0.031994760036468506,
-0.09565581381320953,
0.04861707240343094,
-0.14087030291557312,
0.0594056136906147,
-0.06138705462217331,
0.053607162088155746,
0.09766734391450882,
-0.11012094467878342,
0.02124493010342121,
-0.026193678379058838,
0.0019030083203688264,
-0.021029973402619362,
-0.02116653509438038,
0.10336795449256897,
0.15297728776931763,
-0.07143227010965347,
0.04332443326711655,
-0.010132196359336376,
0.11466017365455627,
-0.04941384866833687,
0.06342066079378128,
0.19975417852401733,
0.05992254614830017,
0.03566088527441025,
0.03557797893881798,
0.04412045329809189,
-0.04583432525396347,
0.03798285126686096,
0.05825282260775566,
-0.057954367250204086,
-0.17354750633239746,
-0.04666493460536003,
-0.08834263682365417,
0.02758832834661007,
0.17353756725788116,
0.08198606222867966,
-0.1033782958984375,
0.05530915781855583,
-0.015259822830557823,
0.1232404112815857,
-0.06559426337480545,
0.058232031762599945,
0.07652735710144043,
0.002677330980077386,
0.006416148040443659,
-0.10132790356874466,
-0.02279067039489746,
0.08325443416833878,
0.07850578427314758,
0.17782454192638397,
-0.09034416824579239,
0.15072663128376007,
0.035553742200136185,
0.09977664053440094,
0.013828666880726814,
0.12645410001277924,
-0.08926514536142349,
0.007649479899555445,
0.00018887581245508045,
-0.014230635948479176,
-0.019313938915729523,
0.041887734085321426,
-0.05018490180373192,
0.03608185797929764,
-0.11546812951564789,
-0.036666139960289,
0.030947376042604446,
0.2576707899570465,
0.08575111627578735,
-0.20837850868701935,
-0.14125898480415344,
-0.05370992049574852,
-0.10282676666975021,
-0.10802799463272095,
0.07261763513088226,
0.15312278270721436,
-0.04029369726777077,
0.006645230110734701,
-0.040953993797302246,
0.14125125110149384,
-0.10699596256017685,
-0.006196251604706049,
0.10624070465564728,
0.06311014294624329,
0.01707656867802143,
0.10643090307712555,
-0.18773260712623596,
0.1125715970993042,
0.015154736116528511,
0.07771290093660355,
-0.0414404533803463,
0.03682876378297806,
-0.03790191560983658,
0.010013203136622906,
0.07923958450555801,
0.02037840522825718,
0.022530708461999893,
-0.1319008469581604,
-0.05368250608444214,
0.01949634589254856,
0.03497210517525673,
0.015718022361397743,
0.0807487964630127,
-0.03811653330922127,
0.022336844354867935,
-0.01655667833983898,
-0.09215381741523743,
-0.03378584608435631,
-0.1412167102098465,
-0.034992385655641556,
0.04592372849583626,
-0.047222599387168884,
-0.03662926331162453,
0.02872098609805107,
0.055748309940099716,
0.24669663608074188,
-0.12087398767471313,
-0.07970482110977173,
-0.1159742921590805,
0.07439334690570831,
0.12245237827301025,
-0.0775536298751831,
0.053165093064308167,
-0.014101188629865646,
0.025431280955672264,
-0.003757502418011427,
-0.056653399020433426,
0.04897792264819145,
-0.05678707733750343,
-0.0649966150522232,
-0.03180048242211342,
0.10468953102827072,
-0.052679646760225296,
0.02941131964325905,
-0.023067936301231384,
-0.07194484025239944,
-0.07755221426486969,
-0.1262805461883545,
-0.05622882768511772,
-0.0573393777012825,
0.05234479904174805,
-0.0510033555328846,
-0.07581569254398346,
0.167499840259552,
0.04449920356273651,
-0.05194031074643135,
0.11523064970970154,
0.06974126398563385,
-0.06794838607311249,
-0.026499774307012558,
0.115850530564785,
-0.02640567347407341,
-0.22603340446949005,
-0.060068368911743164,
0.03925483673810959,
0.022952357307076454,
-0.10014104843139648,
-0.13166411221027374,
0.086005300283432,
0.04399293288588524,
0.009618953801691532,
0.0075341472402215,
-0.2863560616970062,
-0.10746770352125168,
0.0032629023771733046,
0.06337392330169678,
0.09848590195178986,
-0.11875700950622559,
-0.03503255546092987,
-0.03305482864379883,
-0.01735202968120575,
0.03153234347701073,
0.006022348999977112,
0.12311701476573944,
-0.04783492162823677,
-0.008191651664674282,
0.003347412683069706,
-0.05136633291840553,
0.04329484701156616,
-0.020120052620768547,
0.06536184251308441,
-0.010406957939267159,
0.032913487404584885,
0.07078670710325241,
-0.07843468338251114,
0.16920053958892822,
-0.09623006731271744,
0.08390896767377853,
-0.1231733039021492,
-0.06342105567455292,
-0.0418955460190773,
-0.005495176650583744,
0.0032515558414161205,
-0.05066821351647377,
-0.08051416277885437,
-0.00015633465955033898,
0.059473805129528046,
-0.040007803589105606,
0.005744727328419685,
-0.00398654118180275,
-0.06822377443313599,
0.135827898979187,
0.028599994257092476,
-0.10111763328313828,
-0.23575611412525177,
0.02369290590286255,
-0.008980294689536095,
0.09255283325910568,
-0.2155044674873352,
0.013100413605570793,
0.08795872330665588,
0.031664252281188965,
0.059465352445840836,
0.032830994576215744,
-0.02343597263097763,
0.004858899861574173,
0.05360674485564232,
-0.0761445164680481,
-0.11667913943529129,
-0.036609161645174026,
-0.09519536793231964,
-0.14107322692871094,
0.0567152202129364,
0.07123613357543945,
-0.08229736238718033,
0.013972851447761059,
-0.006527388002723455,
-0.018726171925663948,
-0.08020667731761932,
0.2180669754743576,
0.02955974079668522,
0.06168528273701668,
-0.07116249203681946,
0.06867903470993042,
0.09317027777433395,
-0.16338731348514557,
0.0015136725269258022,
0.1626250147819519,
-0.10419196635484695,
-0.04870416969060898,
0.09378840029239655,
0.0021901109721511602,
-0.01599825918674469,
-0.07951626926660538,
-0.11713182181119919,
-0.07782941311597824,
0.062306132167577744,
-0.038464050740003586,
0.06169506534934044,
0.0670798122882843,
-0.03884069249033928,
0.009315555915236473,
-0.13369221985340118,
0.07619915902614594,
0.07337884604930878,
0.05077739804983139,
-0.14921435713768005,
0.14385966956615448,
0.04483770951628685,
0.08450328558683395,
-0.0037219906225800514,
0.025911126285791397,
-0.07015883177518845,
0.03452880680561066,
-0.014989093877375126,
-0.0037897599395364523,
-0.017673050984740257,
0.016596276313066483,
-0.03382750228047371,
0.045000188052654266,
-0.03959467634558678,
0.06036704033613205,
-0.026521066203713417,
-0.0465371198952198,
-0.03391174226999283,
0.02972269430756569,
-0.027171695604920387,
0.0014861726667732,
-0.020610535517334938,
-0.05381166562438011,
0.05641710385680199,
-0.05114702135324478,
-0.041466426104307175,
-0.03679601848125458,
0.008851711638271809,
0.019688529893755913,
0.03347637876868248,
0.05410134047269821,
-0.016621479764580727,
0.025121692568063736,
0.0347602441906929,
0.04788064956665039,
-0.009852685034275055,
-0.0140305794775486,
0.0670420378446579,
-0.1371622085571289,
-0.058343127369880676,
-0.13089469075202942,
-0.03186631202697754,
-0.06358852982521057,
0.03428942337632179,
0.09699777513742447,
0.09499942511320114,
0.09409057348966599,
-0.05896306037902832,
0.010782218538224697,
-0.1889849156141281,
-0.017783792689442635,
0.0488959439098835,
-0.019921360537409782,
-0.1166737899184227,
-0.06323403865098953,
0.06632310152053833,
-0.013016395270824432,
0.1432451605796814,
-0.012781573459506035,
0.05990142375230789,
0.009782479144632816,
-0.010171915404498577,
-0.0028103457298129797,
-0.019453519955277443,
0.18667462468147278,
-0.07095949351787567,
-0.04895414412021637,
-0.012345962226390839,
0.06111527234315872,
0.06151188910007477,
0.21793963015079498,
0.08132603019475937,
0.13087905943393707,
0.06994026154279709,
0.0901702418923378,
-0.08791062980890274,
0.015564607456326485,
-0.12163316458463669,
0.11242733895778656,
-0.024493440985679626,
0.10971635580062866,
-0.06726615875959396,
0.11687570810317993,
0.09818407148122787,
-0.1000770777463913,
0.07409161329269409,
0.024227671325206757,
-0.07233010977506638,
-0.0245191790163517,
-0.10360556095838547,
-0.07091062515974045,
-0.13694995641708374,
-0.02775830589234829,
-0.0594666413962841,
-0.03889988362789154,
0.1026318222284317,
0.018949171528220177,
0.002101516118273139,
0.19222937524318695,
0.0024822440464049578,
-0.05389432981610298,
0.05946779623627663,
0.02540000155568123,
0.03660259395837784,
0.08639847487211227,
-0.0032393878791481256,
0.06842178851366043,
-0.08996044844388962,
0.08490756899118423,
0.023980949074029922,
-0.009638147428631783,
0.03259192034602165,
0.04148918390274048,
-0.020133398473262787,
-0.056838300079107285,
0.019341733306646347,
0.0827968642115593,
0.1584138423204422,
0.01811162568628788,
-0.06798558682203293,
-0.047230012714862823,
0.16545096039772034,
-0.07893794775009155,
-0.06250153481960297,
-0.10745688527822495,
0.13902977108955383,
0.07251615822315216,
-0.021030861884355545,
0.03453044593334198,
-0.0717846229672432,
-0.018806559965014458,
0.27912697196006775,
0.11796409636735916,
-0.021472977474331856,
-0.029400300234556198,
0.04193003475666046,
-0.024164695292711258,
-0.017088957130908966,
0.1592494696378708,
0.025355566293001175,
0.21792860329151154,
-0.00933924037963152,
-0.014360662549734116,
-0.03186047449707985,
-0.04602337256073952,
-0.06415487825870514,
0.1595228612422943,
0.021487435325980186,
0.0466281920671463,
-0.07713479548692703,
0.02208489552140236,
0.05883849039673805,
-0.13619232177734375,
0.1567864716053009,
-0.07994095236063004,
-0.07347992062568665,
0.021804064512252808,
0.021595321595668793,
-0.005630057770758867,
0.05193028971552849,
-0.03754662349820137,
0.09362387657165527,
0.042531345039606094,
-0.028217250481247902,
-0.11838848888874054,
-0.12445792555809021,
0.06137676537036896,
0.02448994666337967,
0.13956360518932343,
0.01742609776556492,
0.0489087849855423,
0.08407339453697205,
-0.0007243239087983966,
-0.11347027122974396,
0.0879163146018982,
0.015541593544185162,
-0.034138165414333344,
0.07447833567857742,
0.034911636263132095,
-0.04171585664153099,
0.07341889292001724,
-0.0036179828457534313,
-0.05756581947207451,
-0.02840670570731163,
-0.021479617804288864,
-0.011883474886417389,
-0.1693154275417328,
-0.009014613926410675,
-0.03998177871108055,
0.12612520158290863,
0.19392989575862885,
-0.04013264179229736,
-0.020519748330116272,
-0.0660516768693924,
0.041354045271873474,
0.007636989001184702,
0.03141171112656593,
-0.00247966474853456,
-0.13684242963790894,
0.00813543051481247,
-0.04440930858254433,
-0.000539697241038084,
-0.18873044848442078,
-0.06095726788043976,
-0.028791159391403198,
-0.03461150825023651,
-0.06970085203647614,
0.12055715918540955,
0.03779023140668869,
0.0219071414321661,
-0.03818310424685478,
-0.06577086448669434,
-0.03142056241631508,
0.057426270097494125,
-0.13118880987167358,
-0.1105760708451271
] |
null | null | transformers |
# CodeTrans model for code documentation generation php
Pretrained model on programming language php using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized php code functions: it works best with tokenized php functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the code documentation generation task for the php function/method.
## Intended uses & limitations
The model can be used to generate a description for a php function, or be fine-tuned on other php code tasks. It works on unparsed and untokenized php code, but performance should be better if the php code is tokenized.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_php_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_php_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"
pipeline([tokenized_code])
```
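Continuing from the snippet above (reusing its `pipeline` and `tokenized_code`), the pipeline also accepts a list, so several functions can be documented in one call. The second function here is a made-up example for illustration:

```python
functions = [
    tokenized_code,
    "public function getName ( ) { return $ this -> name ; }",
]
# Each result is a dict with a "summary_text" field.
for result in pipeline(functions):
    print(result["summary_text"])
```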
Run this example in [this colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/php/base_model.ipynb).
## Training data
The datasets for the supervised training tasks can be downloaded from [this link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1).
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 65,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing php code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
|   CodeTrans-ST-Small    |     17.31      |     16.65      |     16.89      |     23.05      |      9.19      |     13.70      |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_php_transfer_learning_finetune | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation php
=====================================================
Pretrained model on programming language php using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized php code functions: it works best with tokenized php functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the code documentation generation task for the php function/method.
Intended uses & limitations
---------------------------
The model can be used to generate a description for a php function, or be fine-tuned on other php code tasks. It works on unparsed and untokenized php code, but performance should be better if the php code is tokenized.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
Run this example in the colab notebook.
Training data
-------------
The datasets for the supervised training tasks can be downloaded from Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 65,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing php code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 65,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 65,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
61,
87,
109
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 65,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.117219477891922,
0.08804841339588165,
-0.001519340556114912,
0.11689861118793488,
0.045886848121881485,
0.020135954022407532,
0.021988868713378906,
0.09380888193845749,
-0.037874121218919754,
0.064674511551857,
0.0658736303448677,
-0.07521919161081314,
0.046096619218587875,
0.17776533961296082,
0.029126746580004692,
-0.1632440984249115,
-0.03891948238015175,
0.037114039063453674,
-0.11614365130662918,
0.11101966351270676,
0.07018791884183884,
-0.08814868330955505,
0.05964062735438347,
-0.03958059474825859,
-0.08429678529500961,
0.050389986485242844,
-0.03221781551837921,
-0.014154752716422081,
0.09003761410713196,
0.059325672686100006,
0.10493346303701401,
-0.03320902958512306,
0.05686779320240021,
-0.19772186875343323,
0.006472650915384293,
0.029724784195423126,
0.05484980717301369,
0.03249208256602287,
0.04967859759926796,
0.07973206043243408,
0.08194421231746674,
-0.008643504232168198,
0.034275684505701065,
0.062377117574214935,
-0.05719461292028427,
-0.08357104659080505,
-0.05181717127561569,
0.07390304654836655,
0.07472839206457138,
0.10698530077934265,
-0.004682771861553192,
-0.002335724188014865,
-0.08873486518859863,
0.06764162331819534,
0.11816275864839554,
-0.22363387048244476,
-0.01750705949962139,
0.10550793260335922,
0.06468922644853592,
0.032780926674604416,
-0.08632396161556244,
-0.03437875956296921,
0.09256115555763245,
0.03685181587934494,
0.06068462133407593,
-0.09198542684316635,
-0.0012543713673949242,
0.007954857312142849,
-0.05561757832765579,
-0.06386463344097137,
0.15946629643440247,
0.059355851262807846,
-0.04180753603577614,
-0.11223798990249634,
-0.06171424686908722,
-0.16486677527427673,
0.02087116613984108,
0.010099219158291817,
0.02372782677412033,
0.005402417853474617,
-0.012149540707468987,
-0.029109567403793335,
-0.09638704359531403,
-0.12496940791606903,
0.026058601215481758,
0.013501725159585476,
0.06478022783994675,
0.03037572093307972,
-0.014060167595744133,
0.09383045136928558,
0.016276072710752487,
-0.039148326963186264,
-0.017534317448735237,
0.026208877563476562,
-0.08917616307735443,
0.03816382959485054,
-0.01418143417686224,
-0.05850386247038841,
0.013392360880970955,
0.10777227580547333,
-0.11680973321199417,
0.08777827769517899,
0.12039836496114731,
0.010188862681388855,
-0.01625656709074974,
0.20757143199443817,
0.04386792704463005,
-0.14566707611083984,
0.020179275423288345,
0.03228829801082611,
-0.008651473559439182,
0.029225321486592293,
-0.05655738338828087,
-0.05448770523071289,
0.017260529100894928,
0.05902925133705139,
-0.12713280320167542,
0.023984922096133232,
-0.05029961094260216,
-0.008939919993281364,
0.09272147715091705,
-0.12016808986663818,
0.03215614706277847,
0.02130788192152977,
-0.05660874396562576,
-0.04264194145798683,
0.0839933454990387,
-0.13724303245544434,
-0.1424221396446228,
0.03338370472192764,
-0.040468864142894745,
-0.04033710062503815,
-0.10895448178052902,
-0.09625443071126938,
0.0006322244880720973,
-0.03289489448070526,
-0.005860015284270048,
-0.10363893210887909,
-0.08677192777395248,
-0.026049792766571045,
0.04360362887382507,
0.002176597248762846,
-0.02913469634950161,
-0.052963558584451675,
0.019911108538508415,
0.00015470004291273654,
-0.03059449978172779,
0.03402051702141762,
-0.04421839490532875,
0.09703482687473297,
0.07762573659420013,
0.03930129110813141,
0.0011980780400335789,
0.02785329893231392,
-0.07934805750846863,
0.07647913694381714,
-0.10544252395629883,
0.06483177840709686,
-0.008850272744894028,
0.050490692257881165,
-0.11928200721740723,
-0.08685405552387238,
-0.01663191430270672,
0.04678606986999512,
0.07988359779119492,
0.05541303753852844,
-0.08419280499219894,
0.01792137324810028,
0.16913796961307526,
-0.09293054789304733,
-0.13093462586402893,
0.1074015274643898,
-0.003924823831766844,
0.02578909695148468,
0.06487318873405457,
0.14449593424797058,
0.1351115107536316,
-0.10743892192840576,
-0.0188890527933836,
0.09012940526008606,
0.06922116130590439,
-0.0967051312327385,
0.0718064233660698,
-0.01501525193452835,
0.0014658912550657988,
0.0366940051317215,
0.04517455771565437,
0.05897518992424011,
0.008288341574370861,
-0.03840334340929985,
-0.03289615362882614,
-0.09403452277183533,
-0.0329274982213974,
0.009286249987781048,
0.026784414425492287,
-0.046380192041397095,
-0.0721600353717804,
0.02470402419567108,
0.16354988515377045,
-0.1101706475019455,
0.0378446951508522,
-0.06744952499866486,
-0.01792551949620247,
-0.07732332497835159,
0.0325997956097126,
-0.13052751123905182,
0.026038680225610733,
0.04222479090094566,
-0.02293766289949417,
0.06130286678671837,
0.07624460756778717,
0.0322832316160202,
0.001487871166318655,
-0.06538159400224686,
-0.04361617937684059,
-0.041792940348386765,
-0.07344077527523041,
-0.1194492056965828,
-0.020817872136831284,
-0.0769994929432869,
-0.01729772426187992,
-0.027498286217451096,
-0.16686762869358063,
-0.0022340985015034676,
-0.006031872238963842,
0.019524753093719482,
0.012111212126910686,
-0.03405952826142311,
0.03137727454304695,
0.03961622714996338,
-0.026095779612660408,
-0.07270336896181107,
0.0346117727458477,
0.04080545902252197,
-0.08261536806821823,
-0.040296729654073715,
-0.07522046566009521,
-0.04251294210553169,
0.0815674439072609,
0.07290609925985336,
-0.10124605149030685,
-0.05964609980583191,
-0.029194403439760208,
-0.041155073791742325,
-0.03576622158288956,
-0.05632520094513893,
0.16687360405921936,
0.012752389535307884,
0.18256732821464539,
-0.15234223008155823,
-0.06900539994239807,
-0.018448345363140106,
0.011510656215250492,
0.04236272722482681,
0.16468337178230286,
0.015054354444146156,
-0.06201811134815216,
0.045391883701086044,
-0.005469338037073612,
-0.05444277822971344,
0.13953764736652374,
-0.03432886302471161,
-0.07861797511577606,
0.010193933732807636,
0.11694684624671936,
-0.006690778769552708,
0.17224319279193878,
-0.0890139564871788,
0.007070229388773441,
-0.008473768830299377,
0.02384837344288826,
0.03695046529173851,
-0.13515424728393555,
0.031008707359433174,
0.03607330098748207,
-0.08101122081279755,
-0.020959967747330666,
-0.024569900706410408,
-0.04967408999800682,
0.04311272129416466,
0.03081933967769146,
0.037146247923374176,
-0.017487619072198868,
-0.02200518175959587,
-0.1019197553396225,
0.20157712697982788,
-0.06942608952522278,
-0.20126019418239594,
-0.15834158658981323,
0.09785409271717072,
-0.03235640749335289,
-0.02500244788825512,
0.03159394487738609,
-0.12438859790563583,
-0.05803250893950462,
-0.09830067306756973,
0.08910094201564789,
-0.12123356759548187,
0.0054542324505746365,
-0.048224806785583496,
0.06260227411985397,
0.07049092650413513,
-0.16851021349430084,
0.025046752765774727,
-0.028921637684106827,
0.011725425720214844,
-0.03074418380856514,
-0.05111950263381004,
0.0890679582953453,
0.12236639857292175,
-0.060464996844530106,
0.03694000095129013,
-0.0053106145933270454,
0.1755097657442093,
-0.05144704505801201,
0.030010011047124863,
0.18699532747268677,
0.03371141478419304,
0.036365360021591187,
0.03397829830646515,
0.018537074327468872,
-0.08716994524002075,
0.057097963988780975,
0.07387398928403854,
-0.031797654926776886,
-0.22364279627799988,
-0.03055742010474205,
-0.07806256413459778,
0.03847137838602066,
0.13308143615722656,
0.0664885938167572,
-0.14984431862831116,
0.02724127285182476,
-0.009788652881979942,
0.13774999976158142,
-0.0377420075237751,
0.049210913479328156,
0.06602410227060318,
0.006842178758233786,
-0.006040932144969702,
-0.10399654507637024,
-0.0071242935955524445,
0.08172047138214111,
0.11978687345981598,
0.20500479638576508,
-0.10279794782400131,
0.17160794138908386,
0.013911422342061996,
0.10634751617908478,
0.016969291493296623,
0.0905512347817421,
-0.135105699300766,
0.01707041822373867,
0.0051891133189201355,
-0.018450934439897537,
-0.058310557156801224,
0.04520422965288162,
-0.038016799837350845,
0.04848475381731987,
-0.08168094605207443,
-0.016905205324292183,
0.024943526834249496,
0.20548850297927856,
0.07307766377925873,
-0.16196662187576294,
-0.12513180077075958,
-0.00589686818420887,
-0.10105933248996735,
-0.10468938946723938,
0.07494305819272995,
0.19386789202690125,
-0.04516436532139778,
0.016811924055218697,
-0.012549283914268017,
0.14045587182044983,
-0.09799055010080338,
-0.025497611612081528,
0.032102447003126144,
0.0583324059844017,
0.004456869326531887,
0.12025582045316696,
-0.2421356737613678,
0.09695284068584442,
0.02047925442457199,
0.09763336926698685,
-0.018372075632214546,
0.047928791493177414,
-0.04153717681765556,
-0.02181326411664486,
0.0812615379691124,
0.011732792481780052,
-0.009162520989775658,
-0.1930205225944519,
-0.0446278378367424,
0.03242061659693718,
0.0495714396238327,
-0.003487617475911975,
0.08146973699331284,
-0.02766668051481247,
0.03817471116781235,
-0.023152047768235207,
-0.14505787193775177,
-0.0525079183280468,
-0.13883307576179504,
-0.05690661072731018,
-0.005851409398019314,
-0.04767031595110893,
-0.014169681817293167,
0.04552086442708969,
0.03430340066552162,
0.21443282067775726,
-0.16585974395275116,
-0.0912962332367897,
-0.098445363342762,
0.07743152230978012,
0.12769782543182373,
-0.09835261851549149,
0.042928144335746765,
-0.0013630001340061426,
0.015137381851673126,
-0.027341751381754875,
-0.06891212612390518,
0.03243499994277954,
-0.045245375484228134,
-0.08278153836727142,
-0.023635007441043854,
0.1082611083984375,
-0.01972675696015358,
0.03975587710738182,
-0.00020845620019827038,
-0.09125293791294098,
-0.0461781844496727,
-0.13301615417003632,
-0.07573224604129791,
-0.03635787218809128,
0.06683005392551422,
-0.0281179528683424,
-0.10500975698232651,
0.1169089525938034,
0.014274599961936474,
-0.07640501856803894,
0.08523008972406387,
0.15864776074886322,
-0.0707782506942749,
0.013640332035720348,
0.12959499657154083,
-0.05182862654328346,
-0.19234859943389893,
-0.04841538891196251,
0.03394033759832382,
0.055119261145591736,
-0.057123176753520966,
-0.15476666390895844,
0.07539951801300049,
0.0067000482231378555,
0.01346404105424881,
0.009099922142922878,
-0.26890623569488525,
-0.13299424946308136,
0.007330591324716806,
0.07461141049861908,
0.08298580348491669,
-0.11645612865686417,
-0.04129187762737274,
-0.05259377509355545,
-0.05156745761632919,
0.042757824063301086,
0.06855777651071548,
0.1149333268404007,
-0.040819548070430756,
0.024294551461935043,
0.03515680879354477,
-0.03184378147125244,
0.05752158164978027,
-0.02148563414812088,
0.09492138773202896,
-0.017289254814386368,
0.03751617297530174,
0.04959055036306381,
-0.060862280428409576,
0.18060511350631714,
-0.17122918367385864,
0.10533274710178375,
-0.19911932945251465,
-0.062229570001363754,
-0.021009346470236778,
-0.013934736140072346,
-0.026315739378333092,
-0.0449535958468914,
-0.11523837596178055,
0.027269575744867325,
0.05204886943101883,
-0.02507678233087063,
0.021164005622267723,
-0.018665259703993797,
-0.06926421821117401,
0.07273925095796585,
0.08127666264772415,
-0.023387940600514412,
-0.18032561242580414,
0.036416444927453995,
0.015911610797047615,
0.09970729053020477,
-0.2318400889635086,
0.024533027783036232,
0.10767380893230438,
0.02760198712348938,
0.08355261385440826,
0.010763724334537983,
-0.07719536125659943,
0.029705461114645004,
0.06323118507862091,
-0.06392538547515869,
-0.12897811830043793,
-0.013793970458209515,
-0.08071151375770569,
-0.11643781512975693,
0.0442197322845459,
0.07470406591892242,
-0.051189977675676346,
-0.0009947243379428983,
-0.003926346078515053,
0.0007533511379733682,
-0.07308689504861832,
0.21007655560970306,
0.04098489135503769,
0.0668017789721489,
-0.06253151595592499,
0.08276434987783432,
0.08893576264381409,
-0.13332515954971313,
0.03993364796042442,
0.15454313158988953,
-0.0857335701584816,
-0.02572687901556492,
0.09177844226360321,
0.09143566340208054,
-0.022395215928554535,
-0.0631333738565445,
-0.10111476480960846,
-0.06406820565462112,
0.038647837936878204,
0.018810108304023743,
0.07268493622541428,
0.07044212520122528,
-0.02024023048579693,
0.009452184662222862,
-0.10571113973855972,
0.10132379829883575,
0.09090347588062286,
0.033824268728494644,
-0.12182832509279251,
0.11243560165166855,
0.04240332543849945,
0.08069059997797012,
0.0039000941906124353,
0.022106824442744255,
-0.11413736641407013,
0.04181913286447525,
-0.04192524403333664,
0.040735047310590744,
-0.0011790388962253928,
0.051648225635290146,
-0.047690946608781815,
0.03763984888792038,
-0.024384893476963043,
0.05562262609601021,
-0.0378495529294014,
-0.0255038570612669,
-0.02215011976659298,
0.031200870871543884,
-0.07426135241985321,
-0.009641353040933609,
0.000434834451880306,
-0.06996521353721619,
0.09980366379022598,
-0.05998747795820236,
-0.01948215626180172,
-0.001695017097517848,
0.005302076693624258,
0.05479587987065315,
0.016026170924305916,
0.05530586838722229,
-0.022731073200702667,
0.02986440248787403,
0.05085887387394905,
0.021766141057014465,
0.0015714461915194988,
-0.004707189276814461,
0.10603887587785721,
-0.14547526836395264,
-0.07458122074604034,
-0.11348355561494827,
-0.07083740085363388,
-0.06502122431993484,
0.0735553503036499,
0.08351816982030869,
0.08009815216064453,
0.0876922607421875,
-0.040720775723457336,
-0.006554922554641962,
-0.17539171874523163,
-0.04030961915850639,
0.05868338793516159,
-0.007560959085822105,
-0.12061154842376709,
-0.04987942799925804,
0.05724179744720459,
-0.031501468271017075,
0.12183455377817154,
-0.0047951252199709415,
0.061564285308122635,
-0.011736024171113968,
-0.04134763404726982,
-0.010335588827729225,
-0.010763939470052719,
0.17044338583946228,
-0.09395081549882889,
-0.002291756449267268,
-0.013145804405212402,
0.024078775197267532,
0.04551875591278076,
0.19566701352596283,
0.08317409455776215,
0.14452552795410156,
0.03051063045859337,
0.08866362273693085,
-0.06960452347993851,
-0.018864037469029427,
-0.14150305092334747,
0.10041686147451401,
-0.024851620197296143,
0.05561615899205208,
-0.058089517056941986,
0.15074513852596283,
0.10226672142744064,
-0.130266010761261,
0.10533927381038666,
0.018847959116101265,
-0.0910925418138504,
-0.04704957455396652,
-0.11088863760232925,
-0.052577223628759384,
-0.12945695221424103,
0.011125477962195873,
-0.09118622541427612,
-0.0081920912489295,
0.08029193431138992,
0.038661882281303406,
-0.02032841183245182,
0.1510363072156906,
-0.0006992861744947731,
-0.05556488782167435,
0.040062662214040756,
0.0388900488615036,
0.03303765878081322,
0.11168580502271652,
0.013057789765298367,
0.07582734525203705,
-0.08775508403778076,
0.06819348782300949,
0.04007014259696007,
-0.009337286464869976,
0.013493044301867485,
0.02969798818230629,
0.006351899355649948,
-0.06263861060142517,
0.01882966421544552,
0.0871858224272728,
0.20239867269992828,
0.04544104263186455,
-0.05259154364466667,
-0.050417691469192505,
0.17050707340240479,
-0.044844403862953186,
-0.0595504492521286,
-0.12441203743219376,
0.153898224234581,
0.04792279005050659,
0.007686808705329895,
0.02299121394753456,
-0.07725660502910614,
-0.016962561756372452,
0.27321118116378784,
0.048635758459568024,
-0.04573207348585129,
-0.02775096334517002,
0.010684681124985218,
-0.01013301033526659,
-0.019769759848713875,
0.17036563158035278,
0.011913768015801907,
0.25841090083122253,
0.008983134292066097,
-0.03298843279480934,
-0.035254936665296555,
-0.04908709228038788,
-0.020756054669618607,
0.2044517695903778,
-0.030164124444127083,
0.025876063853502274,
-0.09350915998220444,
-0.012863630428910255,
0.02632916532456875,
-0.11107128858566284,
0.13409069180488586,
-0.11384750157594681,
-0.06754102557897568,
0.010238309390842915,
0.03685033693909645,
-0.029193757101893425,
0.027110137045383453,
-0.02105552889406681,
0.07092158496379852,
0.0358760692179203,
-0.03176650404930115,
-0.11239901930093765,
-0.15143737196922302,
0.050362709909677505,
0.019785389304161072,
0.1530381739139557,
0.02235095389187336,
0.08191852271556854,
0.0864172950387001,
0.013805060647428036,
-0.07846897095441818,
0.0795404389500618,
0.039612799882888794,
-0.019551703706383705,
0.0612185038626194,
0.07545401155948639,
-0.03698868677020073,
0.12768027186393738,
0.0027863800060003996,
0.0129758445546031,
-0.013925704173743725,
-0.016855759546160698,
-0.020393580198287964,
-0.17690624296665192,
0.0058785732835531235,
-0.07125460356473923,
0.12621693313121796,
0.18025587499141693,
-0.043273042887449265,
-0.020928913727402687,
-0.06267809867858887,
0.0788343995809555,
-0.01869158260524273,
0.05235602334141731,
-0.00023314860300160944,
-0.14773984253406525,
0.014119848608970642,
-0.006910758558660746,
-0.001838970580138266,
-0.1986195147037506,
-0.04538094997406006,
-0.027159763500094414,
-0.015703829005360603,
-0.07508908957242966,
0.15495403110980988,
0.05634186044335365,
0.033905260264873505,
-0.034434132277965546,
-0.1390434354543686,
-0.019243702292442322,
0.05211346223950386,
-0.12723585963249207,
-0.10952966660261154
] |
null | null | transformers |
# CodeTrans model for code documentation generation python
Pretrained model on programming language python using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used single-task training on the CodeSearchNet Corpus python dataset.
## Intended uses & limitations
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# Build a summarization pipeline around the pretrained CodeTrans checkpoint.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_python"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_python", skip_special_tokens=True),
    device=0  # first GPU; use device=-1 to run on CPU
)

# The model works best on a tokenized python function (tokens separated by spaces).
tokenized_code = "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"
pipeline([tokenized_code])
```
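The call returns a list with one dict per input; the generated documentation string is under the `summary_text` key.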
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/function%20documentation%20generation/python/base_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_python | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us
| CodeTrans model for code documentation generation python
========================================================
Pretrained model on programming language python using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used single-task training on the CodeSearchNet Corpus python dataset.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
| [
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
50,
112
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.09369116276502609,
0.0051415846683084965,
0.00004991525929654017,
0.06199810281395912,
0.13274241983890533,
0.018632490187883377,
0.07615751028060913,
0.07368475198745728,
-0.005356624722480774,
-0.02868969738483429,
0.09994233399629593,
0.12846828997135162,
0.01057368889451027,
0.14057694375514984,
-0.01905764825642109,
-0.20768558979034424,
0.012873741798102856,
0.0588982068002224,
-0.17237313091754913,
0.14727622270584106,
0.10698788613080978,
-0.04363436996936798,
0.09342916309833527,
0.006013191770762205,
-0.21378910541534424,
0.04984449967741966,
-0.003177658189088106,
-0.07883273810148239,
0.12636946141719818,
0.05907556787133217,
0.14526988565921783,
0.028193769976496696,
0.007222828455269337,
-0.19252866506576538,
0.032748814672231674,
-0.02027803659439087,
-0.006691041402518749,
0.04682968184351921,
0.0210060216486454,
-0.07514747977256775,
0.2281477302312851,
-0.024417389184236526,
0.051526088267564774,
0.04213995859026909,
-0.11933218687772751,
-0.13941477239131927,
-0.03040892817080021,
-0.003367447294294834,
0.05384141206741333,
0.08354886621236801,
0.0015884648310020566,
0.15762090682983398,
-0.15216194093227386,
0.1285298466682434,
0.10809463262557983,
-0.19463017582893372,
-0.02107836678624153,
0.15551503002643585,
0.11099279671907425,
-0.052073486149311066,
-0.02367408759891987,
0.013090419583022594,
0.06806696951389313,
0.013850434683263302,
0.02497265487909317,
-0.11648987233638763,
-0.13565874099731445,
0.051052939146757126,
-0.10859932750463486,
-0.08162233978509903,
0.2628716826438904,
-0.02652968280017376,
-0.044269584119319916,
-0.058984410017728806,
-0.04100494831800461,
-0.010791700333356857,
-0.007887638173997402,
0.02151966653764248,
-0.009568453766405582,
-0.01793307065963745,
-0.04507520794868469,
0.0045883492566645145,
-0.09479857981204987,
-0.09849245846271515,
-0.007977139204740524,
0.12740905582904816,
0.0071333288215100765,
0.025771088898181915,
-0.14511501789093018,
0.10805685818195343,
0.07382757216691971,
-0.06090080738067627,
0.018432224169373512,
-0.06347429007291794,
-0.04842778295278549,
-0.012494654394686222,
-0.07359819859266281,
-0.1180461198091507,
0.07455478608608246,
0.1086287647485733,
-0.03246746212244034,
0.04239729791879654,
0.034663256257772446,
0.05959700793027878,
0.04362039268016815,
0.1911630928516388,
-0.024285651743412018,
-0.045078106224536896,
0.03442162647843361,
-0.017822328954935074,
-0.043326932936906815,
-0.006265624426305294,
-0.07092853635549545,
-0.048247791826725006,
0.042079631239175797,
0.1177034005522728,
-0.046615250408649445,
0.09866470843553543,
-0.06925150752067566,
-0.025472035631537437,
-0.05137183889746666,
-0.12816807627677917,
-0.0030994638800621033,
0.008630628697574139,
-0.05721399560570717,
-0.017362195998430252,
0.1407494693994522,
-0.07301514595746994,
-0.08056212961673737,
-0.0009519636514596641,
-0.0815199688076973,
0.006877859588712454,
-0.09289836883544922,
-0.12853392958641052,
0.014819484204053879,
0.05701198801398277,
0.06156638637185097,
-0.13544020056724548,
-0.1250990629196167,
0.0022801575250923634,
0.09136981517076492,
0.01798084005713463,
0.027378737926483154,
-0.06253855675458908,
-0.03160271421074867,
-0.008187389932572842,
-0.010774802416563034,
0.019658561795949936,
-0.07930698990821838,
0.0978803038597107,
0.0739983469247818,
0.05779218673706055,
-0.06525186449289322,
0.04113795608282089,
-0.11357557028532028,
0.07135026156902313,
-0.145437091588974,
0.0731431171298027,
-0.0765787735581398,
0.12039068341255188,
-0.10444580018520355,
-0.08742258697748184,
0.0488097220659256,
0.06755941361188889,
0.05665162205696106,
0.13625162839889526,
-0.122328020632267,
-0.05296911671757698,
0.14770695567131042,
-0.10264494270086288,
-0.2005617469549179,
0.08992752432823181,
-0.07655935734510422,
0.19232934713363647,
0.05552908405661583,
0.16931112110614777,
0.16595523059368134,
-0.10689949244260788,
0.0359438955783844,
0.0844646766781807,
-0.02953161671757698,
-0.033538222312927246,
0.07705193012952805,
0.06551113724708557,
-0.13259978592395782,
0.046912647783756256,
-0.02428823709487915,
0.12098328769207001,
-0.03989260643720627,
-0.046858299523591995,
-0.02832440286874771,
-0.0523616261780262,
0.10440177470445633,
-0.003999806474894285,
0.0774945467710495,
-0.008626624941825867,
-0.06681472063064575,
0.0935283899307251,
0.13061846792697906,
-0.1240067109465599,
0.0032209642231464386,
-0.10577084869146347,
0.11100885272026062,
-0.0973648950457573,
0.013751850463449955,
-0.19801321625709534,
-0.03526008129119873,
-0.020853951573371887,
0.04331387206912041,
0.06250029057264328,
0.06748440861701965,
0.01686127856373787,
-0.01240373495966196,
0.019466523081064224,
0.007023009005934,
-0.0009149292600341141,
-0.012493427842855453,
-0.04518498107790947,
-0.06783617287874222,
-0.05995391309261322,
-0.04907216131687164,
0.07019945979118347,
-0.18940064311027527,
0.011659340932965279,
0.056438714265823364,
0.05332347750663757,
0.01125402096658945,
0.02538544312119484,
0.03021254576742649,
0.06595654785633087,
-0.05918404087424278,
-0.01645977608859539,
0.054128676652908325,
-0.00038102813414298,
-0.10857893526554108,
0.03028169274330139,
-0.10506269335746765,
0.06347604095935822,
0.1389487236738205,
-0.1353733092546463,
-0.07665655761957169,
-0.007059774361550808,
-0.023052237927913666,
-0.0036699161864817142,
0.007894407957792282,
-0.030997324734926224,
0.16974949836730957,
-0.010270528495311737,
0.16375696659088135,
-0.09707465767860413,
-0.026305189356207848,
-0.03477493301033974,
-0.021753903478384018,
0.026857133954763412,
0.13974130153656006,
0.05294252932071686,
-0.13506345450878143,
0.06208149343729019,
0.08324646204710007,
-0.04651876538991928,
0.17270486056804657,
-0.04241709038615227,
-0.03975923731923103,
-0.01395199354737997,
0.07669690251350403,
-0.018356934189796448,
0.15584643185138702,
-0.18146023154258728,
-0.03196215257048607,
0.021873250603675842,
-0.008394995704293251,
0.10038352757692337,
-0.12524983286857605,
-0.014476696960628033,
0.04726899787783623,
-0.023974578827619553,
-0.15901613235473633,
0.0574607327580452,
0.021570587530732155,
0.04041954502463341,
-0.00008248187805293128,
-0.029556885361671448,
0.018550913780927658,
-0.01268254965543747,
-0.11918295174837112,
0.23817354440689087,
-0.0751710832118988,
-0.2724089026451111,
-0.16354656219482422,
0.001856415532529354,
-0.005876004695892334,
-0.024974314495921135,
0.0574427954852581,
-0.05692126229405403,
-0.028094667941331863,
-0.03779640048742294,
0.155324324965477,
-0.07269295305013657,
-0.028642643243074417,
-0.05918622389435768,
0.05481616407632828,
0.012238219380378723,
-0.19058489799499512,
0.005157894920557737,
0.011777833104133606,
0.03853992745280266,
0.038468699902296066,
-0.14439436793327332,
0.10012692958116531,
0.10755106806755066,
-0.06520525366067886,
0.03765013813972473,
-0.040786080062389374,
0.2553946375846863,
-0.07607436180114746,
-0.09041818976402283,
0.14685550332069397,
-0.08224945515394211,
0.007742907851934433,
0.037820421159267426,
0.0011395461624488235,
-0.10878767818212509,
0.03284309804439545,
-0.04565548524260521,
-0.08009078353643417,
-0.21085943281650543,
-0.12163998931646347,
-0.09427570551633835,
0.11153509467840195,
0.06687898188829422,
0.02340569905936718,
-0.09575840830802917,
0.06628318876028061,
0.07270919531583786,
0.10724809765815735,
-0.013092177920043468,
0.07772649824619293,
0.07864576578140259,
-0.006114135961979628,
0.029470104724168777,
-0.1013314500451088,
-0.05609568580985069,
0.031328070908784866,
0.08598548918962479,
0.18608710169792175,
-0.005873170215636492,
0.11061207950115204,
0.058871153742074966,
0.08001893758773804,
0.04867611825466156,
0.16139519214630127,
-0.08443120867013931,
0.014893756248056889,
0.007732245605438948,
-0.034007057547569275,
-0.1257813423871994,
0.0297565720975399,
0.012185333296656609,
0.00969956535845995,
-0.1335245668888092,
-0.09160619974136353,
0.042467888444662094,
0.05063799023628235,
0.053158074617385864,
-0.274722158908844,
-0.10349435359239578,
0.020797248929739,
-0.08462639898061752,
-0.05765063688158989,
0.04222457483410835,
0.08606188744306564,
-0.1308802366256714,
0.004660140257328749,
-0.0654304176568985,
0.15252748131752014,
-0.033305294811725616,
0.005759978201240301,
-0.08414351940155029,
-0.07651036977767944,
-0.005539372097700834,
0.1411002278327942,
-0.20228618383407593,
0.23484107851982117,
-0.009781255386769772,
-0.0017848499119281769,
-0.061153050512075424,
0.024556398391723633,
0.017839165404438972,
0.09951543062925339,
0.0980428084731102,
-0.017503604292869568,
-0.02878388948738575,
-0.17268607020378113,
0.01576540246605873,
0.07997352629899979,
0.05169399082660675,
-0.01066511869430542,
0.09199990332126617,
-0.037528783082962036,
0.0405423678457737,
0.0016569964354857802,
-0.027602143585681915,
-0.05367187038064003,
-0.1091528981924057,
-0.002833245089277625,
-0.07212656736373901,
0.04807410389184952,
-0.02697400376200676,
-0.007258198224008083,
0.016886934638023376,
0.174085333943367,
-0.034147147089242935,
-0.06979407370090485,
-0.10873915255069733,
0.03202921152114868,
0.1279200315475464,
-0.07471266388893127,
0.03820711746811867,
-0.00524662621319294,
0.012545495294034481,
-0.010808411985635757,
-0.15623068809509277,
0.06325679272413254,
-0.05920936539769173,
0.0077079907059669495,
-0.009076585061848164,
0.0904579609632492,
-0.02462150901556015,
0.022888977080583572,
0.05748946592211723,
-0.023955747485160828,
-0.09335338324308395,
-0.1213788166642189,
-0.132715106010437,
-0.06914052367210388,
0.037862587720155716,
0.07864898443222046,
-0.1124102920293808,
0.028760740533471107,
-0.014430797658860683,
0.008848307654261589,
0.23278672993183136,
0.13744807243347168,
-0.06440477073192596,
0.01940287835896015,
0.047846607863903046,
-0.07286668568849564,
-0.2678578794002533,
0.0017509165918454528,
-0.026012493297457695,
0.08420107513666153,
0.0381132997572422,
-0.13499322533607483,
0.07397853583097458,
-0.01670501008629799,
0.029425717890262604,
0.036705899983644485,
-0.28406572341918945,
-0.1102597787976265,
0.12905418872833252,
0.10824289917945862,
0.09459856152534485,
-0.1340804547071457,
-0.021012993529438972,
-0.07086332887411118,
-0.18572905659675598,
0.16762787103652954,
-0.09823232889175415,
0.10158856958150864,
-0.0031603588722646236,
0.10813543945550919,
0.03264850750565529,
-0.045891404151916504,
0.13487382233142853,
-0.021376769989728928,
0.07148635387420654,
-0.028565851971507072,
-0.10045532882213593,
0.08936062455177307,
-0.04655131325125694,
0.13371193408966064,
-0.12775646150112152,
0.08220748603343964,
-0.27151212096214294,
-0.028442084789276123,
-0.039449065923690796,
0.05911446735262871,
-0.014095891267061234,
-0.0586196705698967,
-0.1094178780913353,
0.012035727500915527,
0.039536427706480026,
0.010417401790618896,
0.11022476851940155,
-0.05122274532914162,
0.053745776414871216,
0.08640247583389282,
0.12785933911800385,
-0.000919685116969049,
-0.07912933826446533,
0.06697896122932434,
0.01743978261947632,
0.09605202823877335,
-0.26048463582992554,
0.06489359587430954,
0.12407876551151276,
0.042943477630615234,
0.10296357423067093,
0.079932302236557,
-0.03032355196774006,
0.03911588713526726,
0.09688015282154083,
-0.1121119037270546,
-0.09237667173147202,
-0.02824806608259678,
-0.08203192800283432,
-0.03756370022892952,
0.042212944477796555,
0.1418299823999405,
-0.07146547734737396,
-0.015626151114702225,
0.002991589019075036,
-0.030467625707387924,
-0.14192764461040497,
0.12860868871212006,
0.046534255146980286,
0.07609560340642929,
-0.08295980095863342,
0.029435941949486732,
0.04486766457557678,
-0.12120755761861801,
-0.02852684073150158,
0.09148044884204865,
-0.11962507665157318,
-0.08321499079465866,
-0.022446170449256897,
0.19546851515769958,
-0.12092512100934982,
-0.04786977916955948,
-0.1090717539191246,
-0.05871499329805374,
-0.01004059799015522,
0.20776604115962982,
0.12234200537204742,
0.08945067971944809,
-0.0380101278424263,
-0.014391276054084301,
-0.10435944050550461,
0.05342716723680496,
0.10176798701286316,
0.015205918811261654,
-0.11267005652189255,
0.1365167498588562,
0.0012013926170766354,
0.13035602867603302,
-0.06297823041677475,
-0.037749048322439194,
-0.18938036262989044,
0.081887386739254,
-0.1318766325712204,
0.044986292719841,
-0.06280604004859924,
0.02984566241502762,
0.01933986507356167,
-0.004162218887358904,
-0.043194565922021866,
0.029345333576202393,
-0.10985281318426132,
0.017501484602689743,
0.0030490439385175705,
0.04839824140071869,
-0.0664539635181427,
0.003187716705724597,
0.08813219517469406,
-0.06997629255056381,
0.08652349561452866,
0.028011947870254517,
-0.05722441151738167,
0.11155062168836594,
-0.1520608812570572,
-0.0316103957593441,
0.038083817809820175,
0.017632875591516495,
0.05116498842835426,
-0.06304729729890823,
0.04602779075503349,
0.010766935534775257,
0.05977720767259598,
0.025884421542286873,
0.09980253130197525,
-0.11154402047395706,
-0.08885122090578079,
-0.056336041539907455,
-0.09903638064861298,
-0.026674948632717133,
0.03567884489893913,
0.010245941579341888,
0.10414501279592514,
0.12019567936658859,
-0.021329835057258606,
0.03953387588262558,
-0.07265837490558624,
-0.03432073816657066,
0.012157220393419266,
-0.07718328386545181,
-0.008308127522468567,
-0.08856258541345596,
0.03389451652765274,
-0.04932784289121628,
0.19607128202915192,
0.014146164059638977,
0.09011485427618027,
-0.013127029873430729,
-0.007094650994986296,
0.058210063725709915,
0.03512676805257797,
0.2635565996170044,
-0.010391296818852425,
0.06214313209056854,
-0.057400014251470566,
0.06390000134706497,
0.03190230578184128,
0.07462193816900253,
0.09250729531049728,
0.13140439987182617,
-0.06030930206179619,
0.10009101778268814,
0.0018554965499788523,
0.024321265518665314,
-0.06626356393098831,
-0.14013269543647766,
0.05386464670300484,
0.07153729349374771,
-0.04798407107591629,
0.11377888172864914,
0.12308207899332047,
-0.0857747420668602,
0.09595368057489395,
0.002244694624096155,
-0.09805340319871902,
-0.056592993438243866,
0.0036762289237231016,
-0.038360606878995895,
-0.1603422611951828,
0.005609314423054457,
-0.10641540586948395,
-0.05803966522216797,
0.10328137874603271,
0.03585994243621826,
-0.03542960807681084,
0.19085495173931122,
-0.030368804931640625,
-0.08095967769622803,
0.03866267949342728,
-0.017737500369548798,
0.007864775136113167,
0.00039649871177971363,
0.05528873950242996,
-0.016746461391448975,
-0.03295326605439186,
0.02385852485895157,
0.030533423647284508,
-0.06291268020868301,
0.023818979039788246,
-0.09310535341501236,
-0.03302234038710594,
-0.049505993723869324,
0.058812521398067474,
0.01782352477312088,
0.08810113370418549,
0.020705565810203552,
-0.036690425127744675,
-0.03026430308818817,
0.22710371017456055,
-0.04960279166698456,
-0.08250990509986877,
-0.1560349464416504,
0.23235027492046356,
-0.000814747647382319,
0.04550584778189659,
-0.015660004690289497,
-0.06660789251327515,
-0.04284580796957016,
0.2837293744087219,
0.22173738479614258,
-0.03923393040895462,
0.014919957146048546,
0.004652713891118765,
0.019100332632660866,
0.0030810122843831778,
0.14239732921123505,
0.03537070006132126,
0.23505477607250214,
-0.032032888382673264,
-0.08661504834890366,
-0.056155335158109665,
-0.04934029281139374,
-0.001971753314137459,
0.09461952745914459,
0.02846965193748474,
-0.0718880370259285,
-0.04876387119293213,
0.10557151585817337,
-0.14728906750679016,
-0.08218087255954742,
0.03670113906264305,
-0.1624322384595871,
-0.09275159239768982,
-0.0635935440659523,
0.02662712335586548,
-0.037617892026901245,
0.0638229101896286,
-0.053934890776872635,
-0.02355974167585373,
0.010076900012791157,
0.021557385101914406,
-0.13415935635566711,
-0.10207729041576385,
0.08253879100084305,
0.0029207300394773483,
0.1148483082652092,
-0.021995538845658302,
0.10502567142248154,
0.11544694006443024,
0.028814787045121193,
-0.01979638636112213,
0.020982159301638603,
0.07936987280845642,
0.007753332145512104,
0.05780952423810959,
0.055659763514995575,
-0.03558565676212311,
0.11004229635000229,
-0.03157815337181091,
-0.09869851171970367,
0.03825552016496658,
-0.035420358180999756,
0.02632436901330948,
-0.11641527712345123,
-0.055859170854091644,
-0.09550917148590088,
0.08682585507631302,
0.18707899749279022,
-0.05599305033683777,
0.021849913522601128,
-0.07702887058258057,
0.10393387079238892,
0.016403131186962128,
-0.03079122304916382,
-0.08664993941783905,
-0.16192510724067688,
-0.03941488638520241,
-0.012876113876700401,
-0.04882320761680603,
-0.23321282863616943,
-0.002926129149273038,
-0.0473775789141655,
-0.005926320794969797,
-0.033598341047763824,
0.1446404755115509,
0.1238945797085762,
0.014087743125855923,
-0.02352842688560486,
-0.1436404287815094,
-0.018652619794011116,
0.08475355058908463,
-0.11337742209434509,
-0.12632732093334198
] |
null | null | transformers |
# CodeTrans model for code documentation generation python
Pretrained model on programming language python using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# Build a summarization pipeline around the multitask CodeTrans checkpoint.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_python_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_python_multitask", skip_special_tokens=True),
    device=0  # first GPU; use device=-1 to run on CPU
)

# The model works best on a tokenized python function (tokens separated by spaces).
tokenized_code = "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/python/base_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
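As an illustration of that schedule (not taken from the card; the base rate and warmup length below are assumed placeholders), the learning rate stays constant through warmup and then decays with the inverse square root of the step count:

```python
# Sketch of an inverse square root learning rate schedule.
# base_lr and warmup_steps are assumed placeholders, not values from the card.
def inverse_sqrt_lr(step: int, base_lr: float = 1e-2, warmup_steps: int = 10_000) -> float:
    """Learning rate at a given step: constant during warmup, then ~1/sqrt(step)."""
    return base_lr / max(step, warmup_steps) ** 0.5
```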
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_python_multitask | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation python
========================================================
Pretrained model on programming language python using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
| [
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
61,
143
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.1281861811876297,
-0.017873352393507957,
-0.0006902650347910821,
0.13325180113315582,
0.11884834617376328,
0.034155648201704025,
0.06055370718240738,
0.06928083300590515,
-0.037707678973674774,
0.019685382023453712,
0.045776840299367905,
0.02138960175216198,
0.03297438099980354,
0.18984273076057434,
0.016662297770380974,
-0.13063772022724152,
-0.021778864786028862,
0.04477338492870331,
-0.051645275205373764,
0.13382799923419952,
0.08299331367015839,
-0.0576661080121994,
0.050344549119472504,
-0.06260361522436142,
-0.2371012270450592,
0.05212556943297386,
0.004738867748528719,
-0.054320987313985825,
0.10274650901556015,
0.03583918511867523,
0.14017155766487122,
-0.02467113919556141,
0.032114725559949875,
-0.14049440622329712,
0.00773554528132081,
0.015929710119962692,
0.025422465056180954,
0.009587056003510952,
0.0476720966398716,
0.03207775950431824,
0.16564318537712097,
-0.001324565033428371,
0.0686599537730217,
0.05854376032948494,
-0.07763193547725677,
-0.13562947511672974,
-0.011735041625797749,
0.030060606077313423,
0.044805388897657394,
0.10848456621170044,
-0.014415208250284195,
0.13256889581680298,
-0.14230285584926605,
0.1366840898990631,
0.09272369742393494,
-0.2326132357120514,
-0.010667968541383743,
0.11696695536375046,
0.08007878065109253,
0.0855918750166893,
-0.0440046451985836,
-0.06764350086450577,
0.0940818265080452,
0.05639692768454552,
0.03130835294723511,
-0.09380488097667694,
-0.08352621644735336,
-0.0000730573883629404,
-0.09429354965686798,
-0.07504595071077347,
0.2081473469734192,
-0.013975521549582481,
-0.08665261417627335,
-0.05188555642962456,
-0.033101603388786316,
-0.14082598686218262,
0.026030512526631355,
0.05340259522199631,
-0.0020237131975591183,
-0.034108586609363556,
0.004686551634222269,
0.03712109476327896,
-0.06847258657217026,
-0.13869717717170715,
0.024264072999358177,
0.11295025795698166,
0.06557828187942505,
0.026385031640529633,
-0.09851787984371185,
0.1131637692451477,
0.036560505628585815,
-0.044684652239084244,
-0.02197827771306038,
-0.009802483953535557,
-0.11145362257957458,
0.027238471433520317,
-0.048652563244104385,
-0.1584211140871048,
0.009343497455120087,
0.049632180482149124,
-0.030403196811676025,
0.052644193172454834,
0.0296543650329113,
0.025551287457346916,
0.021347055211663246,
0.19705745577812195,
0.059869445860385895,
-0.10822582989931107,
0.06001522019505501,
0.03835931047797203,
-0.031410034745931625,
-0.007003038190305233,
-0.06722737848758698,
-0.09883911907672882,
0.09979414194822311,
0.10512859374284744,
-0.10672586411237717,
0.04678544029593468,
-0.06675104051828384,
-0.04244794696569443,
-0.0394013337790966,
-0.15504761040210724,
0.005723483394831419,
0.028300847858190536,
-0.06570138037204742,
-0.025842241942882538,
0.10439787805080414,
-0.1777792125940323,
-0.1506461352109909,
-0.02090286836028099,
-0.07761198282241821,
-0.03399860858917236,
-0.15472884476184845,
-0.1718670129776001,
-0.013704032637178898,
-0.030672255903482437,
0.026152269914746284,
-0.09412100166082382,
-0.15081170201301575,
-0.019153980538249016,
0.026835041120648384,
0.017262058332562447,
-0.012467586435377598,
-0.07222003489732742,
-0.005764494184404612,
-0.02320552058517933,
-0.03124704770743847,
-0.006162950769066811,
-0.05339433625340462,
0.1322268396615982,
0.10524989664554596,
0.042748671025037766,
-0.03218543156981468,
0.05325160175561905,
-0.06897006183862686,
0.062543123960495,
-0.08772043883800507,
0.08856654912233353,
-0.05898356810212135,
0.08998115360736847,
-0.029217420145869255,
-0.11723149567842484,
0.08731885999441147,
0.05966288223862648,
0.06537256389856339,
0.045442841947078705,
-0.1218753531575203,
-0.03262975811958313,
0.19312795996665955,
-0.11807557940483093,
-0.1288331001996994,
0.1153237521648407,
-0.03812049701809883,
0.0763617604970932,
0.0973428338766098,
0.1250470131635666,
0.17007973790168762,
-0.05789044871926308,
0.016523469239473343,
0.05387846753001213,
0.047790613025426865,
-0.1148909330368042,
0.0788390263915062,
0.05573326721787453,
-0.09584641456604004,
0.054696448147296906,
-0.013609557412564754,
0.11158230900764465,
-0.010779807344079018,
-0.023743584752082825,
-0.05202951282262802,
-0.07536089420318604,
0.024307480081915855,
0.002538733184337616,
0.05920577794313431,
-0.08121505379676819,
-0.08672409504652023,
0.09200413525104523,
0.19338318705558777,
-0.13189977407455444,
-0.003582754172384739,
-0.08876270800828934,
0.07215740531682968,
-0.06543131172657013,
0.021015938371419907,
-0.16307108104228973,
0.00901241134852171,
0.0585763081908226,
-0.01426240149885416,
0.07381164282560349,
0.11787315458059311,
0.017229974269866943,
0.043796356767416,
0.009061597287654877,
-0.012863324955105782,
-0.12290190905332565,
-0.0590306892991066,
-0.07373980432748795,
-0.04507948085665703,
-0.09189888089895248,
-0.05629247426986694,
0.002127026440575719,
-0.2006005197763443,
0.01936589926481247,
0.0084858238697052,
-0.0045663439668715,
0.01837305724620819,
-0.017886921763420105,
0.019037464633584023,
0.07498075067996979,
-0.062098704278469086,
-0.04011385887861252,
0.031405746936798096,
0.018887631595134735,
-0.0418272465467453,
-0.1013658419251442,
-0.10511291027069092,
0.0026072540786117315,
0.12135773152112961,
0.05050972104072571,
-0.09045737236738205,
0.034286368638277054,
-0.009894275106489658,
-0.03568270802497864,
0.011184491217136383,
-0.06502964347600937,
0.1584346741437912,
-0.003604840487241745,
0.20325054228305817,
-0.14616283774375916,
-0.026272261515259743,
-0.027870144695043564,
0.01538321003317833,
0.07023806124925613,
0.13310080766677856,
-0.02436552569270134,
-0.06204404681921005,
0.0646863654255867,
-0.0009283997351303697,
-0.10571945458650589,
0.2086438238620758,
-0.04936474561691284,
-0.09512051939964294,
0.03369962051510811,
0.11033166944980621,
-0.0025089485570788383,
0.16637715697288513,
-0.18289220333099365,
-0.02408907748758793,
0.014482292346656322,
0.0063659995794296265,
0.0651465505361557,
-0.13138070702552795,
0.0049998098984360695,
0.02071032114326954,
-0.06414800882339478,
-0.10979863256216049,
-0.01734272763133049,
0.00021685654064640403,
0.04283313453197479,
-0.00689710071310401,
-0.03357547149062157,
0.011188232339918613,
-0.029840664938092232,
-0.1145806685090065,
0.227989062666893,
-0.0964803546667099,
-0.2193836271762848,
-0.1998814046382904,
0.060338862240314484,
-0.05476135388016701,
-0.022294677793979645,
0.0319027304649353,
-0.08421045541763306,
-0.04749171435832977,
-0.04012932628393173,
0.20017777383327484,
-0.09380262345075607,
-0.00800342857837677,
-0.033909644931554794,
0.06766877323389053,
0.016954682767391205,
-0.2060755342245102,
0.04057344049215317,
-0.006714200135320425,
-0.028804106637835503,
0.01642523519694805,
-0.11457903683185577,
0.08272409439086914,
0.16257396340370178,
-0.08269589394330978,
0.015211771242320538,
-0.0009737422224134207,
0.20844712853431702,
-0.039353929460048676,
-0.0772365853190422,
0.13550199568271637,
-0.02366577461361885,
-0.000199833870283328,
0.00978782493621111,
-0.012458931654691696,
-0.10085268318653107,
0.058165378868579865,
-0.014794341288506985,
-0.030284889042377472,
-0.25888168811798096,
-0.026898393407464027,
-0.08209757506847382,
0.061117466539144516,
0.0676712617278099,
0.03837069496512413,
-0.10115770250558853,
0.034289244562387466,
0.05093924328684807,
0.14524494111537933,
-0.005027451552450657,
0.06535555422306061,
0.06319274753332138,
0.0038006799295544624,
0.0063229952938854694,
-0.09943554550409317,
0.007298408076167107,
0.07350482046604156,
0.11021627485752106,
0.27122437953948975,
-0.0949387326836586,
0.1774524748325348,
0.03197644650936127,
0.056568775326013565,
0.04989146068692207,
0.14401137828826904,
-0.12306249886751175,
0.038523752242326736,
0.017001906409859657,
0.0008711097179912031,
-0.11463883519172668,
0.0022798965219408274,
-0.037115149199962616,
0.07129789143800735,
-0.11938841640949249,
-0.06431514024734497,
-0.002182481810450554,
0.13415181636810303,
0.06626749783754349,
-0.237335667014122,
-0.12593163549900055,
0.014229778200387955,
-0.10110877454280853,
-0.11138899624347687,
0.06260070204734802,
0.1886579543352127,
-0.08946538716554642,
-0.022220535203814507,
-0.022167671471834183,
0.13740752637386322,
-0.04226788505911827,
-0.032540760934352875,
-0.04688740521669388,
0.0505668930709362,
0.006154310889542103,
0.12643353641033173,
-0.28462210297584534,
0.1299460232257843,
-0.011310585774481297,
0.06566963344812393,
-0.027621202170848846,
0.05502026528120041,
-0.034259870648384094,
0.0805438980460167,
0.03466888144612312,
-0.011991618201136589,
0.04844075068831444,
-0.17447753250598907,
-0.002098239026963711,
0.041550152003765106,
0.021740764379501343,
0.058594126254320145,
0.0861494243144989,
-0.015295440331101418,
0.06812715530395508,
-0.02034357376396656,
-0.14648786187171936,
-0.04406598582863808,
-0.07972656935453415,
-0.02140837348997593,
-0.06505435705184937,
-0.01776808686554432,
-0.03830469772219658,
-0.01157515961676836,
0.06868048757314682,
0.1820782572031021,
-0.10028029978275299,
-0.09241273254156113,
-0.08114761859178543,
0.05909852311015129,
0.10198741406202316,
-0.08483915030956268,
0.05104960501194,
-0.005818333011120558,
0.02713300846517086,
-0.012141923420131207,
-0.09312205761671066,
0.049056507647037506,
-0.038591839373111725,
-0.05638841167092323,
-0.001659182133153081,
0.06921414285898209,
0.000603150692768395,
0.04104791581630707,
0.010737646371126175,
-0.08464975655078888,
-0.05730046331882477,
-0.10767590254545212,
-0.14317168295383453,
-0.055089205503463745,
0.008795649744570255,
0.06707564741373062,
-0.1397232860326767,
-0.05590013042092323,
-0.023233750835061073,
-0.013858644291758537,
0.14937175810337067,
0.17199230194091797,
-0.05604377016425133,
0.019829582422971725,
0.10765815526247025,
-0.053221479058265686,
-0.19088737666606903,
0.02354096807539463,
0.05041033402085304,
0.11573406308889389,
-0.039242420345544815,
-0.18507319688796997,
0.03982662037014961,
0.02240194007754326,
0.04066889360547066,
0.0773235559463501,
-0.31840911507606506,
-0.1208692267537117,
0.10117316991090775,
0.1515442132949829,
0.11041750013828278,
-0.12964344024658203,
-0.020893020555377007,
-0.05675765499472618,
-0.1329382210969925,
0.09365855902433395,
-0.04369134083390236,
0.13495928049087524,
-0.06433621048927307,
0.06589306145906448,
0.04119107127189636,
-0.036494266241788864,
0.07768183946609497,
0.01820141077041626,
0.10505765676498413,
-0.03955947235226631,
0.024283699691295624,
0.11059997230768204,
-0.037195656448602676,
0.17401903867721558,
-0.1491558700799942,
0.09528782963752747,
-0.2658860385417938,
-0.05779055505990982,
-0.06504333019256592,
0.006275600288063288,
-0.032831624150276184,
-0.045501962304115295,
-0.09002386033535004,
0.018807370215654373,
0.0023940063547343016,
-0.008749867789447308,
0.035278208553791046,
-0.024974457919597626,
-0.00862075574696064,
0.06588371843099594,
0.10420501232147217,
0.01648605428636074,
-0.11424941569566727,
0.053594741970300674,
0.04867471009492874,
0.10207876563072205,
-0.19410721957683563,
0.020077569410204887,
0.1119740903377533,
0.03232182562351227,
0.11501552164554596,
0.047779571264982224,
-0.10295626521110535,
0.04883923381567001,
0.08708242326974869,
-0.06640218943357468,
-0.077396921813488,
-0.004691810812801123,
-0.08682840317487717,
-0.08967144787311554,
0.041358113288879395,
0.10221274197101593,
-0.04506884515285492,
-0.025051243603229523,
-0.023203052580356598,
-0.019949784502387047,
-0.11972075700759888,
0.19554102420806885,
0.0746733620762825,
0.08080128580331802,
-0.0696655735373497,
0.04426400735974312,
0.07478664070367813,
-0.06376980990171432,
0.0036982246674597263,
0.1797206997871399,
-0.09944704174995422,
-0.04531025514006615,
0.07576438784599304,
0.1839829385280609,
-0.04630321264266968,
-0.04153407737612724,
-0.1212419643998146,
-0.06864743679761887,
0.021163038909435272,
0.1563214212656021,
0.10660609602928162,
0.08585666120052338,
-0.036893151700496674,
0.0017394735477864742,
-0.11794222146272659,
0.08189049363136292,
0.07673878222703934,
0.04586144536733627,
-0.12166270613670349,
0.15820848941802979,
0.0423123799264431,
0.11418487876653671,
-0.02655697427690029,
-0.01408446952700615,
-0.1421872228384018,
0.0748593658208847,
-0.10370318591594696,
0.03304409608244896,
-0.008822810836136341,
0.058472637087106705,
-0.018237978219985962,
0.006682444363832474,
-0.030160151422023773,
0.05593914911150932,
-0.09018956124782562,
-0.000027343336114427075,
0.010255182161927223,
0.030032172799110413,
-0.0418732576072216,
-0.0095561807975173,
0.03408488258719444,
-0.10202768445014954,
0.12123505771160126,
-0.017338309437036514,
-0.023057641461491585,
0.09910131990909576,
-0.036612674593925476,
0.028383804485201836,
0.01360576506704092,
0.05350333824753761,
0.01633373089134693,
0.018251201137900352,
0.0802576020359993,
0.02291429415345192,
0.06293663382530212,
0.03939564898610115,
0.12626346945762634,
-0.133112832903862,
-0.08501964062452316,
-0.05783122032880783,
-0.10794229805469513,
-0.05243898928165436,
0.09455219656229019,
0.027475325390696526,
0.11221952736377716,
0.1132090613245964,
-0.03430391103029251,
0.026400728151202202,
-0.1163778081536293,
-0.07332833111286163,
0.016647284850478172,
-0.02313779853284359,
-0.0586969330906868,
-0.05719003453850746,
0.04226377233862877,
-0.022921068593859673,
0.11201050877571106,
0.01239826250821352,
0.03616023436188698,
-0.024334702640771866,
-0.039811789989471436,
0.005805129650980234,
0.0100462157279253,
0.22864066064357758,
-0.07354459166526794,
0.04461736977100372,
-0.002100364537909627,
0.01323719322681427,
0.01266325730830431,
0.12352492660284042,
0.12717020511627197,
0.1518240123987198,
-0.05717136710882187,
0.09030148386955261,
0.009519108571112156,
-0.0020270268432796,
-0.09598126262426376,
-0.028445681557059288,
0.023076917976140976,
0.05880787596106529,
-0.04144035652279854,
0.19871185719966888,
0.0920768454670906,
-0.10225339233875275,
0.10567101836204529,
0.03339328244328499,
-0.13459739089012146,
-0.04419412836432457,
0.034926917403936386,
-0.02534407563507557,
-0.14442123472690582,
0.027059171348810196,
-0.11963636428117752,
-0.02745916321873665,
0.0745605081319809,
0.05454858019948006,
-0.07119131833314896,
0.16835810244083405,
0.024605033919215202,
-0.07209298759698868,
0.042711447924375534,
-0.0018650175770744681,
0.01979227177798748,
0.033120181411504745,
0.02697610668838024,
0.0209458377212286,
-0.047118134796619415,
0.05458423122763634,
0.02090521901845932,
-0.03896327316761017,
0.0010840505128726363,
-0.03287406265735626,
-0.0008320020278915763,
-0.020434031262993813,
0.029433144256472588,
0.057773273438215256,
0.19079668819904327,
0.03590782731771469,
-0.07936888188123703,
-0.03339793160557747,
0.164112389087677,
-0.029885223135352135,
-0.10310553759336472,
-0.1335485428571701,
0.17688019573688507,
0.0261519905179739,
0.006940604653209448,
0.021038049831986427,
-0.08471133559942245,
-0.051429931074380875,
0.2165343165397644,
0.06125263124704361,
-0.02292465977370739,
-0.018829455599188805,
0.0017539847176522017,
-0.00028357250266708434,
-0.03826618194580078,
0.21105413138866425,
0.02404973842203617,
0.2450835108757019,
0.02037903293967247,
-0.0182639230042696,
-0.07228036969900131,
-0.034836556762456894,
-0.018011627718806267,
0.1067916601896286,
-0.03859938681125641,
-0.04071563482284546,
-0.08706605434417725,
0.019108468666672707,
-0.00557383568957448,
-0.06581633538007736,
0.08951162546873093,
-0.12446131557226181,
-0.10186626762151718,
-0.043948944658041,
0.024034136906266212,
-0.053231868892908096,
0.02922878973186016,
-0.03327355161309242,
0.03195338323712349,
0.05153270810842514,
-0.03443853184580803,
-0.11584712564945221,
-0.17465174198150635,
0.11302988231182098,
-0.0255756638944149,
0.13820217549800873,
-0.011092121712863445,
0.15465103089809418,
0.08799999952316284,
0.025162896141409874,
-0.05102435126900673,
0.09912189841270447,
0.0365421324968338,
0.053044144064188004,
0.05345012992620468,
0.11839573085308075,
-0.058092597872018814,
0.11561348289251328,
-0.04708987474441528,
-0.01563035510480404,
-0.028205865994095802,
-0.07209841161966324,
-0.0015845217276364565,
-0.1574401706457138,
-0.02974838763475418,
-0.10319434851408005,
0.08795696496963501,
0.2039586305618286,
-0.049423668533563614,
-0.023357465863227844,
-0.09377351403236389,
0.0918336808681488,
0.003498874371871352,
0.05668489262461662,
-0.038976673036813736,
-0.18697871267795563,
-0.019124578684568405,
-0.013675219379365444,
-0.003223475767299533,
-0.27591314911842346,
-0.002550119999796152,
-0.04516563564538956,
-0.0222233384847641,
-0.09529218822717667,
0.17657269537448883,
0.07629793137311935,
0.030472395941615105,
-0.03913094848394394,
-0.12909461557865143,
-0.046578798443078995,
0.06668303161859512,
-0.14419938623905182,
-0.14011134207248688
] |
null | null | transformers |
# CodeTrans model for code documentation generation python
Pretrained model on the python programming language using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the python function/method.
## Intended uses & limitations
The model could be used to generate a description for a python function, or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
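Since the checkpoint was trained on tokenized functions, pre-tokenizing raw code before inference may help. Below is a minimal sketch of one way to do this with Python's standard `tokenize` module; the exact tokenization used to build the training data is not documented here, so treat this as an approximation rather than the original preprocessing.
```python
import io
import tokenize

def space_tokenize(source: str) -> str:
    """Approximate the space-separated token format seen in the examples.

    Uses Python's standard tokenizer; the original training preprocessing
    may differ, so this is only a rough approximation.
    """
    tokens = []
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        # Skip structural tokens that carry no surface text.
        if tok.type in (tokenize.NEWLINE, tokenize.NL, tokenize.INDENT,
                        tokenize.DEDENT, tokenize.ENDMARKER):
            continue
        tokens.append(tok.string)
    return " ".join(tokens)

raw = "def add(a, b):\n    return a + b\n"
print(space_tokenize(raw))  # def add ( a , b ) : return a + b
```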
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_python_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_python_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/python/base_model.ipynb).
## Training data
The datasets for the supervised training tasks can be downloaded from [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
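For readers who want to mirror this optimizer setup, the Hugging Face `transformers` library ships an `Adafactor` implementation whose `relative_step=True` mode applies an inverse square root schedule internally. The snippet below is a sketch of such a setup; the exact pre-training hyperparameters are not published in this card, so the flags shown are illustrative assumptions.
```python
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained("t5-base")

# relative_step=True lets Adafactor derive an inverse square root learning
# rate schedule internally; warmup_init is an illustrative assumption.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,  # the learning rate comes from the internal schedule
)
lr_scheduler = AdafactorSchedule(optimizer)  # proxy schedule for logging
```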
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 4000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing python code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
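As a hedged sketch of how scores like those in the table might be computed, the snippet below evaluates hypothetical model outputs against reference docstrings with `sacrebleu`. The exact BLEU variant, tokenization, and test split behind the table are not specified in this card, so numbers from this sketch need not match it.
```python
from sacrebleu.metrics import BLEU

# Hypothetical outputs and references; in practice these would come from
# running the pipeline over the held-out test split for each language.
hypotheses = ["Print a log message and optionally exit ."]
references = [["Print an error message and exit with the given code ."]]

bleu = BLEU()
result = bleu.corpus_score(hypotheses, references)
print(f"BLEU = {result.score:.2f}")
```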
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_python_multitask_finetune | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation python
========================================================
Pretrained model on the python programming language using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the python function/method.
Intended uses & limitations
---------------------------
The model could be used to generate a description for a python function, or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The datasets for the supervised training tasks can be downloaded from Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 4000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing python code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 4000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 4000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
61,
88,
108
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 4000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.09364219754934311,
0.07657555490732193,
-0.0012474871473386884,
0.10554156452417374,
0.048726700246334076,
0.035543546080589294,
0.021755022928118706,
0.10967344790697098,
-0.031487125903367996,
0.06592940539121628,
0.06018340215086937,
-0.06934331357479095,
0.05134846270084381,
0.17429319024085999,
0.034922659397125244,
-0.17196036875247955,
-0.03371892124414444,
0.022133007645606995,
-0.05211257189512253,
0.10640904307365417,
0.07487455755472183,
-0.07171907275915146,
0.07376247644424438,
-0.03968166559934616,
-0.11638940125703812,
0.045677222311496735,
-0.02728867717087269,
-0.01729155145585537,
0.10181553661823273,
0.05076586455106735,
0.11802254617214203,
-0.04090002179145813,
0.07006578147411346,
-0.21031714975833893,
-0.000754303066059947,
0.032933443784713745,
0.05878609046339989,
0.03140576183795929,
0.05760490521788597,
0.05835084617137909,
0.15031088888645172,
-0.019257977604866028,
0.06359252333641052,
0.05771839618682861,
-0.06733611971139908,
-0.09718194603919983,
-0.06467626243829727,
0.0644790306687355,
0.06975078582763672,
0.1039157584309578,
-0.007244537118822336,
0.022601202130317688,
-0.06549662351608276,
0.09641087800264359,
0.11146021634340286,
-0.2190857231616974,
-0.01828940585255623,
0.11867285519838333,
0.08918523788452148,
0.05186331644654274,
-0.07261641323566437,
-0.03854935243725777,
0.09836674481630325,
0.051516853272914886,
0.04800897091627121,
-0.09127860516309738,
-0.02173648402094841,
-0.01698777638375759,
-0.05632268637418747,
-0.055264733731746674,
0.14225059747695923,
0.023189714178442955,
-0.056776609271764755,
-0.11466939747333527,
-0.05603717267513275,
-0.20009608566761017,
0.03493133932352066,
0.03516062721610069,
0.009351170621812344,
-0.0037276388611644506,
0.008493958041071892,
-0.011824635788798332,
-0.08824004232883453,
-0.09878810495138168,
0.02942267246544361,
0.02942400425672531,
0.0682186558842659,
0.03603138029575348,
-0.021261367946863174,
0.09596782922744751,
-0.007110784761607647,
-0.041751034557819366,
-0.009709320031106472,
0.02091258391737938,
-0.1172061339020729,
0.012907272204756737,
-0.006529401522129774,
-0.04540980979800224,
-0.0013158678775653243,
0.09482090175151825,
-0.10586608946323395,
0.08470138907432556,
0.09562455117702484,
0.004136912524700165,
0.020035015419125557,
0.21763212978839874,
0.042497940361499786,
-0.14887435734272003,
0.018214026466012,
0.010893834754824638,
0.0021325170528143644,
0.005278171505779028,
-0.04883105307817459,
-0.04374508559703827,
0.014464084059000015,
0.07106100767850876,
-0.10281050950288773,
0.02381390705704689,
-0.05086340382695198,
-0.00800025463104248,
0.06171657517552376,
-0.11965804547071457,
0.04428619518876076,
0.015414156951010227,
-0.04374147206544876,
-0.014869017526507378,
0.086066335439682,
-0.12919825315475464,
-0.12130658328533173,
0.06031017005443573,
-0.04319780692458153,
-0.034440383315086365,
-0.10771597176790237,
-0.12001374363899231,
-0.006319371052086353,
-0.028549600392580032,
0.003135193372145295,
-0.09899721294641495,
-0.08763986825942993,
-0.019911207258701324,
0.04136624559760094,
-0.0029981445986777544,
-0.033058665692806244,
-0.031616970896720886,
0.013452318497002125,
-0.00039148094947449863,
-0.01138978824019432,
0.011619682423770428,
-0.039221927523612976,
0.09375427663326263,
0.07976013422012329,
0.03475486487150192,
-0.006542477756738663,
0.023214103654026985,
-0.08273173868656158,
0.08130978792905807,
-0.08176828920841217,
0.049016907811164856,
-0.01451210118830204,
0.07262181490659714,
-0.10011716932058334,
-0.08977852016687393,
0.029559673741459846,
0.04827527701854706,
0.05688103288412094,
0.028297659009695053,
-0.10129371285438538,
0.025115348398685455,
0.1537620574235916,
-0.11114807426929474,
-0.12311764061450958,
0.1203274205327034,
-0.0015013079391792417,
0.017805777490139008,
0.07238015532493591,
0.12332300841808319,
0.15873365104198456,
-0.11691118031740189,
-0.02293972484767437,
0.08233900368213654,
0.06450236588716507,
-0.04757540673017502,
0.06672503054141998,
-0.0005837631761096418,
0.0014952941564843059,
0.030420202761888504,
0.06134265288710594,
0.05779993534088135,
0.0028885442297905684,
-0.03357787802815437,
-0.04616701975464821,
-0.08332162350416183,
-0.04221139848232269,
-0.011551213450729847,
0.012964890338480473,
-0.060142748057842255,
-0.07556958496570587,
-0.010256695561110973,
0.1861671358346939,
-0.09687779098749161,
0.03337666392326355,
-0.0728449746966362,
-0.018144790083169937,
-0.07226365059614182,
0.021456403657794,
-0.11912448704242706,
0.007630972191691399,
0.05293264985084534,
-0.038984958082437515,
0.05299125984311104,
0.07454876601696014,
0.004302044864743948,
0.02740516886115074,
-0.056412141770124435,
-0.04130448400974274,
-0.039846014231443405,
-0.06951101869344711,
-0.1104850172996521,
-0.023045286536216736,
-0.09013001620769501,
-0.030331077054142952,
-0.043031614273786545,
-0.17590534687042236,
0.0011223095934838057,
0.0059918914921581745,
0.022351205348968506,
0.02407723106443882,
-0.03986005857586861,
0.021681878715753555,
0.05757620185613632,
-0.0472717210650444,
-0.0829949900507927,
0.017442017793655396,
0.03389127925038338,
-0.0877825990319252,
-0.04933552071452141,
-0.10764087736606598,
-0.05974147468805313,
0.07211604714393616,
0.10025471448898315,
-0.1208764910697937,
0.004061833489686251,
-0.0210395660251379,
-0.04627695679664612,
-0.05597427859902382,
-0.06124887987971306,
0.17443154752254486,
0.013328298926353455,
0.17051441967487335,
-0.12218441814184189,
-0.05598223954439163,
-0.028729364275932312,
0.002520614070817828,
0.02666385844349861,
0.1515112668275833,
0.0003890570951625705,
-0.06871556490659714,
0.034516170620918274,
-0.04493755102157593,
-0.05535043776035309,
0.14934603869915009,
-0.014861606061458588,
-0.08020348101854324,
0.003098946763202548,
0.10822107642889023,
-0.002990773180499673,
0.1672644317150116,
-0.056160278618335724,
0.005511343479156494,
-0.00461951969191432,
0.022453999146819115,
0.04306517913937569,
-0.12039713561534882,
0.030401214957237244,
0.050789181143045425,
-0.05613689869642258,
-0.05673617497086525,
-0.02963646501302719,
-0.03386897221207619,
0.041863881051540375,
0.0169810950756073,
0.038653284311294556,
-0.028851311653852463,
-0.028773000463843346,
-0.10490240156650543,
0.19031216204166412,
-0.07432208955287933,
-0.206452414393425,
-0.17526032030582428,
0.041816215962171555,
-0.025626584887504578,
-0.022625479847192764,
0.036156658083200455,
-0.0987156480550766,
-0.0580764003098011,
-0.09548220783472061,
0.13397249579429626,
-0.14252826571464539,
-0.003335846122354269,
-0.04697743430733681,
0.05461077392101288,
0.061255570501089096,
-0.16605883836746216,
0.028324777260422707,
-0.006219402886927128,
0.0012268867576494813,
-0.004572875332087278,
-0.0499177947640419,
0.07670789957046509,
0.11683807522058487,
-0.06203819811344147,
0.013290636241436005,
-0.003920787014067173,
0.17636758089065552,
-0.05768907815217972,
0.029785681515932083,
0.17829535901546478,
0.008402188308537006,
0.036553751677274704,
0.050789374858140945,
0.019276604056358337,
-0.0957384929060936,
0.056522633880376816,
0.04728477820754051,
-0.03866963088512421,
-0.21242505311965942,
-0.03603784367442131,
-0.07909197360277176,
0.07624247670173645,
0.13925407826900482,
0.05645022168755531,
-0.1602574735879898,
0.016724780201911926,
-0.010594907216727734,
0.15061511099338531,
-0.035604801028966904,
0.06794612109661102,
0.025535138323903084,
0.006128487177193165,
-0.003875037422403693,
-0.10381823033094406,
0.0059155966155231,
0.08331441879272461,
0.11380117386579514,
0.1934531033039093,
-0.08294538408517838,
0.18109716475009918,
0.004211460240185261,
0.10221342742443085,
0.03858466073870659,
0.08341000974178314,
-0.1315605193376541,
0.012603027746081352,
0.015801260247826576,
-0.014476679265499115,
-0.05103369057178497,
0.04641517996788025,
-0.01806396059691906,
0.057041022926568985,
-0.06838765740394592,
-0.008819936774671078,
0.013081688433885574,
0.1917303055524826,
0.0793493464589119,
-0.16904892027378082,
-0.1272040754556656,
0.014227435924112797,
-0.09491436183452606,
-0.11146773397922516,
0.0704587996006012,
0.19156943261623383,
-0.07090838253498077,
0.029511533677577972,
-0.018251340836286545,
0.13451707363128662,
-0.11323058605194092,
-0.024056099355220795,
0.03501027822494507,
0.052471619099378586,
0.0022758464328944683,
0.11450590193271637,
-0.23980771005153656,
0.07284620404243469,
0.014823800884187222,
0.08587399870157242,
-0.013522254303097725,
0.06701410561800003,
-0.05227207392454147,
0.01773260161280632,
0.07517331838607788,
0.010997142642736435,
-0.05135861784219742,
-0.19999587535858154,
-0.04885042458772659,
0.026549389585852623,
0.04938327521085739,
-0.013563456013798714,
0.09427318722009659,
-0.03265365958213806,
0.0490129292011261,
-0.03330905735492706,
-0.15680062770843506,
-0.033703479915857315,
-0.14725521206855774,
-0.03215831518173218,
-0.01380995474755764,
-0.05642097443342209,
-0.024018477648496628,
0.04129239171743393,
0.03805938735604286,
0.2455057054758072,
-0.14686988294124603,
-0.08612719923257828,
-0.1045638769865036,
0.06230801343917847,
0.1361309289932251,
-0.09145358949899673,
0.03283628821372986,
0.008576544001698494,
0.0489097535610199,
-0.04327232018113136,
-0.07433201372623444,
0.028982549905776978,
-0.05792923644185066,
-0.08352170884609222,
-0.027772927656769753,
0.11875162273645401,
-0.020337950438261032,
0.04662643373012543,
-0.006027163937687874,
-0.0652473196387291,
-0.05412488803267479,
-0.12565724551677704,
-0.07713808119297028,
-0.009143324568867683,
0.03803219273686409,
-0.003932085353881121,
-0.11435839533805847,
0.0880352109670639,
-0.00047644629376009107,
-0.08159573376178741,
0.08370804041624069,
0.18707412481307983,
-0.07471893727779388,
0.03140408918261528,
0.07512682676315308,
-0.05501144751906395,
-0.17231528460979462,
-0.047557901591062546,
0.03721771761775017,
0.07237208634614944,
-0.019632404670119286,
-0.15781892836093903,
0.05885903909802437,
0.012833667919039726,
0.01870022527873516,
0.03313648700714111,
-0.28528672456741333,
-0.1267007440328598,
0.017235545441508293,
0.06738146394491196,
0.026555033400654793,
-0.10991673916578293,
-0.03982393816113472,
-0.060947299003601074,
-0.05433323234319687,
0.03695644438266754,
0.06922632455825806,
0.1109548881649971,
-0.038272175937891006,
0.05281589552760124,
0.047318555414676666,
-0.02881196141242981,
0.07138299196958542,
-0.026193691417574883,
0.08694813400506973,
-0.019136862829327583,
0.018155323341488838,
0.03465355560183525,
-0.06036799028515816,
0.1823493242263794,
-0.1612682342529297,
0.1007809042930603,
-0.20082277059555054,
-0.04186559095978737,
-0.02004081942141056,
0.0005226510693319142,
-0.040561847388744354,
-0.055133894085884094,
-0.12093060463666916,
0.02088492549955845,
0.05756509676575661,
-0.036418166011571884,
0.046649057418107986,
-0.019793465733528137,
-0.038130540400743484,
0.062229789793491364,
0.05779208987951279,
0.015066174790263176,
-0.16036206483840942,
0.030986154451966286,
0.014788074418902397,
0.08149975538253784,
-0.1921256184577942,
0.018310727551579475,
0.11163853108882904,
0.027957528829574585,
0.09260524064302444,
0.004011843353509903,
-0.08784174174070358,
0.027411507442593575,
0.07133417576551437,
-0.06785624474287033,
-0.09767097979784012,
-0.010405994951725006,
-0.052973225712776184,
-0.10126754641532898,
0.02241344377398491,
0.09402898699045181,
-0.05444291606545448,
-0.014945735223591328,
-0.005367901176214218,
0.01679888181388378,
-0.07395246624946594,
0.17715385556221008,
0.019417976960539818,
0.07481095939874649,
-0.06595882028341293,
0.07037719339132309,
0.10307977348566055,
-0.11015424877405167,
0.015800992026925087,
0.16354583203792572,
-0.08202138543128967,
-0.022999249398708344,
0.07984878867864609,
0.08726152032613754,
-0.029734265059232712,
-0.04448186978697777,
-0.08162346482276917,
-0.06831532716751099,
0.014352708123624325,
0.014068897813558578,
0.07044357806444168,
0.08066847920417786,
-0.036451078951358795,
0.0026110513135790825,
-0.12730388343334198,
0.09949202835559845,
0.07122913002967834,
0.05135239288210869,
-0.1443762332201004,
0.14832940697669983,
0.0395490787923336,
0.07756917178630829,
0.004178448114544153,
0.02308034338057041,
-0.10956250876188278,
0.03630971908569336,
-0.023630090057849884,
0.030197342857718468,
-0.002601931570097804,
0.05313241854310036,
-0.033005159348249435,
0.038407862186431885,
-0.02760879322886467,
0.040741514414548874,
-0.04238616302609444,
-0.03089238703250885,
-0.03334742784500122,
0.018757814541459084,
-0.05996270105242729,
-0.007949106395244598,
0.013052255846560001,
-0.08638562262058258,
0.09688802063465118,
-0.0547950379550457,
-0.0037075225263834,
0.007254639640450478,
0.023127276450395584,
0.060960255563259125,
0.018750054761767387,
0.04275888204574585,
-0.01620395854115486,
-0.01174702774733305,
0.029067061841487885,
0.009792204946279526,
-0.017327532172203064,
0.0030962983146309853,
0.0852765440940857,
-0.15188466012477875,
-0.08237665146589279,
-0.08047755807638168,
-0.08187396824359894,
-0.058359432965517044,
0.06571270525455475,
0.07078671455383301,
0.07499366253614426,
0.09552786499261856,
-0.041199952363967896,
0.01945008896291256,
-0.14744757115840912,
-0.05000952258706093,
0.04555117338895798,
0.00005783526648883708,
-0.09530159085988998,
-0.03512583300471306,
0.054487623274326324,
-0.03825007751584053,
0.12953540682792664,
-0.022147737443447113,
0.050410471856594086,
-0.008866910822689533,
-0.043285686522722244,
-0.02454577572643757,
0.0013431478291749954,
0.17957469820976257,
-0.10288883000612259,
0.0029561067931354046,
-0.013167515397071838,
0.002975607756525278,
0.032965727150440216,
0.1755613386631012,
0.09032835066318512,
0.1163255050778389,
0.04154599457979202,
0.06690101325511932,
-0.05139122158288956,
-0.031519122421741486,
-0.13998274505138397,
0.04132732376456261,
-0.02687983587384224,
0.05014115571975708,
-0.040123358368873596,
0.14569434523582458,
0.10607834905385971,
-0.129838764667511,
0.10241999477148056,
0.014808062463998795,
-0.09188768267631531,
-0.044604767113924026,
-0.07974155247211456,
-0.041935406625270844,
-0.09718794375658035,
0.004008202347904444,
-0.09926432371139526,
0.015762504190206528,
0.0778723806142807,
0.03446941822767258,
-0.028019648045301437,
0.16395044326782227,
-0.03177649527788162,
-0.0642414316534996,
0.025493314489722252,
0.05230557173490524,
0.028014104813337326,
0.110392726957798,
0.027362100780010223,
0.050770457834005356,
-0.07317119091749191,
0.07432273775339127,
0.036307837814092636,
-0.007782327942550182,
0.02052067406475544,
0.008340466767549515,
-0.00649484945461154,
-0.04767608270049095,
-0.000832104473374784,
0.07475944608449936,
0.16536666452884674,
0.044208791106939316,
-0.04883236810564995,
-0.05903686210513115,
0.20488567650318146,
-0.056359611451625824,
-0.056089796125888824,
-0.12840259075164795,
0.18136079609394073,
0.0362652949988842,
0.006628748029470444,
0.015518913976848125,
-0.08028433471918106,
-0.027002274990081787,
0.23277558386325836,
0.06547088921070099,
-0.028716115280985832,
-0.022151434794068336,
0.0003958659654017538,
-0.008093979209661484,
-0.026590751484036446,
0.14898638427257538,
-0.0034330266062170267,
0.2419901043176651,
0.011321856640279293,
-0.004636536352336407,
-0.04233367368578911,
-0.040184032171964645,
-0.031503356993198395,
0.21055227518081665,
-0.03491942957043648,
0.02186673693358898,
-0.10131777077913284,
-0.0022812820971012115,
0.028353175148367882,
-0.12105026096105576,
0.11665359139442444,
-0.1267704963684082,
-0.07757676392793655,
0.021720336750149727,
0.055843934416770935,
-0.04249299317598343,
0.04853774607181549,
-0.023936139419674873,
0.04966229200363159,
0.045442845672369,
-0.029926441609859467,
-0.10796575248241425,
-0.15606962144374847,
0.05456846207380295,
0.004467152990400791,
0.14166629314422607,
0.021591896191239357,
0.06955555826425552,
0.08398241549730301,
0.0045018987730145454,
-0.07599184662103653,
0.08597254008054733,
0.03716311603784561,
-0.01817130111157894,
0.05184146389365196,
0.128005713224411,
-0.04548967257142067,
0.1447393000125885,
0.012898229993879795,
-0.010600944980978966,
-0.023884758353233337,
-0.03363285958766937,
0.005377480294555426,
-0.14639128744602203,
-0.010938886553049088,
-0.06337827444076538,
0.1329718977212906,
0.20651943981647491,
-0.04559687152504921,
-0.018896840512752533,
-0.05139659717679024,
0.07626114040613174,
-0.005800872575491667,
0.08107811212539673,
0.005397809203714132,
-0.17213174700737,
0.011231733486056328,
-0.01880878023803234,
0.005035923328250647,
-0.19310294091701508,
-0.0557919479906559,
-0.029519077390432358,
-0.021540388464927673,
-0.10027951747179031,
0.15573900938034058,
0.05299884080886841,
0.019689347594976425,
-0.03702664002776146,
-0.13652801513671875,
-0.012759359553456306,
0.04811821132898331,
-0.12716349959373474,
-0.11966308951377869
] |
null | null | transformers |
# CodeTrans model for code documentation generation python
Pretrained model on the python programming language using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the python function/method.
## Intended uses & limitations
The model could be used to generate a description for a python function, or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_python_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_python_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/python/base_model.ipynb).
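Since the card notes that the model also accepts unparsed, untokenized code (with possibly lower quality), here is a minimal sketch of running the same pipeline on a raw function. The pipeline construction repeats the example above; `device=-1` selects the CPU and is an assumption for environments without a GPU.
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

model_name = "SEBIS/code_trans_t5_base_code_documentation_generation_python_transfer_learning_finetune"
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained(model_name),
    tokenizer=AutoTokenizer.from_pretrained(model_name, skip_special_tokens=True),
    device=-1,  # CPU; use 0 for the first GPU as in the example above
)

# Raw, untokenized source; per the limitations section, quality may be
# lower than with pre-tokenized input.
raw_code = """def e(message, exit_code=None):
    print_log(message, YELLOW, BOLD)
    if exit_code is not None:
        sys.exit(exit_code)"""
print(pipeline([raw_code]))
```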
## Training data
The datasets for the supervised training tasks can be downloaded from [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing python code.
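The original fine-tuning ran on TPU hardware; as a rough sketch of what a comparable python-only fine-tuning step could look like with today's `transformers` Trainer API, consider the snippet below. This is not the original training code: the dataset format, column names, and per-device batch size are assumptions.
```python
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

# Hypothetical (code, docstring) pairs; the real corpus is the python
# portion of the dataset linked under "Training data".
pairs = Dataset.from_dict({
    "code": ["def add ( a , b ) : return a + b"],
    "doc": ["Add two numbers ."],
})

def preprocess(batch):
    inputs = tokenizer(batch["code"], max_length=512, truncation=True)
    labels = tokenizer(text_target=batch["doc"], max_length=512, truncation=True)
    inputs["labels"] = labels["input_ids"]
    return inputs

tokenized = pairs.map(preprocess, batched=True, remove_columns=["code", "doc"])

args = Seq2SeqTrainingArguments(
    output_dir="codetrans-python-finetune",
    max_steps=2000,                 # matches the step count reported above
    per_device_train_batch_size=8,  # illustrative; the card reports batch 256
)
trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```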
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
| {"tags": ["summarization"], "widget": [{"text": "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"}]} | summarization | SEBIS/code_trans_t5_base_code_documentation_generation_python_transfer_learning_finetune | [
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] | 2022-03-02T23:29:04+00:00 | [] | [] | TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
| CodeTrans model for code documentation generation python
========================================================
Pretrained model on the python programming language using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the python function/method.
Intended uses & limitations
---------------------------
The model could be used to generate a description for a python function, or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The datasets for the supervised training tasks can be downloaded from Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing python code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
| [
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
46,
61,
87,
108
] | [
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] | [
-0.09859102964401245,
0.07669943571090698,
-0.0006934590637683868,
0.10037070512771606,
0.040964946150779724,
0.029198219999670982,
0.03281877562403679,
0.10960274189710617,
-0.04137900471687317,
0.053839873522520065,
0.060164324939250946,
-0.06623684614896774,
0.06069529801607132,
0.18122780323028564,
0.020331593230366707,
-0.15199953317642212,
-0.025784151628613472,
0.038148995488882065,
-0.0714026466012001,
0.11465974897146225,
0.07227574288845062,
-0.0762271136045456,
0.07237900048494339,
-0.04187599942088127,
-0.12680773437023163,
0.04332587867975235,
-0.02431875467300415,
-0.019234076142311096,
0.09982011467218399,
0.06444841623306274,
0.13131681084632874,
-0.03400580957531929,
0.06990525126457214,
-0.2028551548719406,
0.00014947923773434013,
0.029045237228274345,
0.06057986989617348,
0.03924686461687088,
0.04115908220410347,
0.07142124325037003,
0.14161112904548645,
-0.014821797609329224,
0.05613664537668228,
0.048792097717523575,
-0.06717025488615036,
-0.062346383929252625,
-0.07488804310560226,
0.0774296447634697,
0.07716967165470123,
0.10403434932231903,
-0.0029429662972688675,
0.05474909394979477,
-0.06673161685466766,
0.09518301486968994,
0.11718988418579102,
-0.22383399307727814,
-0.022594913840293884,
0.1063380017876625,
0.08844969421625137,
0.03602416440844536,
-0.07002940028905869,
-0.0440957173705101,
0.10188465565443039,
0.04617857187986374,
0.04927074909210205,
-0.08803993463516235,
-0.04143594950437546,
-0.015517216175794601,
-0.059576600790023804,
-0.055453766137361526,
0.17208527028560638,
0.02582581713795662,
-0.06098392605781555,
-0.09487468004226685,
-0.04994693771004677,
-0.1951928585767746,
0.03594835847616196,
0.02186746336519718,
0.0030450569465756416,
-0.008554053492844105,
-0.00345915206708014,
-0.007021714001893997,
-0.0924060195684433,
-0.10541430115699768,
0.013445496559143066,
0.038425907492637634,
0.06248266622424126,
0.035980530083179474,
-0.03380916267633438,
0.09300234168767929,
0.026694636791944504,
-0.03701974079012871,
-0.011516745202243328,
0.016088129952549934,
[... remainder of a 768-dimensional embedding vector truncated ...]