| Column | Type / value stats |
|:---|:---|
| sha | null |
| last_modified | null |
| library_name | stringclasses (154 values) |
| text | stringlengths (1-900k) |
| metadata | stringlengths (2-348k) |
| pipeline_tag | stringclasses (45 values) |
| id | stringlengths (5-122) |
| tags | sequencelengths (1-1.84k) |
| created_at | stringlengths (25-25) |
| arxiv | sequencelengths (0-201) |
| languages | sequencelengths (0-1.83k) |
| tags_str | stringlengths (17-9.34k) |
| text_str | stringlengths (0-389k) |
| text_lists | sequencelengths (0-722) |
| processed_texts | sequencelengths (1-723) |
| tokens_length | sequencelengths (1-723) |
| input_texts | sequencelengths (1-61) |
| embeddings | sequencelengths (768-768) |
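For orientation, the sketch below shows how a row with this schema could be inspected with the `datasets` library. The repository id and the `train` split name are placeholders, since the dataset's actual name is not given here.

```python
from datasets import load_dataset

# Placeholder repository id and split name -- substitute the real dataset.
ds = load_dataset("user/model-card-dataset", split="train")

row = ds[0]
print(row["id"], row["pipeline_tag"], row["library_name"])
print(len(row["embeddings"]))  # 768, per the schema above
```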
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# wav2vec2-base-timit-demo-colab

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4519
- Wer: 0.3375

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.4351        | 4.0   | 500  | 1.2740          | 0.8259 |
| 0.5828        | 8.0   | 1000 | 0.4276          | 0.4403 |
| 0.2274        | 12.0  | 1500 | 0.4646          | 0.3739 |
| 0.135         | 16.0  | 2000 | 0.4320          | 0.3662 |
| 0.0962        | 20.0  | 2500 | 0.4831          | 0.3607 |
| 0.0719        | 24.0  | 3000 | 0.4506          | 0.3463 |
| 0.0556        | 28.0  | 3500 | 0.4519          | 0.3375 |

### Framework versions

- Transformers 4.11.3
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.10.3
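A minimal inference sketch for this checkpoint, assuming the id recorded in this row resolves on the Hub and that ffmpeg is available for audio decoding; the audio path is a placeholder.

```python
from transformers import pipeline

# Checkpoint id taken from this row's `id` field.
asr = pipeline("automatic-speech-recognition", model="NicoGrageda/wav2vec2-base-timit-demo-colab")

result = asr("path/to/speech.wav")  # placeholder path; ffmpeg is needed to decode the file
print(result["text"])
```

The hyperparameters above map onto `TrainingArguments` roughly as follows. This is a sketch, not the exact training script: `output_dir` is a placeholder, and the Adam betas/epsilon listed in the card are the library defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-base-timit-demo-colab",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed-precision training
)
```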
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "model-index": [{"name": "wav2vec2-base-timit-demo-colab", "results": []}]}
automatic-speech-recognition
NicoGrageda/wav2vec2-base-timit-demo-colab
[ "transformers", "pytorch", "tensorboard", "wav2vec2", "automatic-speech-recognition", "generated_from_trainer", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us
wav2vec2-base-timit-demo-colab ============================== This model is a fine-tuned version of facebook/wav2vec2-base on the None dataset. It achieves the following results on the evaluation set: * Loss: 0.4519 * Wer: 0.3375 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 0.0001 * train\_batch\_size: 32 * eval\_batch\_size: 8 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_steps: 1000 * num\_epochs: 30 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * Transformers 4.11.3 * Pytorch 1.10.0+cu111 * Datasets 1.18.3 * Tokenizers 0.10.3
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 30\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.3\n* Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 30\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.3\n* Tokenizers 0.10.3" ]
[ 56, 130, 4, 35 ]
[ "passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #generated_from_trainer #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0001\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 30\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.3\n* Tokenizers 0.10.3" ]
[ ... 768-dimensional embedding vector elided ... ]
null
null
transformers
# Squi
{"tags": ["conversational"]}
text-generation
Nihwy/DialoSqui
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Squi
[ "# Squi" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Squi" ]
[ 51, 3 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Squi" ]
[ ... 768-dimensional embedding vector elided ... ]
null
null
transformers
# Harry Potter DialoGPT Model
{"tags": ["conversational"]}
text-generation
NikhilKrishna/DialoGPT-medium-harrypotter
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Harry Potter DialoGPT Model
[ "# Harry Potter DialoGPT Model" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Harry Potter DialoGPT Model" ]
[ 51, 8 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Harry Potter DialoGPT Model" ]
[ ... 768-dimensional embedding vector elided ... ]
null
null
transformers
# **-- EMODa --** ## BERT model for Danish multi-class classification of emotions Classifies a Danish sentence into one of six emotions: | Danish emotion | Ekman's emotion | | ----- | ----- | | 😞 **Afsky** | Disgust | | 😨 **Frygt** | Fear | | 😄 **Glæde** | Joy | | 😱 **Overraskelse** | Surprise | | 😢 **Tristhed** | Sadness | | 😠 **Vrede** | Anger | # How to use ```python from transformers import pipeline model_path = "NikolajMunch/danish-emotion-classification" classifier = pipeline("sentiment-analysis", model=model_path, tokenizer=model_path) prediction = classifier("Jeg er godt nok ked af at mine SMS'er er slettet") print(prediction) # [{'label': 'Tristhed', 'score': 0.9725030660629272}] ``` or ```python from transformers import AutoTokenizer, AutoModelForSequenceClassification tokenizer = AutoTokenizer.from_pretrained("NikolajMunch/danish-emotion-classification") model = AutoModelForSequenceClassification.from_pretrained("NikolajMunch/danish-emotion-classification") ```
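The second snippet in the card loads the tokenizer and model but stops before running inference. The sketch below completes the manual forward pass with standard `transformers`/`torch` calls; the example sentence is the card's own widget text, and reading the label name from `model.config.id2label` assumes the checkpoint stores the Danish label names shown in the table above.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "NikolajMunch/danish-emotion-classification"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Tokenize a Danish sentence and run a single forward pass without gradients
inputs = tokenizer("Hold da op! Kan det virkelig passe?", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Turn logits into probabilities and map the top class id back to its label name
probs = torch.softmax(logits, dim=-1)[0]
top_id = int(probs.argmax())
print(model.config.id2label[top_id], round(float(probs[top_id]), 4))
```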
{"language": ["da"], "tags": ["sentiment", "emotion", "danish"], "widget": [{"text": "Hold da op! Kan det virkelig passe?"}]}
text-classification
NikolajMunch/danish-emotion-classification
[ "transformers", "pytorch", "bert", "text-classification", "sentiment", "emotion", "danish", "da", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "da" ]
TAGS #transformers #pytorch #bert #text-classification #sentiment #emotion #danish #da #autotrain_compatible #endpoints_compatible #region-us
-- EMODa -- =========== BERT-model for danish multi-class classification of emotions ------------------------------------------------------------ Classifies a danish sentence into one of 6 different emotions: How to use ========== or
[]
[ "TAGS\n#transformers #pytorch #bert #text-classification #sentiment #emotion #danish #da #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 47 ]
[ "passage: TAGS\n#transformers #pytorch #bert #text-classification #sentiment #emotion #danish #da #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.0415271632373333, 0.001309699029661715, -0.008122066967189312, 0.02095496468245983, 0.1358557939529419, 0.0420796275138855, 0.006673609372228384, 0.05199701711535454, 0.1432308405637741, 0.03109455667436123, 0.11931753903627396, 0.16458606719970703, -0.021613897755742073, -0.001902140793390572, -0.08141276985406876, -0.3653740882873535, 0.050183530896902084, 0.08718115836381912, 0.0990593358874321, 0.0893336683511734, 0.10203463584184647, -0.055973753333091736, 0.09198615700006485, -0.02373163215816021, -0.060378555208444595, 0.06344839930534363, 0.008030491881072521, -0.04383102431893349, 0.1399523913860321, 0.02432311326265335, 0.10365530103445053, 0.05097581818699837, -0.0267769917845726, -0.20598812401294708, 0.04802684113383293, -0.025930991396307945, -0.07321731001138687, 0.01804228313267231, 0.055242497473955154, -0.11475107073783875, 0.18980666995048523, 0.025906525552272797, -0.006536013446748257, 0.03535453975200653, -0.08230646699666977, -0.1570834368467331, 0.05552196130156517, 0.08442217856645584, 0.01947558857500553, 0.06963776797056198, -0.04178519919514656, 0.1387113481760025, -0.16570553183555603, 0.11223619431257248, 0.08546271175146103, -0.13686591386795044, -0.021127799525856972, 0.09251946955919266, 0.1253999024629593, 0.015330024994909763, -0.16074897348880768, 0.06461795419454575, 0.022226577624678612, 0.05142601951956749, -0.04065972566604614, -0.10204830020666122, -0.016865679994225502, -0.005289718043059111, -0.04975615814328194, -0.006162126082926989, 0.1622275561094284, 0.012033980339765549, 0.05761484429240227, -0.07799557596445084, -0.030976103618741035, -0.06512957066297531, 0.0006680585211142898, -0.03046341985464096, -0.022533170878887177, 0.05930245667695999, 0.020245609804987907, 0.04584337770938873, -0.1313638836145401, 0.11332815885543823, -0.1736745983362198, 0.180526465177536, -0.03589698672294617, 0.021516423672437668, -0.03893560916185379, -0.002872262615710497, -0.08371467143297195, -0.08226105570793152, 0.024106301367282867, -0.11933176964521408, 0.055480897426605225, -0.04585956409573555, -0.07034498453140259, -0.02128027379512787, 0.0012021237052977085, 0.028816843405365944, 0.027900515124201775, 0.02117626741528511, -0.005300744902342558, 0.08571518957614899, 0.18298561871051788, 0.14342056214809418, 0.015866190195083618, -0.03842347860336304, -0.10840572416782379, -0.1500832438468933, 0.00631660595536232, -0.017045829445123672, -0.15417484939098358, -0.00551773002371192, 0.05360863357782364, 0.005693359766155481, -0.07224905490875244, 0.12519554793834686, -0.09411027282476425, 0.013733381405472755, -0.07478691637516022, 0.01871178112924099, 0.03262375667691231, 0.05025215819478035, -0.012376004830002785, 0.23769527673721313, -0.10422633588314056, 0.004244872834533453, -0.08274462819099426, 0.15141676366329193, 0.010899090208113194, 0.04128304123878479, -0.009743193164467812, -0.08266641944646835, 0.1350160539150238, -0.10815689712762833, 0.09166311472654343, -0.1393021047115326, -0.07765822112560272, -0.03977830708026886, 0.08225109428167343, -0.045896001160144806, -0.01116473414003849, -0.08083479106426239, 0.00660730293020606, 0.0936475396156311, -0.04424170032143593, -0.13877734541893005, -0.05731920152902603, 0.06056642532348633, -0.09721996635198593, 0.12337533384561539, -0.07276229560375214, 0.040499262511730194, -0.17657674849033356, -0.040400926023721695, -0.1379900574684143, 0.007359994575381279, -0.06821925193071365, 0.19026412069797516, 0.05954454094171524, -0.04765889048576355, -0.04184773564338684, 0.05473155528306961, 
-0.143796905875206, 0.2012714445590973, -0.16023777425289154, -0.12340744584798813, 0.1655147820711136, -0.0809205174446106, -0.04234141483902931, 0.1461898535490036, 0.010807011276483536, 0.052252791821956635, 0.12379596382379532, 0.22612158954143524, -0.031704314053058624, -0.04487874358892441, -0.044725991785526276, 0.1587984561920166, -0.10007486492395401, 0.035983867943286896, 0.00992436520755291, 0.08168641477823257, 0.02217133902013302, 0.06341806054115295, 0.18025623261928558, 0.09629315882921219, -0.053577788174152374, -0.030864622443914413, 0.016388511285185814, 0.009224324487149715, 0.13119801878929138, 0.14372967183589935, 0.05023002624511719, -0.14213648438453674, -0.026478921994566917, -0.05618739128112793, 0.008521318435668945, -0.0055540320463478565, 0.03692324832081795, -0.01596984639763832, 0.0751362070441246, 0.03334568440914154, 0.021660761907696724, -0.13156230747699738, 0.0021728631108999252, -0.07193716615438461, 0.14593248069286346, -0.013890678063035011, 0.1917789727449417, 0.1001790389418602, -0.10861272364854813, -0.06867846101522446, 0.025425249710679054, 0.1411418616771698, 0.009275609627366066, -0.010355362668633461, -0.2186002880334854, 0.10393248498439789, -0.06441360712051392, 0.032948464155197144, -0.103035569190979, -0.01660947874188423, 0.17810440063476562, 0.13207611441612244, -0.04888475313782692, 0.052269693464040756, -0.05618074908852577, 0.051704004406929016, -0.07189568877220154, 0.026193048804998398, 0.12338470667600632, -0.058517251163721085, -0.10033654421567917, 0.2767542898654938, -0.10236084461212158, 0.18613207340240479, 0.21024185419082642, -0.35028913617134094, -0.010505680926144123, -0.02112462744116783, -0.03614477440714836, 0.08478638529777527, 0.09820035845041275, 0.015071404166519642, 0.03976738825440407, -0.026197167113423347, 0.0918976217508316, -0.012962386012077332, -0.02696722187101841, 0.02892698347568512, -0.04676380008459091, -0.12789054214954376, 0.13895180821418762, 0.004138659220188856, -0.1479329615831375, 0.21789249777793884, 0.2339317798614502, 0.05036015063524246, 0.29048171639442444, 0.027589598670601845, 0.05849146842956543, 0.058690957725048065, -0.08871017396450043, -0.09950714558362961, 0.07823139429092407, -0.2013235092163086, -0.04182188957929611, 0.016458826139569283, -0.012983581982553005, 0.009802253916859627, -0.06766815483570099, -0.07294007390737534, -0.019139902666211128, 0.014357688836753368, -0.005631209816783667, 0.10107249766588211, 0.024035537615418434, 0.09610798209905624, 0.010050838813185692, -0.06957987695932388, 0.08670800179243088, 0.025930708274245262, -0.06953636556863785, 0.15596501529216766, -0.17686666548252106, -0.3032515048980713, -0.0752471461892128, -0.11221607029438019, -0.04219429939985275, 0.02191963791847229, 0.06693688780069351, -0.16930051147937775, -0.0455414280295372, 0.04022319242358208, 0.16027812659740448, -0.06726007908582687, -0.04641103744506836, -0.06737038493156433, 0.028167014941573143, -0.11833125352859497, -0.04022407904267311, -0.049949560314416885, -0.09028276056051254, 0.020746268332004547, 0.11125605553388596, -0.12693659961223602, 0.04695369675755501, 0.18320085108280182, 0.024214202538132668, 0.01118065882474184, -0.0548539012670517, 0.1762000322341919, -0.1542004942893982, 0.0200001522898674, 0.021390650421380997, -0.10695184767246246, 0.06872181594371796, 0.22749628126621246, 0.030549578368663788, -0.08762534707784653, -0.01576777920126915, 0.03918374702334404, -0.07775659114122391, -0.13093410432338715, -0.15456971526145935, -0.06514435261487961, 
0.18941938877105713, -0.019139520823955536, 0.06291668862104416, 0.09797307848930359, 0.07148855179548264, -0.018353519961237907, -0.14112421870231628, -0.060978442430496216, 0.06532255560159683, 0.24876050651073456, -0.06044653430581093, 0.05209994688630104, -0.014951435849070549, -0.10003915429115295, 0.12557028234004974, -0.002613272052258253, -0.06954112648963928, 0.0668790191411972, 0.027129806578159332, -0.012097499333322048, 0.07418791949748993, 0.06446573883295059, 0.08767987042665482, -0.035826340317726135, -0.07509028911590576, -0.019994614645838737, 0.008710836991667747, -0.07496605068445206, 0.036684587597846985, 0.0893726572394371, -0.033974241465330124, -0.11586570739746094, -0.1973341703414917, 0.11988859623670578, 0.03008740022778511, 0.07555925101041794, -0.1114344522356987, -0.03461092710494995, 0.0695863887667656, -0.033892516046762466, -0.06848039478063583, 0.03271767497062683, 0.01751984842121601, -0.15733836591243744, 0.108342744410038, -0.0002980394347105175, 0.07057016342878342, -0.06278350949287415, 0.10143536329269409, -0.14769764244556427, -0.13238683342933655, 0.013664178550243378, 0.07986623793840408, -0.24201971292495728, 0.21591992676258087, 0.01168996561318636, -0.07090394198894501, -0.12700867652893066, -0.07330232858657837, 0.04782799258828163, 0.23319180309772491, 0.09140155464410782, 0.06288424879312515, -0.024672601372003555, -0.14452818036079407, 0.04055972769856453, 0.011062939651310444, 0.12653495371341705, -0.022883743047714233, -0.05575539916753769, 0.020103368908166885, 0.009084484539926052, -0.03249169513583183, 0.07602325081825256, 0.033175304532051086, -0.09006191790103912, 0.0576651468873024, 0.04532928392291069, 0.04625928774476051, 0.12061908096075058, -0.082274429500103, -0.16929715871810913, 0.16732433438301086, -0.049264904111623764, -0.03397386893630028, -0.1059592068195343, -0.0962570384144783, -0.0216488279402256, -0.03969964757561684, -0.04161743074655533, -0.043657854199409485, 0.032810062170028687, -0.11260770261287689, -0.1574734002351761, 0.15517786145210266, -0.10099927335977554, -0.0719200074672699, -0.03408423066139221, 0.13872522115707397, 0.02224581316113472, 0.049147021025419235, 0.03157590702176094, -0.0045598591677844524, -0.0888962671160698, -0.08452832698822021, 0.08216805756092072, -0.010518454015254974, -0.032871946692466736, 0.016504524275660515, -0.03951379656791687, -0.019343111664056778, -0.017622902989387512, -0.061413705348968506, 0.17641228437423706, 0.24834378063678741, -0.019714774563908577, 0.15097030997276306, 0.1644832044839859, -0.038436487317085266, -0.39417386054992676, 0.007791778538376093, -0.11128716170787811, 0.006265516392886639, 0.005105057265609503, -0.11796772480010986, 0.11807959526777267, -0.06585908681154251, -0.0030945944599807262, 0.01845192350447178, -0.07917215675115585, -0.07844093441963196, 0.2254553735256195, -0.033913467079401016, 0.44872021675109863, -0.07729941606521606, -0.021123429760336876, -0.06463214755058289, -0.12219151109457016, 0.18283028900623322, 0.02578074112534523, 0.058158427476882935, 0.008658789098262787, 0.22716248035430908, 0.060280170291662216, 0.005212096963077784, 0.11866661161184311, 0.021187487989664078, -0.01801704242825508, -0.17261922359466553, -0.07088623940944672, -0.021496430039405823, 0.012169040739536285, 0.002595454454421997, -0.10755462199449539, -0.07632680982351303, -0.13993524014949799, -0.05156625434756279, -0.18992535769939423, 0.09668760746717453, -0.011175209656357765, -0.08011813461780548, -0.06378614902496338, 0.06357511878013611, 
0.02561138942837715, -0.03746767342090607, 0.13708311319351196, -0.10058651119470596, 0.06180860847234726, -0.003223342588171363, 0.2448105663061142, -0.06684668362140656, 0.0706566572189331, -0.016671234741806984, -0.08424843102693558, 0.07143663614988327, -0.057451535016298294, 0.05096915736794472, 0.15975208580493927, -0.06818458437919617, 0.04461970180273056, 0.07831821590662003, -0.04854942485690117, -0.05377493053674698, 0.10307876020669937, -0.19515274465084076, 0.005867337342351675, -0.08289550244808197, -0.11329732835292816, 0.11334224045276642, -0.02665349841117859, 0.07578357309103012, -0.009234122931957245, -0.015233264304697514, 0.01639670692384243, -0.03373698517680168, -0.010790023021399975, -0.0106020113453269, -0.015555664896965027, -0.0076660108752548695, -0.10582499206066132, 0.03759020194411278, -0.10413110256195068, -0.3803469240665436, 0.06524426490068436, 0.1525808572769165, -0.11634021997451782, -0.10377805680036545, 0.005817211698740721, 0.18100358545780182, -0.1868734210729599, -0.04917879030108452, -0.04544539004564285, -0.19393393397331238, 0.05371001735329628, 0.2746768891811371, 0.09714052826166153, 0.08188784122467041, -0.04008813574910164, 0.012364131398499012, 0.0029788268730044365, 0.015430023893713951, 0.008936227299273014, -0.052559979259967804, -0.04322956129908562, -0.02121960185468197, -0.053787097334861755, 0.14060761034488678, -0.10423938930034637, -0.05286690592765808, -0.12961417436599731, -0.028762655332684517, -0.09646452218294144, -0.14529429376125336, -0.0958239957690239, -0.03346717357635498, 0.033394645899534225, -0.03723723813891411, 0.03139927238225937, -0.12119884788990021, -0.12767294049263, 0.07626483589410782, 0.049578405916690826, 0.028352215886116028, -0.04674077033996582, -0.06370948255062103, 0.08850168436765671, -0.017514396458864212, 0.1377130001783371, 0.13228631019592285, -0.04942956566810608, 0.1498984396457672, -0.23105557262897491, -0.03899040073156357, 0.1181238517165184, -0.05371436849236488, 0.05047161504626274, 0.19284474849700928, -0.04208557307720184, 0.06640701740980148, 0.05051547288894653, 0.09510026127099991, 0.04181014373898506, -0.04627244174480438, 0.11901377886533737, 0.13340629637241364, -0.2080659121274948, -0.045136235654354095, 0.010200881399214268, -0.005148868542164564, -0.058335304260253906, 0.15334218740463257, -0.07280712574720383, 0.05651625618338585, 0.02104649320244789, 0.02234027348458767, 0.049785397946834564, -0.16793563961982727, -0.06830015033483505, -0.06297598034143448, 0.001221232581883669, -0.011116287671029568, 0.21382322907447815, 0.07498527318239212, -0.01592031493782997, 0.06324432045221329, 0.10105358064174652, -0.04478200897574425, 0.0014803883386775851, 0.10454870015382767, 0.0856165885925293, -0.0814327746629715, -0.1329813152551651, 0.0895104855298996, 0.035255301743745804, 0.02112766169011593, 0.11220463365316391, 0.02455083094537258, 0.15805332362651825, 0.10474972426891327, -0.029604386538267136, 0.08880022168159485, -0.026451196521520615, -0.1621301919221878, -0.051306866109371185, 0.0903095230460167, 0.009763682261109352, 0.19060687720775604, 0.0759468600153923, 0.02794717065989971, 0.05981649458408356, -0.06196346506476402, -0.034596558660268784, -0.17001627385616302, -0.18009033799171448, -0.04882265254855156, -0.09966792911291122, 0.029598480090498924, -0.11630300432443619, 0.00010539386857999489, -0.06423501670360565, 0.11994726210832596, -0.08326030522584915, 0.07069554179906845, -0.042881619185209274, -0.060125499963760376, 0.13323132693767548, -0.010835468769073486, 
0.01093646977096796, -0.03098226524889469, -0.014637066051363945, -0.15823809802532196, -0.022782662883400917, -0.010179433971643448, 0.007324214559048414, -0.12003657966852188, -0.08393686264753342, -0.15502573549747467, -0.11442146450281143, -0.028482060879468918, 0.041575659066438675, -0.019175196066498756, 0.05110109597444534, -0.040238190442323685, 0.0342913381755352, -0.0025629117153584957, 0.12290836125612259, -0.015679264441132545, 0.050225261598825455, -0.009958973154425621, 0.10802233219146729, -0.05512060225009918, 0.13468222320079803, -0.05586528405547142, 0.022876882925629616, -0.07714705169200897, 0.2681247889995575, 0.28600621223449707, -0.05213097110390663, 0.023849578574299812, -0.018149591982364655, 0.06670395284891129, 0.062332648783922195, 0.06326648592948914, 0.08292365819215775, 0.1846877485513687, -0.1310737580060959, 0.03283119574189186, -0.05796634033322334, -0.014902804046869278, -0.011465068906545639, 0.030750250443816185, 0.13237044215202332, -0.03142109140753746, -0.12052915245294571, 0.09310007840394974, -0.24903753399848938, 0.19290992617607117, 0.059870973229408264, -0.20392541587352753, -0.05720995366573334, -0.06582901626825333, 0.17677269876003265, 0.10746074467897415, 0.09244844317436218, 0.0198421198874712, -0.10372453927993774, 0.08106297254562378, 0.0027215820737183094, -0.257326602935791, -0.026224032044410706, 0.08286016434431076, -0.08115027844905853, 0.007937668822705746, -0.0634639784693718, -0.011973070912063122, 0.11159241199493408, 0.019143709912896156, 0.07439780980348587, -0.007780204992741346, 0.045080941170454025, -0.1017538532614708, -0.05991988256573677, 0.16444864869117737, 0.016590431332588196, -0.017979903146624565, 0.06053198501467705, -0.240289568901062, 0.05549396574497223, -0.045719344168901443, -0.09238612651824951, 0.012921443209052086, 0.21472065150737762, -0.060183871537446976, 0.011212706565856934, 0.1225314736366272, 0.0729651153087616, -0.07044224441051483, -0.10074479877948761, -0.027939073741436005, -0.025023356080055237, -0.13024283945560455, -0.02882644534111023, -0.01774623990058899, -0.08838214725255966, 0.180294930934906, -0.04019685834646225, -0.17028400301933289, -0.02872556820511818, -0.06530335545539856, 0.060370951890945435, -0.1527356505393982, 0.016097456216812134, -0.01668393984436989, 0.03954802453517914, 0.025243548676371574, -0.09291477501392365, 0.11327028274536133, 0.10050809383392334, -0.058422770351171494, -0.06766467541456223 ]
null
null
transformers
# AOT-GAN CelebA-HQ AOT-GAN is a model that can be used for image in-painting. The CelebA-HQ checkpoint is trained on synthetic human faces, which should make it suitable for touching up and restoring portraits. This model was generated using [AOT-GAN-for-Inpainting](https://github.com/researchmm/AOT-GAN-for-Inpainting), cited as ``` @inproceedings{yan2021agg, author = {Zeng, Yanhong and Fu, Jianlong and Chao, Hongyang and Guo, Baining}, title = {Aggregated Contextual Transformations for High-Resolution Image Inpainting}, booktitle = {Arxiv}, pages={-}, year = {2020} } ``` ## Dataset The CelebA-HQ dataset was created with this codebase: https://github.com/tkarras/progressive_growing_of_gans, owned by NVidia and licensed under Creative Commons Attribution-NonCommercial 4.0 International.
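The card does not show how to load the checkpoint, and AOT-GAN is not wired into the `transformers` Auto classes. One plausible route, sketched below, is to fetch the weights with `huggingface_hub` and load them into the generator from the cited AOT-GAN-for-Inpainting codebase; the checkpoint filename and the generator class are assumptions, not details confirmed by this card.

```python
import torch
from huggingface_hub import hf_hub_download, list_repo_files

repo_id = "NimaBoscarino/aot-gan-celebahq"

# Inspect the repo first rather than guessing the checkpoint filename
print(list_repo_files(repo_id))

# "pytorch_model.bin" is an assumed filename for illustration; substitute the
# actual name printed above
weights_path = hf_hub_download(repo_id=repo_id, filename="pytorch_model.bin")
state_dict = torch.load(weights_path, map_location="cpu")

# The generator architecture itself comes from the cited AOT-GAN-for-Inpainting
# codebase (its InpaintGenerator); build it per that repo's README, then:
# generator.load_state_dict(state_dict); generator.eval()
```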
{"tags": ["face-recognition", "face-generation", "face-segmentation", "generative-adversarial-network"], "datasets": ["celeba-hq"], "metrics": ["L1", "PSNR", "SSIM", "FID"]}
null
NimaBoscarino/aot-gan-celebahq
[ "transformers", "pytorch", "face-recognition", "face-generation", "face-segmentation", "generative-adversarial-network", "dataset:celeba-hq", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #face-recognition #face-generation #face-segmentation #generative-adversarial-network #dataset-celeba-hq #endpoints_compatible #has_space #region-us
# AOT-GAN CelebA-HQ AOT-GAN is a model that can be used for image in-painting. The CelebA-HQ checkpoint is trained on synthetic human faces, which should make it suitable for touching up and restoring portraits. This model was generated using AOT-GAN-for-Inpainting, cited as ## Dataset The CelebA-HQ dataset was created with this codebase: URL owned by NVidia and licensed under Creative Commons Attribution-NonCommercial 4.0 International.
[ "# AOT-GAN CelebA-HQ\nAOT-GAN is a model that can be used for image in-painting. The CelebA-HQ checkpoint is trained on synthetic human faces, which should make it suitable for touching up and restoring portraits.\n\nThis model was generated using AOT-GAN-for-Inpainting, cited as", "## Dataset\nThe CelebA-HQ dataset was created with this codebase: URL owned by NVidia and licensed under Creative Commons Attribution-NonCommercial 4.0 International." ]
[ "TAGS\n#transformers #pytorch #face-recognition #face-generation #face-segmentation #generative-adversarial-network #dataset-celeba-hq #endpoints_compatible #has_space #region-us \n", "# AOT-GAN CelebA-HQ\nAOT-GAN is a model that can be used for image in-painting. The CelebA-HQ checkpoint is trained on synthetic human faces, which should make it suitable for touching up and restoring portraits.\n\nThis model was generated using AOT-GAN-for-Inpainting, cited as", "## Dataset\nThe CelebA-HQ dataset was created with this codebase: URL owned by NVidia and licensed under Creative Commons Attribution-NonCommercial 4.0 International." ]
[ 61, 83, 40 ]
[ "passage: TAGS\n#transformers #pytorch #face-recognition #face-generation #face-segmentation #generative-adversarial-network #dataset-celeba-hq #endpoints_compatible #has_space #region-us \n# AOT-GAN CelebA-HQ\nAOT-GAN is a model that can be used for image in-painting. The CelebA-HQ checkpoint is trained on synthetic human faces, which should make it suitable for touching up and restoring portraits.\n\nThis model was generated using AOT-GAN-for-Inpainting, cited as## Dataset\nThe CelebA-HQ dataset was created with this codebase: URL owned by NVidia and licensed under Creative Commons Attribution-NonCommercial 4.0 International." ]
[ -0.12059642374515533, 0.06364324688911438, -0.0011887848377227783, 0.035698458552360535, 0.07676995545625687, 0.09274786710739136, 0.09868425130844116, 0.06846047937870026, 0.07629382610321045, -0.01592729613184929, 0.09985877573490143, -0.0329742431640625, 0.07716236263513565, 0.11836022883653641, 0.05193975567817688, -0.11295205354690552, 0.033130913972854614, 0.13968223333358765, 0.001079159090295434, 0.07562339305877686, 0.05427584424614906, -0.008050316013395786, 0.14119981229305267, 0.018635807558894157, -0.29104387760162354, 0.02432957850396633, -0.031781233847141266, -0.04504946619272232, 0.11557567864656448, 0.031154103577136993, 0.0596175454556942, 0.05864642933011055, -0.019971134141087532, -0.04112904518842697, 0.042482901364564896, -0.034593671560287476, -0.10855358093976974, 0.04423164576292038, -0.08287320286035538, -0.02025105245411396, 0.31278809905052185, 0.018613623455166817, -0.09537719935178757, -0.00036568648647516966, -0.08660789579153061, -0.19932685792446136, 0.020864015445113182, 0.12827640771865845, 0.0029885012190788984, 0.03432070091366768, 0.01803424581885338, 0.09159534424543381, -0.04893168807029724, 0.06258468329906464, 0.10022477060556412, -0.266017884016037, -0.03910617157816887, 0.12956073880195618, -0.02468252182006836, -0.09647790342569351, 0.011290703900158405, 0.08349573612213135, 0.010933810845017433, 0.05208659917116165, 0.05436071753501892, -0.07875926792621613, -0.12170548737049103, -0.010507139377295971, -0.014263846911489964, -0.007324853911995888, 0.14603422582149506, 0.009642831981182098, -0.02068638615310192, -0.014088459312915802, -0.051937036216259, -0.029393920674920082, -0.009698697365820408, 0.006761907134205103, -0.034166399389505386, -0.01110159233212471, -0.1118258610367775, -0.009852555580437183, -0.0654931515455246, -0.03638770058751106, -0.061131253838539124, -0.08975564688444138, 0.024145418778061867, 0.11686703562736511, -0.10409469902515411, 0.0841701328754425, -0.12000676244497299, -0.07949236035346985, -0.07430732250213623, -0.05770798772573471, 0.0064298007637262344, 0.055755890905857086, -0.012040662579238415, 0.04839874804019928, 0.06473463773727417, 0.14086300134658813, 0.13506653904914856, -0.026875309646129608, -0.06362447142601013, 0.07656842470169067, 0.022876210510730743, 0.11283701658248901, -0.1259213387966156, -0.08560699224472046, 0.09823755919933319, -0.06583862751722336, 0.02081199921667576, -0.07404523342847824, -0.11987311393022537, -0.10127127915620804, 0.06392237544059753, 0.009767194278538227, -0.019898494705557823, 0.11162279546260834, -0.012164353393018246, 0.007592127658426762, 0.1726963371038437, 0.005237886682152748, 0.016340281814336777, 0.01920301653444767, 0.0054019601084291935, -0.005933098029345274, 0.17867924273014069, -0.0054070535115897655, -0.024914110079407692, -0.09564226865768433, -0.01011054776608944, -0.04321720078587532, -0.02703498862683773, -0.014362476766109467, 0.007138948421925306, -0.08255691826343536, 0.036386534571647644, -0.19514082372188568, -0.008345452137291431, -0.0023498081136494875, 0.08109012246131897, -0.046244844794273376, -0.003394140163436532, -0.036089081317186356, -0.038923513144254684, -0.046087440103292465, -0.010372988879680634, -0.11172852665185928, -0.024650244042277336, 0.01989872194826603, 0.058230627328157425, 0.1849627047777176, -0.24925486743450165, 0.010813676752150059, -0.04752635583281517, 0.015043691731989384, -0.01710360124707222, -0.01888715662062168, 0.05491838976740837, 0.052265655249357224, 0.012644415721297264, -0.027315301820635796, 
-0.05620352551341057, 0.007514262106269598, 0.021492445841431618, 0.1172850951552391, -0.12581148743629456, -0.027115073055028915, 0.25194084644317627, -0.06283611804246902, -0.2263089120388031, 0.08078092336654663, -0.010900963097810745, 0.12405296415090561, 0.0811719223856926, 0.06786943227052689, 0.03556809201836586, -0.06370346993207932, -0.011758851818740368, -0.0785716250538826, -0.033336468040943146, -0.21327486634254456, -0.04844500124454498, 0.12229563295841217, -0.1291847974061966, 0.03489486873149872, -0.1780548095703125, 0.13611556589603424, -0.05219513177871704, -0.04887155070900917, 0.01458643563091755, -0.0639738216996193, -0.16382363438606262, 0.055907636880874634, 0.032489314675331116, 0.024038514122366905, -0.034313082695007324, -0.3186051845550537, -0.004608467686921358, -0.11173297464847565, 0.061625752598047256, -0.11361426115036011, 0.12229657918214798, -0.04476805031299591, 0.0431019626557827, -0.03187265992164612, 0.0030878365505486727, 0.010280140675604343, 0.051829222589731216, 0.039166294038295746, 0.0020139673724770546, -0.005148719530552626, -0.004448133986443281, -0.014370077289640903, -0.015755854547023773, 0.022641507908701897, -0.015935121104121208, 0.08537214994430542, -0.08424220234155655, -0.02066885307431221, -0.03916623443365097, 0.07079742848873138, -0.0013434868305921555, 0.020636001601815224, -0.034617990255355835, 0.03857537731528282, 0.023118333891034126, 0.038567084819078445, -0.0208115316927433, 0.033128771930933, -0.04416626691818237, -0.009731318801641464, 0.06246782839298248, 0.016115201637148857, -0.08837007731199265, 0.13832834362983704, -0.04644050821661949, 0.08878744393587112, 0.11404789239168167, -0.1002763956785202, -0.03357723355293274, 0.04393729940056801, -0.0070053692907094955, -0.005585937760770321, 0.1005743145942688, 0.019264506176114082, 0.10395923256874084, -0.08417656272649765, 0.07362094521522522, -0.04321375861763954, 0.06893236190080643, 0.026834072545170784, 0.02430007793009281, -0.02152944914996624, -0.019372042268514633, 0.277040034532547, -0.11058412492275238, 0.018071992322802544, 0.078436940908432, 0.011117566376924515, 0.11159683018922806, 0.08873884379863739, 0.007903722114861012, -0.05710151046514511, -0.04189392551779747, 0.02261045016348362, 0.18468670547008514, -0.1958133578300476, -0.09450298547744751, 0.021916134282946587, -0.09514383971691132, 0.05725351348519325, -0.06867223232984543, -0.02013634704053402, -0.006506489124149084, -0.016806762665510178, -0.053500089794397354, 0.08327988535165787, -0.08357948809862137, 0.07723424583673477, 0.004063804168254137, -0.14604692161083221, 0.01148141548037529, -0.025851771235466003, -0.053235866129398346, 0.14031901955604553, 0.006741904187947512, -0.23812280595302582, -0.04336469992995262, -0.03710214048624039, -0.012374315410852432, 0.03780851885676384, 0.03696667030453682, 0.015801256522536278, 0.010484525933861732, -0.0521659180521965, 0.05164312571287155, 0.06378515064716339, 0.029637683182954788, -0.03079286217689514, -0.06575441360473633, -0.03595395386219025, -0.04304663464426994, -0.02002129517495632, -0.08008786290884018, -0.05725938454270363, 0.17472068965435028, -0.18909163773059845, 0.20590868592262268, 0.010619379580020905, -0.00008197992428904399, 0.04316593334078789, -0.059969231486320496, 0.18073803186416626, -0.07186280936002731, 0.0687592476606369, 0.017169808968901634, 0.051617398858070374, -0.008810307830572128, 0.14640893042087555, -0.02001062221825123, -0.11126089096069336, -0.05959472805261612, -0.08887985348701477, -0.12344709783792496, 
0.05194840207695961, -0.05064475163817406, -0.0778469517827034, -0.021794112399220467, 0.059127889573574066, 0.02447441965341568, 0.09764739125967026, 0.07380788773298264, 0.01186139602214098, 0.010545427910983562, -0.0014877585927024484, 0.027544258162379265, -0.031232593581080437, 0.020376935601234436, 0.035433489829301834, -0.036131054162979126, -0.06383541971445084, 0.035013314336538315, 0.1570066213607788, 0.24572017788887024, 0.017067695036530495, 0.0031151052098721266, 0.0683147981762886, 0.17643189430236816, 0.08429518342018127, 0.08539384603500366, -0.024299725890159607, -0.006503757555037737, -0.018370794132351875, 0.0014061345718801022, 0.009253628551959991, 0.025306852534413338, 0.06763307750225067, -0.06006155163049698, 0.03386472165584564, -0.06229320541024208, 0.008662164211273193, 0.1486087143421173, 0.053918976336717606, -0.3055080771446228, 0.064632847905159, -0.02335386350750923, 0.16087695956230164, -0.13201279938220978, 0.03831151872873306, 0.045021895319223404, -0.021002469584345818, 0.09375745803117752, -0.038597702980041504, 0.046727750450372696, -0.021292999386787415, 0.015291092917323112, 0.05647057294845581, -0.20367197692394257, 0.07697070389986038, 0.027690047398209572, -0.16943462193012238, 0.1363818496465683, -0.028745656833052635, 0.01024395227432251, -0.04941853880882263, -0.07634641230106354, 0.013500409200787544, 0.2217123657464981, 0.17492173612117767, 0.01373294461518526, 0.027631986886262894, -0.004568703006953001, 0.0012966640060767531, 0.15149670839309692, -0.03714035078883171, -0.07216041535139084, 0.05608363822102547, 0.0268317349255085, -0.0339055061340332, 0.025204438716173172, 0.18452338874340057, -0.046759046614170074, -0.11073778569698334, 0.027383465319871902, 0.17725422978401184, 0.018641272559762, 0.007618504110723734, 0.021374009549617767, -0.02031060867011547, 0.08521654456853867, -0.032076191157102585, -0.025763263925909996, -0.07642325013875961, 0.017688363790512085, 0.030302736908197403, -0.06688127666711807, 0.034944888204336166, -0.0356406643986702, 0.01170949824154377, -0.009077093563973904, -0.20233963429927826, 0.0752616748213768, -0.06015859916806221, 0.06822177022695541, -0.037453584372997284, -0.0010881865164265037, 0.05515633523464203, -0.03626270219683647, 0.057077620178461075, 0.028356580063700676, -0.126937597990036, -0.018319077789783478, 0.07984946668148041, 0.012711449526250362, -0.10709141939878464, 0.03787212446331978, -0.0752330794930458, -0.1936744749546051, -0.030745523050427437, 0.08640515059232712, 0.03858662769198418, 0.07219494879245758, -0.094990573823452, 0.04436248540878296, 0.10559771955013275, -0.04010820388793945, -0.2696264088153839, -0.10118452459573746, -0.0880555659532547, 0.13747048377990723, -0.08005630224943161, -0.11186966300010681, 0.06058558076620102, -0.0667760893702507, -0.024358661845326424, 0.14663349092006683, -0.1607675850391388, -0.027202727273106575, 0.18800611793994904, 0.03082883358001709, 0.36153289675712585, -0.11850790679454803, 0.0433335080742836, -0.03909067064523697, -0.1138109490275383, 0.10756296664476395, -0.008557576686143875, 0.059804584830999374, -0.1398351639509201, 0.0690384954214096, -0.03526842221617699, -0.02782067283987999, 0.052628058940172195, 0.1091979369521141, 0.07132496684789658, -0.04894350841641426, -0.043616291135549545, 0.09354933351278305, -0.04364161938428879, 0.07007662951946259, -0.02974281646311283, 0.03012496419250965, -0.0621396005153656, -0.008350452408194542, -0.1390710324048996, 0.1000608429312706, 0.011422715149819851, -0.004090069327503443, 
-0.10205139964818954, 0.01211321447044611, 0.021924152970314026, 0.06461922079324722, 0.053343869745731354, -0.058571506291627884, 0.10733772069215775, -0.09212826192378998, -0.016546213999390602, -0.13012932240962982, 0.03515204042196274, -0.02391396090388298, -0.03343058004975319, 0.1663188934326172, -0.02178000472486019, 0.021712051704525948, 0.09061140567064285, -0.03663421422243118, -0.000035431097785476595, 0.050078947097063065, -0.07554330676794052, 0.011846509762108326, 0.1306018978357315, -0.1253744214773178, 0.10039614140987396, -0.014531044289469719, -0.10364978760480881, 0.12894122302532196, 0.11869111657142639, 0.11419886350631714, -0.0649331733584404, -0.03629741072654724, -0.019778814166784286, 0.00351652130484581, -0.03779501095414162, -0.08381273597478867, 0.07255401462316513, -0.04188808798789978, -0.05725652724504471, -0.0457671619951725, 0.008155577816069126, 0.03402131795883179, 0.0088860634714365, -0.042060136795043945, -0.144272580742836, -0.11778780817985535, -0.06083507090806961, 0.213798388838768, -0.238371342420578, -0.09890792518854141, 0.009391029365360737, -0.07777798175811768, 0.02267010696232319, 0.09977313131093979, 0.05215098708868027, 0.03554182127118111, 0.0011437180219218135, -0.0777551680803299, -0.083102285861969, -0.02408592775464058, -0.1492345929145813, 0.0695890411734581, 0.01060204952955246, -0.15865622460842133, 0.027781104668974876, 0.13793644309043884, -0.07361964136362076, -0.005624281242489815, -0.06566546857357025, 0.02659727819263935, -0.1157156452536583, -0.009143400005996227, -0.045270878821611404, -0.015549381263554096, 0.028740592300891876, -0.042337674647569656, -0.09153789281845093, 0.017123335972428322, -0.04126273840665817, 0.023894265294075012, -0.0200077835470438, -0.03521498292684555, -0.05548962205648422, -0.052655793726444244, -0.0064757647924125195, -0.060575660318136215, 0.09890001267194748, 0.008866065181791782, -0.11348146945238113, 0.003202949883416295, -0.1613471657037735, -0.10754745453596115, 0.031563982367515564, 0.01232623215764761, 0.002967654727399349, -0.02532481960952282, 0.041948672384023666, 0.07484666258096695, -0.006870635785162449, 0.00753775704652071, 0.1130613312125206, 0.00521423714235425, 0.021414751186966896, -0.11520781368017197, -0.030398281291127205, 0.03616650775074959, 0.03845781087875366, 0.050141509622335434, 0.09008853882551193, 0.010982215404510498, -0.04056745022535324, 0.025111498311161995, -0.05405746027827263, -0.01971663348376751, -0.009940765798091888, -0.021393191069364548, -0.05624089017510414, -0.0564996562898159, 0.03817371651530266, -0.023514768108725548, 0.23803232610225677, -0.020468823611736298, -0.10014903545379639, -0.05878806859254837, -0.005135280545800924, -0.0897870808839798, -0.046944864094257355, 0.12811113893985748, -0.028470885008573532, -0.013373201712965965, 0.10757230967283249, 0.07197083532810211, -0.020037688314914703, 0.0855899453163147, 0.056127045303583145, 0.003244678722694516, 0.14870089292526245, 0.11823702603578568, 0.08623715490102768, -0.030161956325173378, 0.03527923300862312, -0.14913654327392578, -0.10989541560411453, 0.07702144235372543, 0.05077337473630905, 0.06121540069580078, 0.1882237046957016, -0.018988043069839478, 0.004284291993826628, 0.004791600164026022, -0.03341614082455635, -0.023593928664922714, -0.2221708744764328, -0.0768769308924675, -0.07458437234163284, 0.02428843267261982, -0.039996299892663956, -0.15113747119903564, 0.14203687012195587, -0.0064809503965079784, -0.020797446370124817, 0.12368807941675186, 0.047509320080280304, 
-0.07650972157716751, 0.1421346664428711, 0.02987484261393547, -0.06493746489286423, 0.029869437217712402, 0.10740575194358826, -0.003361277049407363, 0.11225436627864838, 0.01969200186431408, 0.026798609644174576, 0.04951870068907738, 0.06253936886787415, -0.08162818104028702, -0.05175095424056053, -0.0587766170501709, 0.01926255412399769, -0.019466673955321312, 0.03080768510699272, -0.029152289032936096, -0.03339007869362831, 0.045405104756355286, -0.034950848668813705, 0.0012838460970669985, 0.007679613772779703, -0.13231244683265686, 0.08854793012142181, -0.06086282432079315, -0.03451889008283615, -0.06365105509757996, 0.01557337399572134, -0.035095203667879105, 0.2532162070274353, 0.31443190574645996, -0.06169945001602173, 0.006890782620757818, 0.012459555640816689, 0.029983853921294212, -0.002882040571421385, 0.10945997387170792, 0.10766185075044632, 0.2489432841539383, -0.002071397379040718, -0.09014295041561127, -0.0906655564904213, 0.008211320266127586, 0.0034691791515797377, 0.02085733786225319, 0.007765633519738913, -0.07457559555768967, -0.08642291277647018, 0.05911986529827118, -0.011030446738004684, -0.03576551750302315, 0.24483178555965424, -0.09627882391214371, -0.04780706763267517, -0.008227298967540264, -0.016178589314222336, 0.00273062102496624, 0.03571417182683945, -0.12435216456651688, -0.005637860391288996, 0.19119113683700562, 0.0077814022079110146, -0.16076219081878662, 0.038590531796216965, 0.006427268497645855, -0.17551609873771667, 0.18313035368919373, -0.019377443939447403, 0.031135454773902893, 0.054601166397333145, 0.022350028157234192, -0.047169286757707596, 0.04664788022637367, -0.022850334644317627, 0.13935251533985138, -0.0181170292198658, 0.018760347738862038, -0.016798483207821846, 0.012775289826095104, 0.04237169399857521, 0.06116354838013649, -0.016087369993329048, -0.05886157974600792, 0.023408569395542145, -0.06962891668081284, -0.002320236060768366, -0.05661656707525253, 0.08343677222728729, 0.0040403977036476135, -0.020329615101218224, -0.06282561272382736, -0.01002661045640707, 0.004349996335804462, 0.03498010337352753, 0.014024176634848118, -0.04867884889245033, 0.004233162384480238, -0.04828694090247154, -0.11139329522848129, 0.025175968185067177, -0.1516866832971573, 0.021399542689323425, -0.12369852513074875, -0.05417484790086746, -0.018989337608218193, 0.08733338862657547, -0.008553902618587017, 0.021466610953211784, -0.03513888269662857, 0.1026703342795372, 0.03811751306056976, 0.049254484474658966, -0.13130898773670197, -0.15610972046852112 ]
null
null
transformers
# AOT-GAN Places2 AOT-GAN is a model that can be used for image in-painting. The Places2 checkpoint is trained on the Places2 scene dataset, which should make it suitable for touching up and restoring images of landscapes, buildings, and other natural and developed places. This model was generated using [AOT-GAN-for-Inpainting](https://github.com/researchmm/AOT-GAN-for-Inpainting), cited as ``` @inproceedings{yan2021agg, author = {Zeng, Yanhong and Fu, Jianlong and Chao, Hongyang and Guo, Baining}, title = {Aggregated Contextual Transformations for High-Resolution Image Inpainting}, booktitle = {Arxiv}, pages={-}, year = {2020} } ``` ## Dataset The Places2 dataset can be found here: http://places2.csail.mit.edu/download.html
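Loading works the same way as for the CelebA-HQ checkpoint above; only the training data differs. As a complement to that loading sketch, the block below illustrates the generic inpainting call pattern once a generator is in memory: mask out the hole, run the generator, then composite the prediction back into the original image. The forward signature `generator(masked_image, mask)` and the [-1, 1] input range are assumptions taken from the cited codebase, not from this card.

```python
import torch

def inpaint(generator: torch.nn.Module, image: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """Sketch of AOT-GAN inference.

    image: (1, 3, H, W) tensor, assumed to be normalized to [-1, 1]
    mask:  (1, 1, H, W) tensor with 1.0 marking the region to fill
    """
    masked_image = image * (1 - mask)              # blank out the hole
    with torch.no_grad():
        predicted = generator(masked_image, mask)  # assumed forward signature
    # Keep original pixels outside the hole, use the prediction inside it
    return image * (1 - mask) + predicted * mask
```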
{"tags": ["scene-recognition", "scene-generation", "generative-adversarial-network"], "datasets": ["places2"], "metrics": ["L1", "PSNR", "SSIM", "FID"]}
null
NimaBoscarino/aot-gan-places2
[ "transformers", "pytorch", "scene-recognition", "scene-generation", "generative-adversarial-network", "dataset:places2", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #scene-recognition #scene-generation #generative-adversarial-network #dataset-places2 #endpoints_compatible #has_space #region-us
# AOT-GAN Places2 AOT-GAN is a model that can be used for image in-painting. The Places2 checkpoint is trained on a dataset which should make it suitable for touching up and restoring images of landscapes, buildings, and other natural and developed places. This model was generated using AOT-GAN-for-Inpainting, cited as ## Dataset The Places2 dataset can be found here: URL
[ "# AOT-GAN Places2\nAOT-GAN is a model that can be used for image in-painting. The Places2 checkpoint is trained on a dataset which should make it suitable for touching up and restoring images of landscapes, buildings, and other natural and developed places.\n\nThis model was generated using AOT-GAN-for-Inpainting, cited as", "## Dataset\nThe Places2 dataset can be found here: URL" ]
[ "TAGS\n#transformers #pytorch #scene-recognition #scene-generation #generative-adversarial-network #dataset-places2 #endpoints_compatible #has_space #region-us \n", "# AOT-GAN Places2\nAOT-GAN is a model that can be used for image in-painting. The Places2 checkpoint is trained on a dataset which should make it suitable for touching up and restoring images of landscapes, buildings, and other natural and developed places.\n\nThis model was generated using AOT-GAN-for-Inpainting, cited as", "## Dataset\nThe Places2 dataset can be found here: URL" ]
[ 55, 86, 15 ]
[ "passage: TAGS\n#transformers #pytorch #scene-recognition #scene-generation #generative-adversarial-network #dataset-places2 #endpoints_compatible #has_space #region-us \n# AOT-GAN Places2\nAOT-GAN is a model that can be used for image in-painting. The Places2 checkpoint is trained on a dataset which should make it suitable for touching up and restoring images of landscapes, buildings, and other natural and developed places.\n\nThis model was generated using AOT-GAN-for-Inpainting, cited as## Dataset\nThe Places2 dataset can be found here: URL" ]
[ -0.1521386355161667, 0.13262362778186798, -0.0031547776889055967, 0.09317400306463242, 0.07158641517162323, 0.02821098081767559, 0.0811384841799736, 0.10913333296775818, 0.09956803172826767, -0.054206132888793945, 0.06240123137831688, 0.034347642213106155, 0.04356715455651283, 0.027432242408394814, 0.058489538729190826, -0.22554077208042145, -0.04023612663149834, 0.09938796609640121, -0.23217681050300598, 0.09610696882009506, 0.061157338321208954, -0.02536122128367424, 0.10910431295633316, 0.04326889291405678, -0.17651541531085968, 0.03438368812203407, -0.058357954025268555, -0.09587116539478302, 0.2309548556804657, -0.018343091011047363, 0.04541689157485962, 0.018158847466111183, 0.04360048100352287, 0.0073323966935276985, 0.03311594948172569, 0.00430286955088377, -0.09252482652664185, 0.06601554900407791, 0.04557213559746742, -0.003915173001587391, 0.021171746775507927, 0.034046780318021774, -0.11492855101823807, -0.0340454988181591, -0.12948821485042572, -0.18836863338947296, 0.050718046724796295, 0.011033708229660988, -0.011749588884413242, 0.07534953206777573, 0.0018781304825097322, 0.15185879170894623, -0.07589209079742432, 0.05942240357398987, 0.11387453973293304, -0.20068785548210144, -0.03700980916619301, 0.15489572286605835, 0.048152171075344086, 0.10763993859291077, -0.00787251815199852, 0.06355573236942291, -0.022495275363326073, 0.07485372573137283, 0.0038814248982816935, -0.053151585161685944, -0.06696608662605286, 0.0606766939163208, -0.047285787761211395, -0.052732810378074646, 0.13842061161994934, -0.016082899644970894, 0.01770027168095112, 0.016679460182785988, -0.15088754892349243, 0.09930908679962158, -0.016998423263430595, -0.06479932367801666, -0.02994263730943203, 0.04085325822234154, -0.18292805552482605, -0.08834324777126312, -0.11926975846290588, 0.019251268357038498, -0.004193319473415613, 0.0012269525323063135, 0.009621474891901016, 0.1342281699180603, -0.24594177305698395, 0.18357490003108978, -0.04747428745031357, -0.06329531222581863, -0.02017781510949135, -0.02771470509469509, -0.04347572103142738, 0.057719770818948746, -0.02998877689242363, 0.17497733235359192, 0.05594317987561226, 0.14391174912452698, 0.19883719086647034, -0.061113037168979645, -0.09765204787254333, 0.12450477480888367, 0.04120052605867386, 0.027386443689465523, -0.0983080118894577, -0.11497130244970322, 0.05174257233738899, -0.09454181790351868, 0.009287700057029724, -0.031548675149679184, -0.1366180181503296, -0.08949902653694153, 0.010339832864701748, -0.014131966978311539, 0.051869746297597885, 0.03229682520031929, -0.04661480709910393, -0.035652920603752136, 0.0852050632238388, 0.026590270921587944, 0.02307124435901642, -0.04342247545719147, 0.03177694231271744, -0.00606357678771019, 0.05777667462825775, 0.019318586215376854, -0.052662067115306854, -0.10030501335859299, -0.030057314783334732, 0.024950774386525154, -0.0013763236347585917, 0.0030455775558948517, -0.00580908078700304, -0.032114628702402115, 0.06562711298465729, -0.19115406274795532, -0.16625648736953735, 0.02558579295873642, 0.10585501790046692, -0.01744755171239376, -0.035145580768585205, -0.009588375687599182, -0.034255802631378174, -0.01362601574510336, 0.015644878149032593, 0.004294298589229584, -0.01284688338637352, 0.052566319704055786, -0.024457864463329315, 0.18800689280033112, -0.0971839502453804, -0.0028883807826787233, -0.10237064957618713, -0.03521132841706276, 0.08183087408542633, -0.003555652452632785, 0.021107368171215057, 0.007304065860807896, -0.025221101939678192, -0.013362555764615536, 
-0.09465627372264862, 0.05400844290852547, 0.027893191203475, 0.15974999964237213, -0.13336780667304993, -0.050794512033462524, 0.20246842503547668, -0.10524356365203857, -0.13231626152992249, 0.11515036970376968, -0.03581807762384415, 0.16315658390522003, 0.05506708100438118, 0.05722236633300781, 0.028039835393428802, -0.00953364185988903, 0.05330486223101616, 0.077092245221138, -0.16299250721931458, -0.22883746027946472, 0.005659156013280153, 0.19067087769508362, -0.16168250143527985, 0.02212318778038025, -0.19500719010829926, 0.12336228787899017, -0.04275381565093994, -0.05922604352235794, -0.03023097850382328, 0.015952952206134796, -0.06690072268247604, 0.03770199790596962, 0.1005198284983635, 0.02078351192176342, -0.07279593497514725, -0.20129117369651794, 0.03791797533631325, 0.007921977899968624, 0.10383228957653046, -0.029039282351732254, 0.05396004021167755, -0.024936044588685036, -0.01535406056791544, -0.12548629939556122, 0.10217788815498352, -0.03115956299006939, 0.07846035808324814, -0.05442410707473755, 0.1470140665769577, 0.04485105723142624, -0.008740103803575039, -0.014327628538012505, 0.014652568846940994, -0.03400348126888275, -0.047364991158246994, 0.04982028529047966, -0.016346830874681473, -0.017346270382404327, -0.0878315269947052, 0.02105777896940708, -0.03167565166950226, -0.001681242254562676, 0.021343719214200974, 0.09421150386333466, 0.03760385140776634, -0.000189449725439772, -0.06995584070682526, -0.03015672229230404, -0.14210984110832214, -0.05156952515244484, 0.05857725441455841, 0.07026966661214828, -0.04273196682333946, 0.1495818942785263, 0.09860934317111969, 0.15288248658180237, 0.11218652874231339, -0.08571399748325348, -0.01807365007698536, 0.04851693660020828, -0.10255665332078934, 0.03919321298599243, -0.04608982056379318, -0.036949336528778076, 0.02081589587032795, -0.001232675276696682, 0.0961606428027153, -0.005947756581008434, 0.02941123954951763, 0.002189749153330922, -0.007942868396639824, -0.05971360206604004, -0.005711269099265337, 0.24253587424755096, -0.13115374743938446, 0.08454925566911697, 0.1152929812669754, 0.03442014381289482, -0.04658620059490204, 0.101361483335495, -0.006037540268152952, -0.053364645689725876, -0.010191178880631924, 0.027132375165820122, 0.17131365835666656, -0.1623813807964325, -0.042379382997751236, 0.03572928532958031, -0.06263400614261627, 0.0525616817176342, -0.14022241532802582, -0.037631962448358536, 0.0009215475874952972, 0.07127489894628525, 0.017841104418039322, 0.03954177349805832, -0.08224736899137497, 0.11013133078813553, -0.06928564608097076, -0.13003088533878326, 0.07566942274570465, 0.008671490475535393, -0.021969018504023552, 0.1595546305179596, -0.015308638103306293, -0.3961743414402008, -0.06108125299215317, 0.03184371069073677, 0.12465567141771317, 0.004477785900235176, 0.07242558151483536, -0.059234436601400375, 0.0060654510743916035, 0.034319642931222916, 0.029209863394498825, -0.02090349607169628, -0.0090539725497365, -0.005453256890177727, 0.018747739493846893, -0.07708963006734848, -0.06100623309612274, 0.002393345581367612, -0.0316648855805397, -0.0450693778693676, 0.2353142648935318, -0.0726521760225296, 0.11092820763587952, 0.011597800999879837, 0.049849189817905426, 0.04776079207658768, 0.0018119299784302711, 0.13569660484790802, -0.12073464691638947, 0.13423891365528107, 0.015378816984593868, 0.03686506673693657, 0.06607069075107574, 0.07563647627830505, 0.04051996394991875, -0.09049711376428604, -0.042023953050374985, -0.03154171630740166, -0.14101704955101013, -0.030658766627311707, 
-0.037254150956869125, -0.06371357291936874, -0.025679685175418854, 0.08540613204240799, 0.05424898862838745, 0.03853519633412361, 0.1506919413805008, 0.0416073277592659, -0.02875184454023838, -0.05874740332365036, 0.02803054451942444, -0.015019201673567295, -0.044144533574581146, 0.055915940552949905, -0.01279272511601448, -0.18468628823757172, 0.09435804188251495, 0.1924334466457367, 0.2989729642868042, 0.012370879761874676, -0.04135192185640335, 0.02162162773311138, 0.16419771313667297, 0.10607745498418808, 0.021052250638604164, 0.05243770033121109, -0.007324610371142626, -0.06674828380346298, 0.02038365788757801, 0.1541939079761505, 0.10764966905117035, 0.0028223979752510786, -0.13412876427173615, 0.08791591972112656, -0.01846040040254593, 0.02296980284154415, 0.0037082359194755554, 0.11306282877922058, -0.3260551691055298, 0.06587696075439453, 0.00923011265695095, 0.11355043202638626, -0.1324634552001953, 0.08691832423210144, 0.14788393676280975, -0.06711205840110779, -0.029772749170660973, -0.11183294653892517, 0.034597113728523254, 0.03975873067975044, -0.0036854480858892202, 0.10179001092910767, -0.18247181177139282, 0.05297918990254402, 0.04896550253033638, -0.11337579786777496, 0.26761913299560547, -0.040473341941833496, -0.0714360699057579, -0.0840667113661766, -0.02218414470553398, 0.06467825174331665, 0.08244649320840836, 0.2496698945760727, 0.06186821311712265, -0.01103189680725336, -0.011508055031299591, -0.02510206215083599, 0.08088182657957077, -0.04230885952711105, -0.1066640168428421, 0.010172799229621887, 0.025765934959053993, -0.030790355056524277, 0.0171212125569582, 0.05706155672669411, -0.06763068586587906, -0.12976163625717163, 0.005221656523644924, 0.1629069745540619, -0.07502438873052597, 0.012917920015752316, -0.0605429969727993, 0.03032284788787365, 0.15952323377132416, 0.13413149118423462, -0.02553774230182171, -0.09730693697929382, 0.10071683675050735, 0.009744212031364441, -0.03752516955137253, 0.022604163736104965, 0.020385390147566795, 0.07723037898540497, -0.09054703265428543, -0.1837974190711975, 0.08294438570737839, -0.11678880453109741, 0.08989471197128296, -0.07768920809030533, 0.09050045907497406, 0.12086033076047897, -0.04769979044795036, 0.026471387594938278, 0.03852168470621109, -0.12113142013549805, -0.010141735896468163, 0.11108288168907166, -0.08706874400377274, -0.08046405762434006, -0.016673162579536438, -0.0042741489596664906, -0.03433576598763466, 0.019067231565713882, 0.007794625125825405, 0.1585538238286972, 0.035715967416763306, -0.12861798703670502, 0.0894826129078865, 0.042113158851861954, -0.061523597687482834, -0.29132887721061707, -0.0100041339173913, -0.054724253714084625, 0.0882699117064476, 0.10678516328334808, -0.1557473987340927, 0.14991004765033722, -0.05327644571661949, -0.05851233750581741, 0.09887921810150146, -0.2698887288570404, -0.02449662983417511, 0.20626653730869293, 0.06190333142876625, 0.4296468496322632, -0.05675828084349632, 0.03305811062455177, 0.03872951865196228, -0.051085732877254486, 0.11229326575994492, 0.02388494275510311, 0.02439206838607788, -0.08550428599119186, 0.10193110257387161, 0.005953267216682434, -0.06691593676805496, 0.09702068567276001, 0.1465386301279068, 0.05382772907614708, -0.06386440992355347, -0.1467878371477127, 0.27369335293769836, 0.0020163150038570166, -0.04610234498977661, 0.165648952126503, 0.07919064164161682, -0.09917376935482025, -0.018022755160927773, -0.1040666326880455, 0.08162780106067657, 0.03007417917251587, -0.08045904338359833, -0.08153337240219116, 0.09869322180747986, 
-0.02269207313656807, -0.014104790054261684, 0.10251989215612411, 0.10808749496936798, 0.052177347242832184, 0.04282717406749725, -0.11955271661281586, 0.037477586418390274, -0.07740207016468048, -0.03127102926373482, -0.036357469856739044, 0.1171020120382309, -0.03716365620493889, 0.013968600891530514, 0.10050734877586365, 0.03865215182304382, -0.015650268644094467, 0.10010573267936707, -0.02732667326927185, 0.043036263436079025, 0.12604506313800812, -0.07274061441421509, -0.04665451869368553, -0.1492333859205246, -0.037608444690704346, 0.15250417590141296, 0.09472804516553879, 0.15562383830547333, -0.05520457401871681, -0.04351460188627243, -0.002067319815978408, -0.00849761813879013, 0.002858678810298443, -0.05777633190155029, -0.002319582039490342, -0.01643269881606102, -0.06840673834085464, 0.03206224739551544, -0.016936488449573517, -0.025238318368792534, -0.028514791280031204, -0.04511275514960289, -0.11529526859521866, -0.10439471155405045, 0.013357597403228283, 0.06071993336081505, -0.19950146973133087, -0.03078489378094673, -0.015098998323082924, -0.010445477440953255, 0.06253299117088318, -0.09572400897741318, 0.09160105884075165, 0.00012174609582871199, -0.0817820355296135, -0.04324936866760254, -0.03953542187809944, -0.009755294770002365, -0.003430423093959689, 0.02971966750919819, -0.09395629912614822, -0.055482443422079086, -0.0007128743454813957, 0.11333639919757843, -0.06921587884426117, -0.10747604817152023, -0.07252709567546844, 0.0660824179649353, -0.10408998280763626, -0.0023023823741823435, -0.09269848465919495, -0.037814393639564514, -0.025263212621212006, -0.0812772810459137, -0.05339311435818672, -0.0025721522979438305, -0.08034881949424744, 0.03286487236618996, -0.005614304915070534, -0.03512517735362053, -0.028380027040839195, -0.05055573210120201, 0.03179305046796799, -0.06420742720365524, 0.09234742820262909, 0.04509398341178894, -0.13227416574954987, -0.03629134222865105, -0.07281599938869476, -0.14070874452590942, 0.10055519640445709, 0.023116545751690865, -0.00026244911714456975, -0.02182776667177677, 0.08089527487754822, 0.03587479889392853, 0.05969364941120148, -0.015526087023317814, 0.08530428260564804, -0.0498976856470108, 0.10326883941888809, -0.17593491077423096, -0.04578228294849396, -0.03400728106498718, 0.02935813181102276, 0.02927793748676777, 0.061607036739587784, 0.018944542855024338, -0.01683938503265381, 0.06568422168493271, -0.15770845115184784, -0.014570090919733047, -0.0024622748605906963, -0.039776504039764404, -0.07294595241546631, -0.07168669998645782, 0.013678758405148983, -0.023494282737374306, 0.21239452064037323, -0.021555887535214424, -0.031945206224918365, -0.046949103474617004, 0.02270430140197277, -0.10725364834070206, -0.08270082622766495, 0.10463335365056992, 0.03216477856040001, -0.04913680627942085, 0.03319467976689339, 0.06489425897598267, 0.04324720799922943, 0.029703136533498764, 0.05664489045739174, 0.04744130000472069, 0.08218012005090714, 0.06711901724338531, 0.0029736708384007215, -0.03624065965414047, -0.175380140542984, -0.14831286668777466, -0.0854807049036026, 0.07074917107820511, -0.008734703063964844, -0.014115551486611366, 0.14943373203277588, 0.04144559055566788, 0.01585201732814312, 0.026647472754120827, 0.008842632174491882, -0.06497332453727722, -0.30567941069602966, -0.0706166923046112, -0.13618934154510498, 0.028432132676243782, -0.05277489125728607, -0.07967275381088257, 0.1636982262134552, 0.031056219711899757, -0.02264869213104248, 0.024415291845798492, 0.13587386906147003, -0.08973181992769241, 
0.0845702737569809, 0.001188867725431919, 0.0333276093006134, 0.018389444798231125, 0.04653732106089592, -0.09797709435224533, 0.1259242594242096, -0.04130902141332626, -0.011925307102501392, 0.05277363583445549, 0.07314649969339371, -0.1299630105495453, -0.008466459810733795, -0.10083526372909546, -0.006347506772726774, -0.0515567809343338, -0.06848201155662537, 0.03422597050666809, -0.040398649871349335, 0.06133054196834564, 0.15970107913017273, -0.05927562713623047, -0.00035646132891997695, -0.05006564408540726, 0.12905316054821014, -0.02572372369468212, -0.03758472949266434, 0.029336383566260338, 0.010541762225329876, -0.027401074767112732, 0.19024181365966797, 0.2736947536468506, -0.07331694662570953, -0.027584919705986977, 0.08206940442323685, 0.004872413352131844, 0.054943352937698364, 0.1478254497051239, 0.06365407258272171, 0.2617310881614685, -0.13236834108829498, -0.1883973777294159, -0.07845952361822128, -0.024092713370919228, -0.001075316802598536, 0.09989360719919205, 0.04394768923521042, -0.06071144714951515, -0.13036763668060303, 0.0817931517958641, -0.05589994043111801, 0.02678232640028, 0.08576347678899765, -0.12389115244150162, -0.11332125961780548, 0.014580746181309223, 0.004526827484369278, -0.014788201078772545, 0.07845071703195572, -0.06493467837572098, 0.009641792625188828, 0.13161233067512512, 0.03814211115241051, -0.21159029006958008, 0.034937430173158646, 0.04342084005475044, -0.14567212760448456, 0.09234481304883957, -0.030011150985956192, -0.009167352691292763, 0.041899941861629486, 0.036777324974536896, -0.09642209857702255, -0.045209430158138275, -0.03325873613357544, 0.07681388407945633, -0.07528906315565109, -0.06507684290409088, -0.01521885022521019, -0.10544174164533615, 0.0674840584397316, -0.03993681073188782, 0.011190569028258324, 0.029695240780711174, 0.06362225860357285, -0.14949090778827667, 0.0008454148774035275, 0.000007644202923984267, 0.06650400161743164, -0.0793362706899643, -0.022113768383860588, -0.02712964452803135, -0.04809430614113808, 0.004390772897750139, 0.03211470693349838, -0.0035210521891713142, -0.0009158390457741916, 0.016355765983462334, -0.04638413339853287, -0.15340958535671234, -0.013185204938054085, -0.07318858057260513, -0.11131493747234344, -0.001973428064957261, -0.04796219989657402, -0.028867224231362343, 0.05742860957980156, 0.060136646032333374, 0.01409319881349802, 0.025163935497403145, 0.11116588115692139, 0.00535215251147747, 0.04368162900209427, -0.05523727089166641, -0.08414684236049652 ]
null
null
transformers
# Harry Potter DialoGPT Model
{"tags": ["conversational"]}
text-generation
Ninja5000/DialoGPT-medium-HarryPotter
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Harry Potter DialoGPT Model
[ "# Harry Potter DialoGPT Model" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Harry Potter DialoGPT Model" ]
[ 51, 8 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Harry Potter DialoGPT Model" ]
[ -0.0009023238671943545, 0.07815738022327423, -0.006546166725456715, 0.07792752981185913, 0.10655936598777771, 0.048972971737384796, 0.17639793455600739, 0.12185695022344589, 0.016568755730986595, -0.04774167761206627, 0.11647630482912064, 0.2130284160375595, -0.002118367003276944, 0.024608047679066658, -0.05022026598453522, -0.3065771162509918, 0.0474756620824337, 0.014356585219502449, -0.07174845039844513, 0.11724270135164261, 0.09064973145723343, -0.046179238706827164, 0.08330509811639786, -0.009135239757597446, -0.13198648393154144, -0.039482954889535904, 0.019292812794446945, -0.11745545268058777, 0.1662212759256363, 0.05298272892832756, 0.02469746209681034, -0.008447164669632912, -0.06598151475191116, -0.15036040544509888, 0.037190426141023636, -0.027472136542201042, -0.01080626156181097, 0.05462246760725975, 0.023526115342974663, -0.07521048933267593, 0.170567125082016, 0.17678891122341156, 0.0833497866988182, 0.0349111407995224, -0.14917024970054626, -0.045548245310783386, 0.008950977586209774, 0.05421316996216774, -0.017893504351377487, 0.09349167346954346, -0.019903047010302544, 0.11801653355360031, -0.04491448402404785, 0.09210366010665894, 0.15255063772201538, -0.4016275703907013, -0.027563704177737236, 0.08920855820178986, 0.05989706888794899, 0.12076901644468307, -0.10560955852270126, 0.03972794860601425, -0.0039703017100691795, 0.01236654631793499, -0.014540530741214752, -0.08304883539676666, -0.07308239489793777, 0.032504837960004807, -0.1272556483745575, 0.008525865152478218, 0.23756256699562073, -0.10643257945775986, 0.037069112062454224, -0.09791990369558334, -0.07414398342370987, 0.048336777836084366, -0.053761593997478485, -0.081727035343647, -0.054839808493852615, 0.06347949057817459, 0.004366500303149223, -0.06301609426736832, -0.08326146006584167, -0.0006536149303428829, -0.12781435251235962, 0.17595994472503662, 0.061243366450071335, 0.041611745953559875, -0.21322020888328552, 0.08940251916646957, 0.04477722570300102, -0.04711297154426575, 0.007116159424185753, -0.11796226352453232, 0.04023287072777748, 0.005483259446918964, -0.03256071358919144, -0.021854614838957787, 0.0393419973552227, 0.13909944891929626, -0.01777748204767704, 0.03252175822854042, 0.006831915583461523, 0.05811219662427902, 0.08162496984004974, 0.02222144603729248, 0.019291909411549568, -0.0818009302020073, 0.019385190680623055, -0.08128736168146133, -0.0030400939285755157, -0.048940129578113556, -0.17071883380413055, -0.07477642595767975, 0.052610911428928375, 0.020047198981046677, 0.03746970370411873, 0.08054786175489426, -0.0017944995779544115, -0.05560554191470146, 0.03284840285778046, 0.01671096310019493, -0.020622212439775467, -0.010361049324274063, -0.02412462793290615, 0.19123271107673645, 0.019619356840848923, 0.014111656695604324, -0.12379156798124313, 0.10023640841245651, -0.08179095387458801, 0.0037731381598860025, 0.02743307314813137, -0.04204464703798294, -0.004716555587947369, 0.02917117439210415, 0.023101668804883957, -0.1252521574497223, -0.1099385917186737, -0.0030569476075470448, -0.012054097838699818, -0.036421261727809906, -0.10490952432155609, -0.08483029156923294, -0.012153145857155323, 0.0449371263384819, -0.013397793285548687, 0.007936403155326843, -0.05143149942159653, 0.0985720232129097, -0.0514979362487793, 0.09873400628566742, -0.08342572301626205, 0.06359215080738068, -0.09124887734651566, -0.061886150389909744, -0.11452563107013702, 0.05216052383184433, 0.012905281968414783, 0.066250741481781, 0.016998225823044777, -0.044836658984422684, -0.014836243353784084, 
0.05253177136182785, -0.07656687498092651, 0.1940697431564331, -0.041674621403217316, -0.12459053844213486, 0.24146439135074615, -0.09138800948858261, -0.1802034229040146, 0.12973085045814514, -0.022254703566432, 0.08523941785097122, 0.12802475690841675, 0.20380465686321259, -0.00019822151807602495, -0.01302915159612894, 0.07281201332807541, 0.07031642645597458, -0.09803894907236099, 0.06239739805459976, 0.029653839766979218, -0.008071083575487137, -0.08906278014183044, 0.05762826278805733, 0.046033453196287155, -0.010650773532688618, -0.035073768347501755, -0.001896020956337452, -0.012895751744508743, -0.022185025736689568, 0.14126582443714142, -0.02006692811846733, 0.1300428807735443, -0.06926563382148743, -0.03515486419200897, -0.009500149637460709, 0.03533667325973511, -0.04091939330101013, 0.08151165395975113, -0.0436173714697361, 0.10586477071046829, 0.09034156054258347, 0.053724925965070724, -0.13120363652706146, 0.00466286763548851, -0.015246815048158169, 0.17014820873737335, 0.08964069187641144, 0.05222717300057411, 0.06265474855899811, -0.0020888058934360743, -0.06708643585443497, 0.045407816767692566, 0.13778303563594818, -0.037020038813352585, -0.12218865007162094, -0.1755627691745758, 0.051157694309949875, -0.045444171875715256, 0.10855234414339066, -0.10010123997926712, 0.022670533508062363, -0.055906031280756, 0.07772238552570343, -0.024998966604471207, 0.020512236282229424, -0.0013405600329861045, -0.021700702607631683, -0.08356887847185135, -0.002377772703766823, 0.08597290515899658, -0.02048647589981556, -0.06707409024238586, 0.16556480526924133, -0.16400809586048126, 0.1631954461336136, 0.2116095870733261, -0.28542569279670715, -0.005696662236005068, -0.15163889527320862, -0.0208092350512743, 0.019645055755972862, 0.07834604382514954, 0.026225795969367027, 0.2044338881969452, -0.012928472831845284, 0.16565458476543427, -0.05699567869305611, -0.07730039209127426, -0.06881127506494522, -0.048101142048835754, 0.013522743247449398, 0.09095205366611481, 0.04542696103453636, -0.11962861567735672, 0.13119758665561676, 0.1054433062672615, 0.06484298408031464, 0.12711186707019806, 0.1030748188495636, -0.008113685995340347, 0.07252490520477295, -0.03624548763036728, -0.03462279960513115, -0.09254947304725647, -0.30446043610572815, -0.04840317741036415, 0.0939924493432045, 0.007963384501636028, 0.09285714477300644, -0.0919896736741066, -0.03311870992183685, 0.006042704917490482, 0.009473444893956184, 0.028337622061371803, 0.09653715789318085, 0.013490920886397362, 0.15320514142513275, -0.008011690340936184, -0.03430786728858948, 0.05891305208206177, 0.017982570454478264, -0.09147711098194122, 0.17280617356300354, -0.17050009965896606, -0.27190929651260376, -0.06990014761686325, -0.21745692193508148, -0.013139115646481514, 0.05258983001112938, 0.0786920040845871, -0.11818131804466248, -0.018352627754211426, -0.006239492911845446, 0.05685517191886902, -0.2425733357667923, 0.0004911290016025305, -0.1354890614748001, 0.0501418262720108, -0.1974833607673645, -0.09718500077724457, -0.02271542325615883, -0.013450481928884983, -0.0464281290769577, 0.13365240395069122, -0.1448695808649063, -0.011572926305234432, 0.2329535037279129, 0.032479673624038696, 0.027794739231467247, -0.05020907148718834, 0.19788463413715363, -0.0958966314792633, -0.023973820731043816, 0.11024576425552368, -0.05038975924253464, 0.04834126681089401, 0.06649978458881378, -0.012981836684048176, -0.08557141572237015, 0.023789849132299423, -0.068336620926857, -0.03150583803653717, -0.27926525473594666, 
-0.0930178239941597, -0.09319330751895905, 0.11305391043424606, 0.04079577326774597, 0.06421639025211334, 0.16545771062374115, 0.05191578343510628, -0.024325082078576088, -0.03006586618721485, 0.11609793454408646, 0.12905290722846985, 0.2277202159166336, -0.06067761778831482, 0.10221996158361435, 0.009445492178201675, -0.08203992247581482, 0.06062209978699684, 0.056782789528369904, 0.06324724853038788, 0.02584579586982727, 0.03694582358002663, -0.030939655378460884, 0.1121687963604927, 0.12571842968463898, 0.05258069559931755, 0.0481170229613781, 0.0002127334737451747, -0.0561506561934948, -0.008168719708919525, -0.05726633965969086, 0.06774696707725525, 0.061340972781181335, -0.12918008863925934, -0.08061543852090836, 0.0011613310780376196, 0.06660808622837067, -0.016230419278144836, 0.06823775917291641, -0.13560809195041656, -0.03582429885864258, 0.0790911465883255, -0.07693151384592056, -0.14156894385814667, 0.11972879618406296, -0.026570770889520645, -0.19904157519340515, 0.05265914276242256, 0.007704653777182102, 0.0908159390091896, -0.06360849738121033, 0.05343840271234512, -0.13023801147937775, -0.12935101985931396, -0.018437571823596954, 0.07945099472999573, -0.3450873792171478, 0.13536721467971802, -0.013286802917718887, -0.02876877970993519, -0.06474969536066055, -0.02640824392437935, 0.013905409723520279, 0.12719078361988068, 0.08667250722646713, 0.0008821099763736129, 0.0991629809141159, 0.03823768347501755, 0.04188435152173042, -0.002011700300499797, 0.10950417071580887, 0.0050011589191854, 0.004797275178134441, -0.04982118681073189, 0.007274609990417957, -0.05164213851094246, -0.07472953200340271, 0.08393982797861099, -0.20678792893886566, 0.09087453782558441, -0.03378438204526901, 0.08427679538726807, 0.04304937273263931, -0.018965769559144974, -0.1001204177737236, 0.19745583832263947, -0.012206900864839554, -0.11405988782644272, -0.07517550885677338, -0.02810264565050602, 0.09103139489889145, -0.013817726634442806, 0.012886416167020798, -0.045470476150512695, 0.032183047384023666, -0.1263762265443802, -0.1597503274679184, 0.08734500408172607, -0.04441224783658981, -0.10894393920898438, -0.025462759658694267, 0.20382575690746307, -0.007266622502356768, 0.08242089301347733, 0.01605331338942051, 0.010653935372829437, -0.18066231906414032, -0.04018142446875572, 0.02645772136747837, -0.0016437612939625978, 0.005979063920676708, 0.047698814421892166, 0.019091911613941193, 0.06207629665732384, -0.1069745197892189, -0.013920160941779613, 0.3158324360847473, 0.15978319942951202, -0.00912671908736229, 0.14943915605545044, 0.1093616932630539, -0.08669080585241318, -0.17238758504390717, -0.1171615794301033, -0.1210922971367836, -0.08425768464803696, -0.10681738704442978, -0.1525043100118637, 0.09535340964794159, -0.03392014652490616, 0.03498011827468872, 0.14615866541862488, -0.280263751745224, -0.10949636250734329, 0.13820378482341766, 0.010744688101112843, 0.3510635495185852, -0.12303631007671356, -0.044944874942302704, -0.06214528530836105, -0.16933435201644897, 0.08021392673254013, -0.031203703954815865, 0.11581093072891235, -0.0744495838880539, 0.19395925104618073, 0.01719796098768711, 0.014287159778177738, 0.0916559100151062, 0.05038322135806084, -0.05808406323194504, -0.07368700206279755, -0.10248131304979324, 0.010812131687998772, 0.03546109423041344, 0.010252019390463829, -0.008802837692201138, 0.0211968794465065, -0.11341743916273117, -0.050869911909103394, -0.06302189081907272, 0.0072614275850355625, -0.01001308299601078, -0.042155615985393524, -0.05533592775464058, 
-0.022557416930794716, -0.020093943923711777, 0.02266426384449005, 0.14185629785060883, -0.07527699321508408, 0.18586260080337524, 0.02357078716158867, 0.1586609035730362, -0.11956068128347397, -0.06724818795919418, -0.029193658381700516, -0.05280323326587677, 0.06468886137008667, -0.08884575963020325, -0.027708567678928375, 0.1332162618637085, -0.01903904788196087, 0.04655366763472557, 0.12936700880527496, 0.02046884410083294, 0.015383756719529629, 0.034968774765729904, -0.2578005790710449, -0.07463036477565765, -0.03505445644259453, -0.012416874058544636, 0.05272092670202255, 0.05525677278637886, 0.19735674560070038, -0.03551921248435974, -0.08521962910890579, 0.020131373777985573, 0.02735883742570877, -0.02776256389915943, 0.10749414563179016, 0.019579345360398293, -0.004837906453758478, -0.16151933372020721, 0.08257976174354553, -0.005964108742773533, -0.08297000825405121, 0.028665626421570778, 0.2024049311876297, -0.12141239643096924, -0.10309756547212601, -0.06804922968149185, 0.07315051555633545, -0.09220825880765915, 0.016043387353420258, -0.005091092549264431, -0.1521538347005844, 0.06916408240795135, 0.07598215341567993, 0.04075418785214424, 0.06513199955224991, -0.11743064224720001, -0.015730571001768112, -0.04170290008187294, -0.002195435343310237, 0.03521120920777321, 0.01863143965601921, -0.057492829859256744, 0.15846455097198486, -0.0676199421286583, 0.08538917452096939, -0.0744810476899147, -0.1058846190571785, -0.1395980566740036, 0.04660497233271599, -0.08038312196731567, -0.07247276604175568, -0.12832807004451752, -0.052204377949237823, -0.0067099276930093765, -0.03388519585132599, 0.006552806124091148, -0.06627799570560455, -0.10922821611166, 0.01822470687329769, -0.00743203004822135, -0.009385870769619942, -0.06096754968166351, 0.026706209406256676, 0.06246216222643852, -0.039788868278265, 0.15730851888656616, 0.22509248554706573, -0.13591648638248444, 0.11564400047063828, -0.09797432273626328, -0.105463907122612, 0.046008042991161346, 0.009427277371287346, 0.03594303876161575, 0.0503489226102829, -0.03594081476330757, 0.0044484552927315235, 0.03905477747321129, 0.08074651658535004, 0.08456914126873016, -0.06776505708694458, 0.020801106467843056, -0.05122765153646469, -0.14904099702835083, -0.016655439510941505, -0.0464773029088974, 0.06876829266548157, -0.006725262850522995, 0.11020535975694656, -0.0515950471162796, 0.07739507406949997, -0.07558431476354599, 0.050614211708307266, 0.021146971732378006, -0.14688286185264587, -0.006612539757043123, -0.07093682140111923, 0.042144812643527985, -0.008834975771605968, 0.20241086184978485, -0.03228091076016426, 0.010342049412429333, 0.033811055123806, 0.06203942745923996, -0.01957780309021473, 0.009357001632452011, 0.2014283686876297, 0.12640917301177979, -0.08496357500553131, -0.02679651789367199, 0.06793134659528732, 0.07248228788375854, 0.07093550264835358, 0.10807815194129944, -0.015352966263890266, 0.028434239327907562, 0.07829629629850388, -0.060215238481760025, 0.07576877623796463, -0.08603982627391815, -0.11668483167886734, 0.05793621391057968, 0.012955795042216778, -0.055695828050374985, 0.20305177569389343, 0.19142870604991913, -0.026278704404830933, 0.018410727381706238, -0.0029499190859496593, -0.10117456316947937, -0.15619947016239166, -0.05423750728368759, -0.07170962542295456, -0.1319410353899002, -0.004549739416688681, -0.16646917164325714, 0.022016216069459915, -0.01132756657898426, 0.09506805986166, -0.06855440139770508, -0.01345991250127554, 0.1364889293909073, -0.1055467277765274, 0.0847758799791336, 
-0.024517204612493515, 0.07877567410469055, -0.03746940940618515, -0.018209461122751236, -0.10342709720134735, 0.007514837197959423, 0.01131442841142416, 0.06840907037258148, -0.10897937417030334, 0.02432350255548954, -0.12208317965269089, -0.08617185056209564, -0.026142612099647522, 0.09279687702655792, -0.0403008833527565, 0.15116846561431885, 0.02645145356655121, -0.06710928678512573, -0.004313822835683823, 0.2646709978580475, -0.08046227693557739, -0.08319197595119476, -0.030799202620983124, 0.2152107208967209, 0.04053696244955063, 0.06396269053220749, 0.019140036776661873, 0.038027774542570114, -0.07184682041406631, 0.2957373559474945, 0.34401440620422363, -0.1318037211894989, -0.007773484103381634, 0.04225075617432594, 0.04406323283910751, 0.14687567949295044, 0.07998795062303543, 0.11360671371221542, 0.2849363386631012, -0.09197647124528885, 0.016657205298542976, -0.04230864346027374, -0.01424806285649538, -0.06908884644508362, 0.045314885675907135, 0.08216670155525208, -0.09241747111082077, -0.022950593382120132, 0.08125471323728561, -0.29741767048835754, 0.10791494697332382, -0.15600289404392242, -0.14948409795761108, -0.05027429759502411, -0.008771711029112339, 0.014683255925774574, 0.019041186198592186, 0.09663030505180359, 0.025651484727859497, -0.07275258749723434, 0.07816889137029648, 0.024486342445015907, -0.23020237684249878, -0.01345184724777937, 0.1456068754196167, -0.06789913028478622, -0.025938833132386208, -0.021313713863492012, 0.051610056310892105, 0.05763651058077812, 0.09027529507875443, -0.03809558227658272, -0.0746568813920021, -0.007141788024455309, -0.022818787023425102, 0.01914946548640728, 0.0597183033823967, 0.06841408461332321, -0.0920223817229271, 0.1167774423956871, -0.07350476831197739, 0.0650370642542839, 0.037623800337314606, -0.022277191281318665, 0.0018526542698964477, 0.013183658011257648, -0.06512464582920074, 0.05533479526638985, 0.1295643299818039, -0.025459708645939827, -0.002524374984204769, -0.028180841356515884, -0.0767761766910553, -0.024015206843614578, -0.04643676429986954, -0.09101243317127228, -0.18130090832710266, -0.12738600373268127, 0.041754670441150665, -0.03240608796477318, -0.2046082615852356, 0.0060346988029778, -0.1128578633069992, 0.03700976446270943, -0.14154092967510223, 0.10004086047410965, 0.07216610759496689, 0.004716616589576006, 0.006774604320526123, 0.0675399899482727, 0.045677728950977325, 0.14796748757362366, -0.16543124616146088, -0.04919974133372307 ]
null
null
transformers
# DialoGPT-medium-TWEWYJoshua

Another not-so-good AI chatbot. Joshua from the game TWEWY (The World Ends With You).

* Credits to Lynn's Devlab who made the amazing tutorial.
{"tags": ["conversational"]}
text-generation
Ninja5000/DialoGPT-medium-TWEWYJoshua
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
# DialoGPT-medium-TWEWYJoshua

Another not-so-good AI chatbot. Joshua from the game TWEWY (The World Ends With You).

* Credits to Lynn's Devlab who made the amazing tutorial.
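The rendered card above names a conversational DialoGPT checkpoint but contains no usage snippet. A minimal sketch of how such a checkpoint is typically queried with transformers follows; the model id is taken from this record's id field, while the prompt text and generation settings are illustrative assumptions rather than anything stated in the original card.

```python
# Illustrative sketch only: not part of the original model card.
# Assumes a standard DialoGPT-style checkpoint usable via transformers;
# the prompt and generation settings below are arbitrary examples.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Ninja5000/DialoGPT-medium-TWEWYJoshua"  # id field of this record
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Encode a single user turn, terminated with EOS as DialoGPT expects.
input_ids = tokenizer.encode("Hey Joshua, ready?" + tokenizer.eos_token,
                             return_tensors="pt")
reply_ids = model.generate(input_ids, max_length=200,
                           pad_token_id=tokenizer.eos_token_id)
# Decode only the newly generated tokens, i.e. the model's reply.
print(tokenizer.decode(reply_ids[0, input_ids.shape[-1]:], skip_special_tokens=True))
```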
[ "# DialoGPT-medium-TWEWYJoshua\n\nAnother not-so-good AI chatbot. Joshua from the game TWEWY(The World Ends With You).\n\n* Credits to Lynn's Devlab who made the amazing tutorial." ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n", "# DialoGPT-medium-TWEWYJoshua\n\nAnother not-so-good AI chatbot. Joshua from the game TWEWY(The World Ends With You).\n\n* Credits to Lynn's Devlab who made the amazing tutorial." ]
[ 55, 56 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# DialoGPT-medium-TWEWYJoshua\n\nAnother not-so-good AI chatbot. Joshua from the game TWEWY(The World Ends With You).\n\n* Credits to Lynn's Devlab who made the amazing tutorial." ]
[ 0.002207736950367689, 0.0789964571595192, -0.0030414159409701824, 0.0679536685347557, 0.1326257884502411, 0.006409132853150368, 0.14200977981090546, 0.09946509450674057, 0.025142371654510498, -0.01302186306566, 0.1159868985414505, 0.1588083654642105, 0.010862999595701694, 0.14931990206241608, 0.006111084017902613, -0.26275700330734253, 0.028973346576094627, 0.07726865261793137, 0.0016700135311111808, 0.10506896674633026, 0.06581439822912216, -0.06305268406867981, 0.09770026057958603, -0.013720039278268814, -0.13077709078788757, -0.012475100345909595, 0.05228385329246521, -0.04567888379096985, 0.12092112749814987, 0.04732751101255417, 0.03998740762472153, 0.004805384669452906, -0.10260997712612152, -0.04383494704961777, 0.07300493866205215, 0.01813153363764286, -0.025041742250323296, 0.022015752270817757, -0.05417611077427864, 0.03700333088636398, 0.1989631950855255, 0.09651759266853333, -0.05985282361507416, 0.08531377464532852, -0.11462089419364929, 0.0009644352830946445, 0.012168151326477528, 0.03778625652194023, 0.04510601609945297, 0.07819874584674835, -0.015779662877321243, 0.3000876307487488, 0.022201484069228172, 0.10724444687366486, 0.1761249601840973, -0.354708194732666, -0.08312661200761795, 0.07795753329992294, 0.11098731309175491, 0.028090007603168488, -0.02993728034198284, 0.09256444126367569, 0.04093925654888153, 0.016270505264401436, -0.10381101816892624, -0.0709008127450943, -0.07959769666194916, 0.00467382138594985, -0.050442174077034, 0.04948682337999344, 0.2877310514450073, -0.00607520854100585, 0.027370963245630264, 0.006528133060783148, -0.10930954664945602, -0.04728635400533676, 0.03906983882188797, -0.12012872844934464, -0.08169586956501007, 0.018329676240682602, -0.10485387593507767, -0.08362362533807755, -0.12709510326385498, -0.018889866769313812, -0.19114142656326294, 0.23692473769187927, 0.03567498177289963, 0.02153073623776436, -0.2008974701166153, 0.10891274362802505, 0.045708976686000824, -0.08736763149499893, -0.01993611641228199, -0.07620333135128021, -0.016238059848546982, -0.0037068743258714676, -0.030522627755999565, -0.044162239879369736, 0.06353925168514252, 0.04311313480138779, 0.005134113132953644, 0.003604084486141801, 0.002290270058438182, 0.027856215834617615, 0.053919535130262375, 0.02703697793185711, -0.024801457300782204, -0.0854058787226677, 0.07767332345247269, -0.08841200172901154, 0.07012947648763657, -0.09943747520446777, -0.13831160962581635, -0.02091754786670208, 0.012886524200439453, 0.008953290060162544, 0.0863817110657692, 0.1447381228208542, -0.0556086003780365, -0.055783409625291824, 0.07890188694000244, 0.001437353785149753, -0.01505215559154749, 0.016655191779136658, -0.04865867644548416, 0.010274031199514866, -0.011515897698700428, 0.027820708230137825, -0.0861133560538292, -0.08302292227745056, -0.055010028183460236, -0.047765232622623444, -0.05674193799495697, -0.023673798888921738, 0.05460334196686745, 0.09164780378341675, 0.029123878106474876, -0.2159987837076187, -0.06193029507994652, 0.059353459626436234, -0.038278210908174515, -0.025191834196448326, -0.13434086740016937, -0.07518161833286285, -0.01989085227251053, 0.030474357306957245, -0.05272475257515907, -0.0454549640417099, -0.003512000199407339, 0.10885142534971237, -0.0030742438975721598, 0.1129336729645729, -0.21153652667999268, 0.0420357771217823, -0.038307856768369675, -0.06979212909936905, -0.09513171017169952, 0.0344734862446785, 0.012709559872746468, 0.06191449612379074, 0.008630833588540554, 0.030572660267353058, -0.059140220284461975, 0.05048780143260956, 
-0.03298192471265793, 0.17804588377475739, -0.1512436866760254, -0.07086358219385147, 0.13828428089618683, -0.062217406928539276, -0.2009555995464325, 0.15339729189872742, -0.017525408416986465, 0.11535459756851196, 0.12852942943572998, 0.221517875790596, 0.05051400884985924, -0.029539527371525764, 0.009043235331773758, 0.027341211214661598, -0.11431549489498138, -0.000014734332580701448, 0.04622383788228035, 0.12216227501630783, -0.08932224661111832, -0.007600517012178898, 0.020220763981342316, 0.11076144129037857, -0.0314912348985672, -0.03871763125061989, 0.016165560111403465, -0.07612882554531097, 0.07153742760419846, 0.046328961849212646, 0.09842779487371445, -0.07898161560297012, -0.018313448876142502, -0.05841619893908501, 0.044408757239580154, 0.01580377295613289, 0.053130194544792175, -0.14915622770786285, 0.06054970622062683, 0.12277017533779144, 0.05682406574487686, -0.07324566692113876, -0.10974130034446716, -0.021200714632868767, 0.230881467461586, 0.11374954879283905, 0.1605219691991806, 0.06852122396230698, -0.03732485696673393, -0.0020201164297759533, 0.011655050329864025, 0.11166335642337799, -0.01893545687198639, -0.03127772733569145, -0.08108532428741455, 0.11074697226285934, -0.041325345635414124, -0.047151874750852585, -0.053052693605422974, -0.0003049102670047432, 0.008365298621356487, 0.1117800772190094, 0.017713740468025208, 0.00871287751942873, 0.022883053869009018, 0.014139493927359581, -0.09530874341726303, -0.00897915381938219, 0.0672067403793335, 0.0007252737414091825, -0.16902895271778107, 0.2395753711462021, -0.1282302737236023, 0.15859389305114746, 0.17858171463012695, -0.20501476526260376, -0.00599473062902689, 0.09830114245414734, 0.028889326378703117, -0.020782364532351494, 0.10159803181886673, -0.041758086532354355, 0.23226895928382874, -0.057310134172439575, 0.12831319868564606, -0.001127583091147244, 0.013838326558470726, -0.045428670942783356, -0.014560333453118801, -0.02056051418185234, 0.10872086137533188, 0.010860283859074116, -0.16336221992969513, 0.09687864780426025, 0.11377803981304169, 0.05511757358908653, 0.22936232388019562, 0.04048991948366165, 0.06148212030529976, -0.01554172020405531, -0.00790418591350317, -0.03679465502500534, -0.036095134913921356, -0.2521402835845947, -0.013697914779186249, 0.035713519901037216, -0.0065672509372234344, 0.005179538857191801, -0.07157830893993378, -0.08498965203762054, -0.005426861811429262, 0.010094338096678257, -0.04722794517874718, 0.1893351674079895, -0.04184089973568916, 0.12147772312164307, -0.005230986513197422, -0.02654716745018959, 0.028759896755218506, 0.004108200781047344, -0.04541413113474846, 0.08643604069948196, -0.10921508818864822, -0.3403927683830261, -0.028887899592518806, -0.09018754214048386, -0.029681697487831116, 0.054021354764699936, 0.1008150577545166, -0.11370263993740082, 0.02271670289337635, 0.03685261309146881, 0.11994114518165588, -0.09533281624317169, -0.01904524303972721, 0.04063994064927101, -0.053616590797901154, -0.07817777246236801, -0.09354571253061295, -0.013264070264995098, 0.0013932414585724473, -0.1215537041425705, 0.15507884323596954, -0.10176108777523041, 0.09070515632629395, 0.13622784614562988, 0.02679784595966339, 0.05749542638659477, -0.04177665337920189, 0.18175344169139862, -0.13956518471240997, 0.053537920117378235, 0.1648237407207489, -0.019847560673952103, 0.02039840817451477, 0.04549184441566467, -0.017796019092202187, -0.04641516134142876, 0.07860098779201508, -0.03517431020736694, -0.07682797312736511, -0.14816808700561523, -0.07078801840543747, 
-0.0887388363480568, 0.15118815004825592, -0.03940984606742859, 0.04314834624528885, 0.061606112867593765, 0.04344693198800087, -0.030453311279416084, -0.0020598906558007, -0.039110586047172546, 0.06084797903895378, 0.07747354358434677, -0.08810658007860184, 0.09674300253391266, -0.03597182035446167, -0.10764260590076447, 0.12226047366857529, 0.04422345012426376, -0.027093825861811638, 0.002417242154479027, 0.10643760859966278, 0.029018716886639595, 0.07616026699542999, 0.05244239419698715, 0.029592249542474747, 0.04451550170779228, -0.06500016897916794, -0.04680994153022766, -0.020073188468813896, -0.057356271892786026, 0.03378008306026459, 0.05794652923941612, -0.0806136354804039, -0.03537192568182945, 0.09970394521951675, 0.06221223250031471, 0.05688183754682541, 0.030703522264957428, -0.09427328407764435, -0.05802707001566887, 0.056410398334264755, -0.08179721981287003, -0.08713854849338531, 0.11307535320520401, 0.15974724292755127, -0.08296442776918411, 0.059278834611177444, 0.05469578504562378, 0.08706414699554443, -0.05138558894395828, 0.07481251657009125, -0.12352693825960159, -0.11497660726308823, 0.01892244443297386, 0.06778441369533539, -0.3199770152568817, 0.0620647594332695, -0.053625572472810745, -0.03913377225399017, -0.05788364261388779, -0.09463808685541153, 0.041857749223709106, 0.09933498501777649, 0.0706091970205307, 0.02636694349348545, 0.02209516428411007, 0.012955011799931526, -0.05653991550207138, 0.029840223491191864, 0.05276733636856079, -0.03656927868723869, -0.02337549813091755, -0.029961904510855675, 0.010350096970796585, 0.0026777670718729496, 0.08294038474559784, -0.00120842014439404, -0.07418705523014069, 0.0503622442483902, 0.19049172103405, 0.08058000355958939, 0.03356209769845009, -0.051300209015607834, -0.04701041057705879, 0.15896622836589813, 0.04835819825530052, -0.011176316998898983, -0.0399470329284668, 0.004389147739857435, -0.03536084666848183, -0.10435684025287628, -0.04309287294745445, -0.0355922169983387, 0.0034661837853491306, -0.0503900945186615, -0.13011965155601501, 0.08855132758617401, -0.05176053196191788, -0.05478560924530029, -0.03769198805093765, 0.10282467305660248, 0.034460294991731644, 0.0694991797208786, 0.021675439551472664, -0.057811394333839417, -0.1555902361869812, -0.05816853418946266, 0.02269633114337921, -0.00035366506199352443, -0.01805383898317814, -0.054809052497148514, 0.02032865583896637, -0.04409993812441826, -0.10654802620410919, -0.05620428919792175, 0.29258689284324646, 0.05400291085243225, -0.08735466748476028, 0.09526897221803665, 0.004174604546278715, 0.018478786572813988, -0.28822511434555054, -0.1368277370929718, -0.06643706560134888, -0.019351761788129807, -0.026984404772520065, -0.1097368374466896, 0.06612124294042587, -0.12604166567325592, 0.009727725759148598, 0.12765707075595856, -0.1734681874513626, -0.12101505696773529, 0.08872053772211075, 0.0022061497438699007, 0.4536094069480896, -0.12044578045606613, 0.013263363391160965, 0.010668543167412281, -0.1692311018705368, 0.16608096659183502, -0.08098506927490234, 0.11520058661699295, 0.002424512291327119, 0.20683987438678741, 0.04701314494013786, 0.008138801902532578, 0.0369788222014904, -0.04815589636564255, -0.05809483677148819, -0.13926537334918976, -0.18555592000484467, -0.029060695320367813, -0.0050759995356202126, -0.00010425033542560413, -0.07707977294921875, -0.006019661668688059, -0.08697008341550827, 0.00986500084400177, -0.099365234375, 0.04253533482551575, -0.006707207765430212, -0.04041651263833046, -0.05933814123272896, 0.03074077144265175, 
-0.020771432667970657, 0.04484676569700241, 0.1648314893245697, -0.05737004801630974, 0.09495603293180466, 0.00024081235460471362, 0.1310572773218155, -0.10257618874311447, 0.08549212664365768, -0.043690647929906845, -0.07755036652088165, 0.11719118058681488, -0.13981135189533234, 0.03501289710402489, 0.08407179266214371, -0.026441175490617752, 0.023819370195269585, 0.04521746560931206, -0.04969252273440361, 0.09421778470277786, 0.1566910743713379, -0.24289512634277344, -0.22856667637825012, -0.09233511239290237, 0.07903070002794266, 0.15530791878700256, 0.10707933455705643, 0.14547008275985718, -0.03252987191081047, -0.0429958701133728, 0.015395406633615494, -0.0018612660933285952, -0.04695502668619156, -0.0016307785408571362, -0.05244097858667374, 0.0075201005674898624, -0.13321486115455627, 0.022208917886018753, 0.020731253549456596, -0.07025925815105438, 0.07550373673439026, 0.12607994675636292, -0.0893879383802414, -0.14558303356170654, -0.08268889784812927, 0.034997086971998215, -0.08586706221103668, -0.014746629633009434, -0.03393968567252159, -0.058580994606018066, 0.012256302870810032, 0.1105717346072197, 0.045682113617658615, 0.05568868666887283, -0.019191710278391838, -0.02278122305870056, 0.03203219175338745, 0.0016409065574407578, 0.04598775506019592, 0.00353154051117599, -0.15487675368785858, 0.1566110998392105, 0.0016548933926969767, 0.16077323257923126, -0.07604023814201355, -0.07241152971982956, -0.14020097255706787, 0.008532851934432983, -0.006426975131034851, -0.1045491024851799, -0.09131782501935959, -0.03811577335000038, 0.022971848025918007, -0.11984534561634064, -0.06970714032649994, -0.0009567891829647124, -0.06557221710681915, 0.013012983836233616, 0.015496942214667797, -0.011240162886679173, -0.03644206374883652, 0.01045910269021988, 0.09515605121850967, -0.009716601110994816, 0.15032319724559784, 0.062340378761291504, -0.09876681864261627, 0.08445242047309875, -0.08246710151433945, -0.025659875944256783, 0.05307759717106819, 0.039648473262786865, 0.06493404507637024, 0.08259382098913193, -0.029003305360674858, 0.02576187439262867, 0.11910976469516754, 0.02076011523604393, 0.18238291144371033, -0.03560766577720642, 0.010580589063465595, 0.018117276951670647, -0.13311626017093658, -0.06613826751708984, -0.02220909856259823, 0.02089201845228672, -0.001534075359813869, 0.11068029701709747, -0.025114264339208603, 0.04627314209938049, -0.0213021170347929, 0.051668692380189896, 0.036991674453020096, -0.1538763791322708, -0.0010966956615447998, -0.09271261841058731, 0.019529905170202255, -0.03609172999858856, -0.023224439471960068, -0.06706619262695312, -0.11646687984466553, -0.0017307011876255274, -0.0169101320207119, -0.059231050312519073, -0.04027374088764191, 0.1248234212398529, 0.002148505998775363, -0.053936030715703964, -0.09719959646463394, 0.07862788438796997, 0.03645004332065582, 0.1154157817363739, 0.15374593436717987, 0.01077390555292368, -0.011934418231248856, 0.06646357476711273, -0.02992219664156437, 0.062424398958683014, -0.169714093208313, -0.23739783465862274, -0.13383959233760834, 0.036517225205898285, -0.0452439971268177, 0.028212059289216995, 0.12468799948692322, 0.0054696714505553246, -0.020841969177126884, -0.09866419434547424, -0.0064833564683794975, -0.11283299326896667, -0.10951483249664307, -0.07307653874158859, -0.13151410222053528, 0.017696626484394073, -0.08555935323238373, 0.04006896913051605, -0.10810714215040207, 0.06099651753902435, -0.1403352916240692, 0.16521142423152924, 0.01211363822221756, -0.06865936517715454, 0.06902800500392914, 
-0.010844538919627666, -0.012872658669948578, 0.03553897887468338, 0.01780058443546295, -0.107820063829422, 0.06028638035058975, -0.003780600382015109, 0.027293384075164795, -0.0880291685461998, 0.0288144052028656, -0.11137834936380386, -0.0817507654428482, -0.06664827466011047, 0.04719794914126396, 0.0164024755358696, 0.16545192897319794, 0.02900465950369835, -0.015107986517250538, -0.0006530265673063695, 0.216374471783638, -0.010459459386765957, -0.04523992910981178, -0.08217813074588776, 0.16931983828544617, -0.03377765417098999, -0.027638671919703484, -0.08253655582666397, -0.004527530632913113, -0.1134181097149849, 0.3261765241622925, 0.24927327036857605, -0.11922802031040192, 0.010551334358751774, -0.005757180042564869, 0.06504269689321518, -0.0105343172326684, 0.17422547936439514, 0.11199089884757996, 0.19670121371746063, -0.050802476704120636, -0.09219180047512054, -0.052453648298978806, 0.04708574712276459, -0.029842950403690338, 0.031709421426057816, 0.05830474942922592, -0.029490917921066284, -0.0460219644010067, 0.014646134339272976, -0.21032936871051788, 0.033690549433231354, -0.14806000888347626, -0.1811237335205078, -0.018175266683101654, 0.00377926928922534, 0.13716311752796173, 0.024613922461867332, 0.11657852679491043, -0.0015568208182230592, 0.03367993235588074, 0.018510546535253525, 0.010174763388931751, -0.18139159679412842, 0.1090174987912178, 0.11873932182788849, -0.20232929289340973, 0.05333822965621948, -0.05762874335050583, 0.004080825485289097, 0.06513627618551254, 0.03553216531872749, -0.029389314353466034, 0.07810042053461075, -0.008983265608549118, -0.05393959954380989, -0.03334411606192589, 0.08691930025815964, 0.0022681127302348614, 0.013453583233058453, 0.11937843263149261, -0.047175392508506775, 0.010566199198365211, 0.05166583135724068, 0.03415931016206741, -0.06974905729293823, 0.1337386667728424, -0.07566895335912704, 0.09488079696893692, 0.12312189489603043, -0.06375135481357574, -0.02290281280875206, -0.027390075847506523, -0.014738031663000584, -0.028880104422569275, -0.003275749972090125, -0.11209383606910706, -0.1597621738910675, -0.09088604897260666, -0.020819082856178284, 0.07432695478200912, -0.20251113176345825, 0.0176373478025198, -0.14382144808769226, -0.021331103518605232, -0.05827919393777847, 0.11926233768463135, 0.11785708367824554, -0.01850280724465847, 0.03631451353430748, 0.002283829264342785, 0.044723570346832275, 0.06388480216264725, -0.07238158583641052, -0.11158613115549088 ]
null
null
transformers
# LOTR DialoGPT Model
{"tags": ["conversational"]}
text-generation
Niphredil/DialoGPT-small-lotr
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# LOTR DialoGPT Model
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.009697278961539268, 0.03208012506365776, -0.007204889785498381, 0.004809224978089333, 0.16726240515708923, 0.014898733235895634, 0.09765533357858658, 0.13672804832458496, -0.007841327227652073, -0.031050153076648712, 0.14490588009357452, 0.20411323010921478, -0.006439372431486845, 0.0661218985915184, -0.07572533935308456, -0.2683109939098358, 0.05759621039032936, 0.046649303287267685, 0.016515716910362244, 0.1200079694390297, 0.08573378622531891, -0.05473608896136284, 0.08714032918214798, -0.014583407901227474, -0.150366872549057, 0.017733458429574966, 0.043394338339567184, -0.12260226160287857, 0.11910516023635864, 0.05462685227394104, 0.07063519209623337, 0.014929565601050854, -0.07541623711585999, -0.1631229966878891, 0.03031250834465027, 0.01425902172923088, -0.0594632662832737, 0.04757995903491974, 0.059961482882499695, -0.10165371745824814, 0.10819483548402786, 0.09530027210712433, -0.013078106567263603, 0.06798283755779266, -0.16849711537361145, -0.020869607105851173, -0.01446688175201416, 0.009899779222905636, 0.05550243332982063, 0.09964893013238907, -0.03413357585668564, 0.10497362166643143, -0.09214533120393753, 0.11017382889986038, 0.10932035744190216, -0.32057443261146545, -0.005767723545432091, 0.09167823940515518, 0.039358653128147125, 0.07352814823389053, -0.04467793554067612, 0.06258884817361832, 0.018015462905168533, 0.017986174672842026, -0.014015024527907372, -0.07283061742782593, -0.11612214148044586, 0.04717336222529411, -0.08668071031570435, -0.059868961572647095, 0.2244078367948532, -0.05464440956711769, 0.06881742179393768, -0.05281897634267807, -0.10522868484258652, -0.04308144748210907, -0.029833965003490448, 0.00475557055324316, -0.07660607248544693, 0.08692064881324768, 0.00869679357856512, -0.09547875821590424, -0.1376667022705078, -0.02496783249080181, -0.1776352822780609, 0.16140350699424744, 0.02465328387916088, 0.05232657864689827, -0.2027255892753601, 0.09623090922832489, 0.017906051129102707, -0.08045592904090881, 0.022091427817940712, -0.10046248883008957, 0.029131146147847176, 0.013760408386588097, -0.04754498973488808, -0.061387211084365845, 0.0843690037727356, 0.11199145019054413, -0.01731434464454651, 0.025486016646027565, -0.039331406354904175, 0.08100687712430954, 0.03553595021367073, 0.09077847748994827, 0.007288969587534666, -0.028338588774204254, 0.025842782109975815, -0.13719046115875244, -0.003647835226729512, -0.07116208970546722, -0.16572439670562744, -0.021088803187012672, 0.02994808368384838, 0.08289173990488052, 0.015449047088623047, 0.11682453751564026, -0.03272046521306038, -0.025152435526251793, 0.03602350503206253, -0.047656361013650894, -0.012649794109165668, 0.016648368909955025, 0.013163427822291851, 0.12399329990148544, -0.0022096503525972366, 0.03235051408410072, -0.13653022050857544, 0.031423524022102356, -0.06793295592069626, -0.003740974934771657, -0.03486552834510803, -0.040637075901031494, 0.009043924510478973, -0.06862333416938782, 0.003486064961180091, -0.15030112862586975, -0.15063877403736115, 0.007587034720927477, -0.007836631499230862, -0.04107699543237686, -0.06370922178030014, -0.06952770054340363, -0.013550350442528725, 0.04251532256603241, -0.07093454152345657, -0.011352915316820145, -0.06403283774852753, 0.11004766076803207, -0.03197755664587021, 0.07921615242958069, -0.11953279376029968, 0.08390819281339645, -0.11260783672332764, -0.02386913076043129, -0.060801517218351364, 0.09317506104707718, -0.0006014376995153725, 0.09549830108880997, -0.006563255097717047, -0.017931854352355003, -0.07981178909540176, 
0.06445012241601944, -0.042872510850429535, 0.21701598167419434, -0.0615808479487896, -0.11181682348251343, 0.28781595826148987, -0.052628401666879654, -0.1370542049407959, 0.11647392809391022, 0.008682746440172195, 0.05777018144726753, 0.10703510791063309, 0.19733482599258423, -0.015276194550096989, 0.004040541127324104, 0.09471915662288666, 0.11263324320316315, -0.11276852339506149, -0.033160366117954254, 0.013019153848290443, -0.04081077128648758, -0.10867965966463089, 0.04689536616206169, 0.09810488671064377, 0.07090286910533905, -0.04786505550146103, -0.03377414867281914, -0.01366397924721241, 0.0052589005790650845, 0.08885077387094498, -0.007157256826758385, 0.10962837189435959, -0.05819983780384064, -0.03796621412038803, -0.029282379895448685, -0.012126247398555279, -0.03951939567923546, 0.03137664496898651, -0.043376367539167404, 0.10821941494941711, -0.011204327456653118, 0.06364280730485916, -0.16185984015464783, -0.07691477984189987, -0.017002692446112633, 0.1581239402294159, 0.024538565427064896, 0.09859629720449448, 0.0552486926317215, -0.040398042649030685, -0.0012767292791977525, 0.012792680412530899, 0.15581141412258148, -0.022091681137681007, -0.065607450902462, -0.052166227251291275, 0.08642971515655518, -0.05641226842999458, 0.04504093527793884, -0.05937713757157326, 0.012367865070700645, 0.05064384639263153, 0.10342344641685486, -0.00018274025933351368, 0.03323284164071083, -0.008164864964783192, 0.002145637758076191, -0.058205123990774155, 0.007405933458358049, 0.10799351334571838, 0.00036868182360194623, -0.07365862280130386, 0.22074243426322937, -0.17796069383621216, 0.1765957772731781, 0.1893044263124466, -0.299345999956131, 0.017949223518371582, -0.10759581625461578, -0.04561871662735939, 0.014407722279429436, 0.05567655712366104, -0.0454222597181797, 0.1703362911939621, -0.009871348738670349, 0.18874616920948029, -0.04946064203977585, -0.04464937001466751, -0.0200483538210392, -0.05118836089968681, -0.0024189651012420654, 0.07781197130680084, 0.10685696452856064, -0.13992026448249817, 0.1964332014322281, 0.1621224284172058, 0.048237916082143784, 0.19945049285888672, 0.015346456319093704, -0.011589210480451584, 0.0909530371427536, 0.005220826715230942, -0.058739423751831055, -0.07409929484128952, -0.2594851851463318, -0.030033592134714127, 0.07992640137672424, 0.0422382652759552, 0.1212305948138237, -0.11349532753229141, -0.038956157863140106, -0.01763172075152397, -0.023146281018853188, 0.021672505885362625, 0.0914369598031044, 0.06075398623943329, 0.13201528787612915, -0.001710098935291171, -0.007300339173525572, 0.10524573177099228, 0.01783694699406624, -0.09354141354560852, 0.18308524787425995, -0.13652534782886505, -0.37097251415252686, -0.13911493122577667, -0.18057456612586975, -0.05449081212282181, 0.05712554603815079, 0.11679314076900482, -0.12011238187551498, -0.018752124160528183, 0.01578843593597412, 0.10931742936372757, -0.08449502289295197, 0.0021454424131661654, -0.06880278885364532, 0.0321490578353405, -0.10310184955596924, -0.09194442629814148, -0.055416494607925415, -0.031392451375722885, -0.08001253753900528, 0.1423761546611786, -0.10777941346168518, 0.04476889222860336, 0.20262959599494934, 0.04653622955083847, 0.05625178664922714, -0.044105201959609985, 0.19377262890338898, -0.11264272034168243, -0.01661740615963936, 0.19215328991413116, -0.048360925167798996, 0.07476246356964111, 0.1232115849852562, -0.006348740309476852, -0.08765771239995956, 0.03011748194694519, -0.02085109055042267, -0.07988511025905609, -0.23219464719295502, 
-0.13938382267951965, -0.12429051846265793, 0.09477275609970093, 0.028005298227071762, 0.056365787982940674, 0.17219258844852448, 0.06577219814062119, -0.038416244089603424, 0.006410336587578058, 0.02959546446800232, 0.08237514644861221, 0.23417828977108002, -0.06035616248846054, 0.1364797055721283, -0.03420931473374367, -0.14982740581035614, 0.08169995993375778, 0.0713929831981659, 0.10213395953178406, 0.06678459793329239, 0.0804823637008667, 0.0149586396291852, 0.06188136339187622, 0.1311223804950714, 0.08191446959972382, 0.019586285576224327, -0.02480296604335308, -0.03388110175728798, -0.025523077696561813, -0.05937909707427025, 0.040128443390131, 0.06589099019765854, -0.16763372719287872, -0.039227183908224106, -0.09338314831256866, 0.09657008945941925, 0.0873042419552803, 0.06609832495450974, -0.1842060089111328, -0.008006223477423191, 0.08488986641168594, -0.03854905813932419, -0.13727426528930664, 0.09535189718008041, 0.01523482333868742, -0.15144726634025574, 0.03139317408204079, -0.04061909019947052, 0.12188644707202911, -0.07804752141237259, 0.09809603542089462, -0.08108244836330414, -0.07448557764291763, 0.02123199962079525, 0.1261177361011505, -0.30527687072753906, 0.20240111649036407, -0.0024993624538183212, -0.06486981362104416, -0.1243603527545929, -0.0032166161108762026, 0.002410882618278265, 0.07357452809810638, 0.10519039630889893, -0.007196315098553896, 0.001897757756523788, -0.06300821900367737, -0.01829923689365387, 0.032471053302288055, 0.13080233335494995, -0.0401318334043026, -0.021158374845981598, -0.050194524228572845, -0.001653497340157628, -0.03173094615340233, -0.06934895366430283, 0.02002747356891632, -0.19509181380271912, 0.08751901984214783, 0.04166261479258537, 0.09648149460554123, 0.029994789510965347, 0.004265148192644119, -0.09651939570903778, 0.24698667228221893, -0.07148019969463348, -0.10072879493236542, -0.10919588059186935, -0.046813901513814926, 0.03569883480668068, -0.05628936365246773, 0.04309194162487984, -0.0788632407784462, 0.028997479006648064, -0.06352769583463669, -0.19235502183437347, 0.12410202622413635, -0.09027006477117538, -0.04412810131907463, -0.02371402643620968, 0.2110891044139862, -0.05598580464720726, 0.010335659608244896, 0.02930437959730625, 0.01208863127976656, -0.11645778268575668, -0.09678568691015244, 0.031018631532788277, -0.007351789623498917, 0.050603240728378296, 0.041841957718133926, -0.05915454775094986, -0.017138581722974777, -0.052199993282556534, -0.022926922887563705, 0.3496883809566498, 0.14231905341148376, -0.043836336582899094, 0.19347235560417175, 0.12347975373268127, -0.07452994585037231, -0.3159443140029907, -0.1066238060593605, -0.10937739163637161, -0.04680149629712105, -0.07012093812227249, -0.2002030611038208, 0.06474938243627548, 0.00662544509395957, -0.013415241613984108, 0.12749312818050385, -0.2561831772327423, -0.07571036368608475, 0.15906259417533875, -0.017980827018618584, 0.3745945692062378, -0.1168576180934906, -0.10926306992769241, -0.03950892388820648, -0.14175476133823395, 0.16968177258968353, -0.01989765651524067, 0.11221715062856674, -0.009765521623194218, 0.14388824999332428, 0.05548359826207161, -0.023479344323277473, 0.08544106781482697, 0.004999885335564613, -0.03290518373250961, -0.10304180532693863, -0.05676887184381485, 0.007092386484146118, 0.02477436140179634, 0.018026655539870262, -0.041834570467472076, 0.02227151393890381, -0.11731979995965958, -0.04657655209302902, -0.08982590585947037, 0.04431166127324104, 0.03899754583835602, -0.07325074821710587, -0.002380647463724017, 
-0.07165111601352692, -0.012272949330508709, 0.022334342822432518, 0.20356793701648712, -0.08029330521821976, 0.16448934376239777, 0.09239562600851059, 0.12419285625219345, -0.14376309514045715, -0.00019283240544609725, -0.0762530043721199, -0.05611240118741989, 0.07737895101308823, -0.09433035552501678, 0.058893077075481415, 0.10901971161365509, -0.04567738622426987, 0.08828683942556381, 0.10377411544322968, 0.008936077356338501, 0.003213887568563223, 0.10916902124881744, -0.2667325437068939, -0.0296600554138422, -0.07532413303852081, 0.000883326749317348, 0.09092561900615692, 0.08562852442264557, 0.18840822577476501, 0.025361526757478714, -0.04293036088347435, -0.002770674182102084, 0.028597986325621605, -0.039021048694849014, 0.051667019724845886, 0.001123449532315135, 0.01947369985282421, -0.1530752182006836, 0.072522833943367, 0.01490565575659275, -0.15215420722961426, 0.021316176280379295, 0.16572684049606323, -0.11656328290700912, -0.1283872276544571, -0.06520111113786697, 0.08313824236392975, -0.11755692958831787, -0.01578943058848381, -0.03279297426342964, -0.13145680725574493, 0.07992171496152878, 0.12629036605358124, 0.05557859688997269, 0.0972496047616005, -0.06061713397502899, -0.020469192415475845, -0.018721895292401314, -0.014099318534135818, -0.012384648434817791, -0.007667020428925753, -0.055978111922740936, 0.0590752474963665, -0.026677248999476433, 0.1425808072090149, -0.09221141785383224, -0.1037059873342514, -0.16142144799232483, 0.0374140702188015, -0.11013076454401016, -0.08825794607400894, -0.08821134269237518, -0.050188567489385605, 0.002360827289521694, -0.019856395199894905, -0.04037635400891304, -0.05829505994915962, -0.12300454825162888, 0.0338277705013752, -0.040771447122097015, 0.024727050215005875, -0.07512269169092178, 0.015856385231018066, 0.08507686108350754, -0.03285100311040878, 0.15655414760112762, 0.1450488418340683, -0.1006515845656395, 0.10741901397705078, -0.14806775748729706, -0.09138492494821548, 0.11116421222686768, 0.015329592861235142, 0.0449691042304039, 0.09723787009716034, 0.013362943194806576, 0.0635865181684494, 0.032776717096567154, 0.05308786407113075, 0.027619892731308937, -0.11959987878799438, 0.06483134627342224, -0.03626115620136261, -0.14700546860694885, -0.049338050186634064, -0.05282869189977646, 0.01647452637553215, 0.013054544106125832, 0.09622690081596375, -0.05301849544048309, 0.10698331147432327, -0.04055701196193695, 0.0346808135509491, 0.017554637044668198, -0.1730053424835205, -0.03816922754049301, -0.08538098633289337, 0.03681723028421402, 0.014741539023816586, 0.25266793370246887, 0.030072299763560295, 0.012416383251547813, 0.032671261578798294, 0.08285367488861084, 0.03899408504366875, 0.010228337720036507, 0.17482228577136993, 0.1162426546216011, -0.06621865928173065, -0.10445023328065872, 0.0729617029428482, 0.016332454979419708, 0.01286179106682539, 0.13617953658103943, 0.008365051820874214, 0.005795429926365614, 0.08649782836437225, -0.016865963116288185, 0.009968153201043606, -0.10052056610584259, -0.13426925241947174, -0.022176474332809448, 0.05151832848787308, -0.04655967652797699, 0.11727844923734665, 0.1406494379043579, -0.01806013658642769, 0.03222079202532768, -0.021771740168333054, -0.05699979141354561, -0.1683429479598999, -0.1429590880870819, -0.06883849948644638, -0.13416796922683716, 0.00897989235818386, -0.11180389672517776, 0.05395037308335304, 0.06001098081469536, 0.06750501692295074, -0.06899319589138031, 0.10220931470394135, 0.04626858979463577, -0.11440542340278625, 0.06264589726924896, 
-0.0296088308095932, 0.09430401772260666, -0.02759445086121559, -0.019505485892295837, -0.09039592742919922, 0.014574515633285046, 0.011419114656746387, 0.06245238706469536, -0.04707273095846176, 0.007463190704584122, -0.14696238934993744, -0.08972041308879852, -0.0523175448179245, 0.0718572810292244, -0.050409089773893356, 0.14282815158367157, 0.00775480642914772, -0.0170906875282526, 0.039554283022880554, 0.22787313163280487, -0.07476283609867096, -0.04778539761900902, -0.05269690603017807, 0.20717895030975342, 0.02975541539490223, 0.1171872541308403, -0.022938819602131844, -0.006106364540755749, -0.0919521227478981, 0.3764844834804535, 0.30030161142349243, -0.09031439572572708, 0.011794124729931355, 0.02137952297925949, 0.04502861574292183, 0.1316293478012085, 0.1216534823179245, 0.10318691283464432, 0.3006802201271057, -0.07452366501092911, -0.04653361067175865, -0.012629742734134197, -0.023858042433857918, -0.09059546142816544, 0.1021224707365036, 0.04839762672781944, -0.06382183730602264, -0.03313443064689636, 0.0954432487487793, -0.25862133502960205, 0.1277991235256195, -0.12311873584985733, -0.17578600347042084, -0.06654827296733856, 0.009760108776390553, 0.10465722531080246, 0.015642458572983742, 0.0946015790104866, 0.007128213066607714, -0.11252258718013763, 0.06305865943431854, 0.03397420793771744, -0.22762253880500793, 0.0006893770187161863, 0.06642123311758041, -0.07006710022687912, -0.0024247700348496437, -0.026499588042497635, 0.05657242611050606, 0.0656052976846695, 0.054629553109407425, -0.00971333310008049, 0.03816632181406021, 0.0034184439573436975, -0.0585215799510479, 0.016623929142951965, 0.05121519789099693, 0.02472509816288948, -0.09763528406620026, 0.06927435845136642, -0.1574270874261856, 0.04766253009438515, -0.0030655991286039352, -0.04124255105853081, 0.006064958870410919, 0.008823691867291927, -0.06491616368293762, 0.05165379121899605, 0.07916834205389023, -0.0016257909592241049, -0.0062433634884655476, -0.057178743183612823, -0.02632102556526661, -0.027755750343203545, -0.09291748702526093, -0.10495562851428986, -0.14682936668395996, -0.11640441417694092, 0.09368976950645447, -0.01011267676949501, -0.1848134547472, 0.022154374048113823, -0.08606051653623581, 0.08319322764873505, -0.1670055389404297, 0.08040720224380493, 0.07041648775339127, 0.013038921169936657, -0.0031511052511632442, -0.02002427540719509, 0.054132770746946335, 0.086809903383255, -0.10407156497240067, -0.07400695979595184 ]
null
null
transformers
license: apache-2.0
---

### Rick DialoGPT Model
{"tags": ["conversational"]}
text-generation
Nisarg2701/DialoGPT-medium-Rick
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
license: apache-2.0
---

### Rick DialoGPT Model
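The card above stops at the heading, with no indication of how the conversational model is driven. Below is a hedged multi-turn sketch in the common DialoGPT style; the model id comes from this record's id field, and the example turns, max_length, and loop structure are assumptions for illustration, not content from the original card.

```python
# Illustrative sketch only: not part of the original model card.
# Assumes the common DialoGPT pattern of concatenating prior turns as context.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Nisarg2701/DialoGPT-medium-Rick"  # id field of this record
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

chat_history_ids = None
for user_text in ["Hi Rick!", "Where are we going today?"]:
    new_ids = tokenizer.encode(user_text + tokenizer.eos_token, return_tensors="pt")
    # Append the new user turn to the running conversation history.
    bot_input_ids = new_ids if chat_history_ids is None else torch.cat(
        [chat_history_ids, new_ids], dim=-1)
    chat_history_ids = model.generate(bot_input_ids, max_length=500,
                                      pad_token_id=tokenizer.eos_token_id)
    reply = tokenizer.decode(chat_history_ids[0, bot_input_ids.shape[-1]:],
                             skip_special_tokens=True)
    print("Bot:", reply)
```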
[ "### Rick DialoGPT Model" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### Rick DialoGPT Model" ]
[ 51, 8 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Rick DialoGPT Model" ]
[ -0.025413312017917633, 0.09312345832586288, -0.005162056535482407, 0.011607836931943893, 0.13549868762493134, 0.0018298749346286058, 0.1495768129825592, 0.130999356508255, -0.0077923801727592945, -0.044110145419836044, 0.1429838091135025, 0.20048430562019348, -0.0021355445496737957, 0.05890142545104027, -0.07545654475688934, -0.3306746184825897, 0.050926387310028076, 0.05926493555307388, -0.025559278205037117, 0.11864221096038818, 0.09784988313913345, -0.04036334529519081, 0.07717374712228775, 0.007345940917730331, -0.14124640822410583, 0.008731000125408173, 0.017771651968359947, -0.11458604037761688, 0.11309308558702469, 0.06654632836580276, 0.036507707089185715, 0.044759854674339294, -0.04057559743523598, -0.1401142030954361, 0.04153025522828102, -0.004023453686386347, -0.04013814404606819, 0.06077854335308075, 0.02377299778163433, -0.0881878137588501, 0.11851460486650467, 0.12788087129592896, -0.015752460807561874, 0.04034607484936714, -0.1585608571767807, -0.006652865093201399, -0.015511149540543556, 0.06772593408823013, 0.059873443096876144, 0.1097317710518837, -0.039601825177669525, 0.11911729723215103, -0.06207960844039917, 0.11206045746803284, 0.1205565482378006, -0.2934388220310211, -0.011758731678128242, 0.14702121913433075, 0.043554890900850296, 0.039237722754478455, -0.041472744196653366, 0.09781163930892944, 0.023659149184823036, -0.008307565934956074, -0.04633359983563423, -0.07571808993816376, -0.07137920707464218, 0.027764812111854553, -0.08807270973920822, -0.017448676750063896, 0.25613731145858765, -0.031170541420578957, 0.07836173474788666, -0.07065629214048386, -0.09033548831939697, -0.022775936871767044, -0.033078763633966446, -0.03552408143877983, -0.09944596886634827, 0.07922898977994919, -0.03075677528977394, -0.09429214149713516, -0.11695519834756851, -0.03076823800802231, -0.16565042734146118, 0.1787114441394806, 0.02623685635626316, 0.031433477997779846, -0.22814996540546417, 0.09947609156370163, -0.01850845292210579, -0.10185559839010239, 0.025555264204740524, -0.08776295930147171, 0.020035279914736748, 0.016504161059856415, -0.018119389191269875, -0.0011531691998243332, 0.080464206635952, 0.11158853024244308, 0.011199954897165298, 0.01640055887401104, -0.01607147790491581, 0.051932621747255325, 0.04229697957634926, 0.07549137622117996, -0.027696121484041214, -0.03071208856999874, 0.020599056035280228, -0.09879844635725021, -0.013612693175673485, -0.06524600833654404, -0.19932471215724945, -0.007673978805541992, 0.05888914689421654, 0.06404034793376923, 0.03664683178067207, 0.12803208827972412, 0.0020643568132072687, -0.04798604175448418, 0.036807067692279816, -0.016705121845006943, -0.019799217581748962, 0.011381683871150017, -0.000567600887734443, 0.14477424323558807, 0.009410920552909374, 0.049282293766736984, -0.11613954603672028, 0.01606225036084652, -0.047925882041454315, -0.020887719467282295, -0.036321528255939484, -0.05544900521636009, -0.010740289464592934, -0.02770245261490345, 0.022352509200572968, -0.13552983105182648, -0.16765061020851135, -0.009857002645730972, -0.007351231295615435, -0.04765838384628296, -0.114654541015625, -0.10575094074010849, -0.03285408020019531, 0.04166030138731003, -0.06368850916624069, -0.0024775846395641565, -0.0539090558886528, 0.09397117793560028, -0.03306370601058006, 0.07714997977018356, -0.09804625809192657, 0.08385122567415237, -0.07397409528493881, -0.03847332298755646, -0.08302593231201172, 0.12718430161476135, 0.009711487218737602, 0.05438073351979256, -0.03422717750072479, -0.023551540449261665, 
-0.09702390432357788, 0.07607568800449371, -0.04108187556266785, 0.23816026747226715, -0.09224523603916168, -0.10274914652109146, 0.27373141050338745, -0.05444721132516861, -0.14456064999103546, 0.10977078974246979, -0.01900210604071617, 0.11534583568572998, 0.11953683942556381, 0.19405582547187805, 0.06296877562999725, 0.001984680537134409, 0.07552992552518845, 0.11304278671741486, -0.07483243942260742, -0.018239116296172142, 0.014894796535372734, -0.02566966600716114, -0.0835549384355545, 0.021904105320572853, 0.07210532575845718, 0.051827460527420044, -0.05941735580563545, -0.017449067905545235, 0.004651714116334915, 0.004100979305803776, 0.05548004433512688, -0.02314450591802597, 0.12615177035331726, -0.0296939704567194, -0.06366891413927078, -0.028767621144652367, 0.028939921408891678, -0.057691968977451324, 0.0304481890052557, -0.07899361848831177, 0.043644018471241, -0.020748483017086983, 0.06715301424264908, -0.16692428290843964, -0.08818136155605316, -0.04975733906030655, 0.19024549424648285, 0.0651085153222084, 0.11861510574817657, 0.05720169469714165, -0.062263716012239456, -0.003021731274202466, 0.012878939509391785, 0.19674640893936157, -0.015059832483530045, -0.07794132083654404, -0.09883095324039459, 0.10147342085838318, -0.07179374247789383, 0.055116456001996994, -0.05588318035006523, 0.014525315724313259, 0.012787281535565853, 0.10331959277391434, -0.03575039654970169, 0.043479107320308685, 0.011354541406035423, -0.03467288240790367, -0.06714975088834763, -0.003501792438328266, 0.093502476811409, 0.0001633678184589371, -0.10478156059980392, 0.2363431751728058, -0.192146897315979, 0.12266795337200165, 0.17975205183029175, -0.20772798359394073, -0.0011036746436730027, -0.1226116195321083, -0.028113221749663353, 0.009094691835343838, 0.0379585437476635, -0.038009289652109146, 0.23699983954429626, -0.009830361232161522, 0.164081871509552, -0.038274720311164856, -0.04061569646000862, -0.038266636431217194, -0.053150493651628494, 0.007183412089943886, 0.11623285710811615, 0.1069716140627861, -0.17304037511348724, 0.18366195261478424, 0.06931430846452713, 0.05452869087457657, 0.16730473935604095, 0.018488436937332153, 0.014268893748521805, 0.06420514732599258, -0.0017100382829084992, -0.036925602704286575, -0.06880984455347061, -0.2122652679681778, -0.017870774492621422, 0.0786411389708519, 0.05038889870047569, 0.10847067087888718, -0.10926072299480438, -0.03676426783204079, -0.00701566506177187, -0.020849628373980522, 0.03178391978144646, 0.13479545712471008, 0.015766892582178116, 0.12959885597229004, -0.020557312294840813, -0.07056758552789688, 0.06748966872692108, 0.01528566237539053, -0.08785329014062881, 0.19718299806118011, -0.10789331048727036, -0.33008527755737305, -0.10673893243074417, -0.1917273849248886, -0.056284211575984955, 0.04729343578219414, 0.10903682559728622, -0.13956227898597717, -0.024092644453048706, 0.003875764086842537, 0.07156971842050552, -0.12339842319488525, 0.008168553933501244, -0.03937212750315666, -0.010348732583224773, -0.1377360075712204, -0.09894730895757675, -0.05475780740380287, -0.04504374414682388, -0.044258132576942444, 0.11822060495615005, -0.15374580025672913, 0.024178992956876755, 0.24039819836616516, 0.05719635263085365, 0.0687137171626091, -0.036739181727170944, 0.1779303103685379, -0.10556811839342117, 0.008601149544119835, 0.206468865275383, -0.03538570553064346, 0.06729882955551147, 0.12036361545324326, -0.011550151742994785, -0.07053536921739578, 0.03585716709494591, -0.012808484025299549, -0.07473361492156982, 
-0.22093303501605988, -0.11876584589481354, -0.10936980694532394, 0.06452792137861252, 0.051697418093681335, 0.048640698194503784, 0.16190262138843536, 0.07743373513221741, -0.04540489614009857, 0.000863497203681618, 0.05378114804625511, 0.08270050585269928, 0.24506022036075592, -0.0621301494538784, 0.13748610019683838, -0.026507433503866196, -0.16775651276111603, 0.0646020770072937, 0.06864825636148453, 0.09858483076095581, 0.06099119782447815, 0.10426733642816544, 0.007944096811115742, 0.017886226996779442, 0.12701360881328583, 0.07694946974515915, 0.01239494327455759, -0.03690680116415024, -0.0423397496342659, -0.03563550487160683, -0.011839799582958221, 0.036417026072740555, 0.04945746064186096, -0.1659635752439499, -0.02557341940701008, 0.00849960744380951, 0.05638359859585762, 0.02280269004404545, 0.09184551239013672, -0.18712642788887024, -0.015488875098526478, 0.06965582817792892, -0.008097166195511818, -0.11557783931493759, 0.09033557772636414, 0.0010952698066830635, -0.11290615051984787, 0.03304382413625717, -0.02719421498477459, 0.1312606930732727, -0.09154918044805527, 0.07375496625900269, -0.12279841303825378, -0.039067674428224564, -0.011441050097346306, 0.12701736390590668, -0.2892184853553772, 0.20540407299995422, -0.009327992796897888, -0.04721532762050629, -0.10816318541765213, -0.0166139118373394, 0.02918194606900215, 0.1008835881948471, 0.11635888367891312, -0.018317503854632378, -0.03041737899184227, 0.05055782571434975, -0.07037102431058884, 0.0320945642888546, 0.10094740241765976, -0.07129764556884766, -0.008443494327366352, -0.05018338933587074, 0.004331748466938734, 0.009311215952038765, -0.11870677024126053, 0.022950246930122375, -0.18725070357322693, 0.08033385127782822, 0.0739237442612648, 0.09259399771690369, 0.03737480565905571, -0.02790791541337967, -0.07977753132581711, 0.2550199627876282, 0.00549525860697031, -0.09659118205308914, -0.10687055438756943, 0.012265050783753395, 0.054912131279706955, -0.07758218050003052, -0.014319623820483685, -0.07215062528848648, 0.04044594615697861, -0.06386993825435638, -0.18098653852939606, 0.12059667706489563, -0.09730299562215805, -0.03764447569847107, -0.036971818655729294, 0.21454687416553497, -0.03276537358760834, 0.018821733072400093, 0.04205634072422981, -0.006420786492526531, -0.11755988746881485, -0.11483851075172424, 0.0026741090696305037, -0.007381222676485777, 0.0222337543964386, 0.030567297711968422, -0.030579399317502975, -0.004651008173823357, -0.06769606471061707, -0.018655773252248764, 0.3284361958503723, 0.14012615382671356, -0.042728498578071594, 0.14557716250419617, 0.108784981071949, -0.06586406379938126, -0.29363498091697693, -0.10999730974435806, -0.08354075998067856, -0.0568871833384037, -0.08512935787439346, -0.17736168205738068, 0.08213600516319275, -0.054124701768159866, -0.012475022114813328, 0.08978164196014404, -0.253679096698761, -0.10242746025323868, 0.21005234122276306, -0.028505340218544006, 0.4199415445327759, -0.10873056203126907, -0.08496208488941193, -0.04818785935640335, -0.13633574545383453, 0.18631918728351593, 0.005370096769183874, 0.10498684644699097, -0.002517115557566285, 0.19696369767189026, 0.057973869144916534, -0.007568690460175276, 0.07672296464443207, 0.018410617485642433, -0.05276210233569145, -0.09497855603694916, -0.0944344773888588, -0.028905028477311134, 0.011441238224506378, 0.033599358052015305, -0.06148914620280266, 0.04559621214866638, -0.13501228392124176, -0.053870789706707, -0.08265447616577148, 0.035316914319992065, 0.02645695023238659, -0.07149342447519302, 
-0.009650390595197678, -0.048362117260694504, 0.0014440747909247875, 0.005096945445984602, 0.208197221159935, -0.1052820011973381, 0.1382274031639099, 0.043538354337215424, 0.15203367173671722, -0.08731967955827713, -0.04129652678966522, -0.05997578427195549, -0.0520259328186512, 0.07111673802137375, -0.12494954466819763, 0.032224684953689575, 0.1104445531964302, -0.027098441496491432, 0.08857349306344986, 0.11257883906364441, -0.01701214164495468, 0.005912725813686848, 0.09007294476032257, -0.2481793910264969, -0.07843498885631561, -0.08273200690746307, 0.04688714072108269, 0.05852865055203438, 0.10027871280908585, 0.20555686950683594, 0.002952883718535304, -0.024905890226364136, 0.020373962819576263, 0.028153683990240097, -0.022673698142170906, 0.06590735167264938, 0.012659834697842598, 0.03057236224412918, -0.14725670218467712, 0.05152782425284386, -0.010443867184221745, -0.09752604365348816, 0.025633329525589943, 0.14957043528556824, -0.11054875701665878, -0.1212187260389328, -0.03411915898323059, 0.1427105963230133, -0.15329577028751373, -0.012512874789536, -0.04996157065033913, -0.12764443457126617, 0.06936612725257874, 0.10225962102413177, 0.045239366590976715, 0.04039812833070755, -0.09208659827709198, -0.028230223804712296, -0.05761025473475456, 0.0017709628446027637, 0.03482883796095848, -0.0206970926374197, -0.05629501864314079, 0.0343213751912117, -0.03532790765166283, 0.12593966722488403, -0.08570976555347443, -0.1002461165189743, -0.1670316904783249, 0.03975862264633179, -0.07684562355279922, -0.08851262927055359, -0.09018424898386002, -0.038247428834438324, 0.003083163406699896, -0.037343960255384445, -0.031633976846933365, -0.03941403329372406, -0.11623693257570267, 0.03392031043767929, -0.04468661546707153, 0.008214011788368225, -0.07037744671106339, 0.0282769575715065, 0.0551554374396801, -0.02738128788769245, 0.1493792086839676, 0.14054113626480103, -0.10579223930835724, 0.09399647265672684, -0.14710433781147003, -0.06872136145830154, 0.09804587811231613, 0.018653327599167824, 0.054240599274635315, 0.04268227517604828, 0.007165399845689535, 0.046455010771751404, 0.05903632938861847, 0.039731714874506, 0.015779580920934677, -0.07858280837535858, 0.0632837563753128, -0.05867265537381172, -0.10969637334346771, -0.05310700833797455, -0.004056182224303484, 0.01613820157945156, 0.06698485463857651, 0.09732817858457565, -0.05417533591389656, 0.09726493060588837, -0.05756323039531708, 0.044696684926748276, 0.022041747346520424, -0.17496995627880096, 0.03596636280417442, -0.09091048687696457, 0.05007437616586685, 0.010267782025039196, 0.18233734369277954, 0.018474919721484184, -0.012619992718100548, 0.023284850642085075, 0.07080445438623428, 0.04552442952990532, -0.009183981455862522, 0.19343984127044678, 0.10242828726768494, -0.044018618762493134, -0.08738429844379425, 0.10163026303052902, 0.04349548742175102, 0.052731748670339584, 0.15045559406280518, -0.05517276003956795, -0.03464581444859505, 0.08031102269887924, -0.008727089501917362, 0.006741922814399004, -0.10563398152589798, -0.1353350430727005, -0.01793110929429531, 0.03988603129982948, -0.02969387173652649, 0.10387176275253296, 0.1586938202381134, -0.012117670848965645, 0.02041335217654705, -0.019340544939041138, -0.06282396614551544, -0.1992274671792984, -0.19432158768177032, -0.08128879219293594, -0.13820840418338776, 0.006095437798649073, -0.14042456448078156, 0.04458213970065117, 0.0312360767275095, 0.09814401715993881, -0.05305533856153488, 0.05867365375161171, 0.03604401648044586, -0.11038713902235031, 
0.056459859013557434, -0.03799363970756531, 0.09102023392915726, -0.038950927555561066, 0.010997306555509567, -0.06946901232004166, 0.03109005279839039, 0.012545717880129814, 0.04088572785258293, -0.035878438502550125, 0.019293351098895073, -0.12212494760751724, -0.08936943858861923, -0.06681713461875916, 0.06648603826761246, 0.007211015094071627, 0.1723766326904297, 0.014598800800740719, -0.029322393238544464, 0.029859323054552078, 0.23865127563476562, -0.07857538014650345, -0.09254417568445206, -0.06842808425426483, 0.21690116822719574, -0.00620760815218091, 0.090630903840065, -0.038229696452617645, 0.01051494013518095, -0.08591272681951523, 0.35651057958602905, 0.2949316203594208, -0.09504411369562149, 0.010222222656011581, -0.004256130196154118, 0.041484516113996506, 0.12805554270744324, 0.09380204230546951, 0.10819242149591446, 0.2881949543952942, -0.07485105097293854, -0.030435338616371155, -0.01491188257932663, -0.02712395042181015, -0.05097600445151329, 0.06670726090669632, 0.06313516944646835, -0.06422963738441467, -0.020813165232539177, 0.12473512440919876, -0.25889691710472107, 0.06485335528850555, -0.15243640542030334, -0.163478285074234, -0.07279390841722488, 0.002048675436526537, 0.09454232454299927, 0.014270401559770107, 0.09511784464120865, -0.012610796838998795, -0.06932329386472702, 0.04089910164475441, 0.022833727300167084, -0.21507444977760315, 0.013424748554825783, 0.06894704699516296, -0.036393385380506516, -0.043789900839328766, -0.016323184594511986, 0.06653105467557907, 0.09072965383529663, 0.03182951360940933, -0.02560192532837391, 0.0410161018371582, -0.004507903009653091, -0.07601465284824371, 0.038169149309396744, 0.025504540652036667, 0.006378805730491877, -0.09310044348239899, 0.07619175314903259, -0.16966107487678528, 0.034948136657476425, -0.003448901465162635, -0.04513508081436157, -0.01649547927081585, 0.026924386620521545, -0.06217795982956886, 0.08732637763023376, 0.08358006924390793, -0.016879426315426826, -0.015885312110185623, -0.023426644504070282, -0.008602092042565346, -0.02419489063322544, -0.0728517547249794, -0.09037132561206818, -0.1558411866426468, -0.1263885647058487, 0.0975518599152565, -0.0010766644263640046, -0.20942999422550201, 0.029067600145936012, -0.11550918221473694, 0.045085564255714417, -0.12162791192531586, 0.10203573107719421, 0.0827576294541359, 0.025336498394608498, -0.006597404833883047, 0.004998429212719202, 0.032340891659259796, 0.07994244992733002, -0.1367950290441513, -0.0885855033993721 ]
null
null
transformers
# ELECTRA

## Introduction
**ELECTRA** is a method for self-supervised language representation learning. It can be used to pre-train transformer networks using relatively little compute. ELECTRA models are trained to distinguish "real" input tokens from "fake" input tokens generated by another neural network, similar to the discriminator of a [GAN](https://arxiv.org/pdf/1406.2661.pdf). At small scale, ELECTRA achieves strong results even when trained on a single GPU. At large scale, ELECTRA achieves state-of-the-art results on the [SQuAD 2.0](https://rajpurkar.github.io/SQuAD-explorer/) dataset.

Electra-base-vn is trained on more than 148 GB of text with a maximum sequence length of 512.

You can download the TensorFlow version at [Electra base TF version](https://drive.google.com/drive/folders/1hN0LiOlMfNDDQVo2bgEYHd03I-xXDLVr?usp=sharing).

### Contact information
For personal communication related to this project, please contact Nha Nguyen Van ([email protected]).
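The card does not include a usage snippet; the following is a minimal sketch, assuming the repository ships a compatible tokenizer and standard ELECTRA (PyTorch) weights, with an illustrative Vietnamese sentence that is not part of the original card:

```py
# Minimal sketch (assumption): use NlpHUST/electra-base-vn as a feature extractor.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("NlpHUST/electra-base-vn")
model = AutoModel.from_pretrained("NlpHUST/electra-base-vn")

text = "Hà Nội là thủ đô của Việt Nam."  # illustrative sentence, not from the card
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One contextual embedding per input token: (batch, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```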
{}
null
NlpHUST/electra-base-vn
[ "transformers", "pytorch", "electra", "pretraining", "arxiv:1406.2661", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "1406.2661" ]
[]
TAGS #transformers #pytorch #electra #pretraining #arxiv-1406.2661 #endpoints_compatible #region-us
# ELECTRA ## Introduction ELECTRA is a method for self-supervised language representation learning. It can be used to pre-train transformer networks using relatively little compute. ELECTRA models are trained to distinguish "real" input tokens vs "fake" input tokens generated by another neural network, similar to the discriminator of a GAN. At small scale, ELECTRA achieves strong results even when trained on a single GPU. At large scale, ELECTRA achieves state-of-the-art results on the SQuAD 2.0 dataset. Electra-base-vn is trained on more 148gb text with max length 512. You can download tensorflow version at Electra base TF version ### Contact information For personal communication related to this project, please contact Nha Nguyen Van (nha282@URL).
[ "# ELECTRA", "## Introduction\nELECTRA is a method for self-supervised language representation learning. It can be used to pre-train transformer networks using relatively little compute. ELECTRA models are trained to distinguish \"real\" input tokens vs \"fake\" input tokens generated by another neural network, similar to the discriminator of a GAN. At small scale, ELECTRA achieves strong results even when trained on a single GPU. At large scale, ELECTRA achieves state-of-the-art results on the SQuAD 2.0 dataset.\n\nElectra-base-vn is trained on more 148gb text with max length 512.\n\nYou can download tensorflow version at Electra base TF version", "### Contact information\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ "TAGS\n#transformers #pytorch #electra #pretraining #arxiv-1406.2661 #endpoints_compatible #region-us \n", "# ELECTRA", "## Introduction\nELECTRA is a method for self-supervised language representation learning. It can be used to pre-train transformer networks using relatively little compute. ELECTRA models are trained to distinguish \"real\" input tokens vs \"fake\" input tokens generated by another neural network, similar to the discriminator of a GAN. At small scale, ELECTRA achieves strong results even when trained on a single GPU. At large scale, ELECTRA achieves state-of-the-art results on the SQuAD 2.0 dataset.\n\nElectra-base-vn is trained on more 148gb text with max length 512.\n\nYou can download tensorflow version at Electra base TF version", "### Contact information\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ 35, 4, 161, 25 ]
[ "passage: TAGS\n#transformers #pytorch #electra #pretraining #arxiv-1406.2661 #endpoints_compatible #region-us \n# ELECTRA## Introduction\nELECTRA is a method for self-supervised language representation learning. It can be used to pre-train transformer networks using relatively little compute. ELECTRA models are trained to distinguish \"real\" input tokens vs \"fake\" input tokens generated by another neural network, similar to the discriminator of a GAN. At small scale, ELECTRA achieves strong results even when trained on a single GPU. At large scale, ELECTRA achieves state-of-the-art results on the SQuAD 2.0 dataset.\n\nElectra-base-vn is trained on more 148gb text with max length 512.\n\nYou can download tensorflow version at Electra base TF version### Contact information\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ -0.0723825991153717, 0.046114757657051086, -0.003251203102990985, 0.07239015400409698, 0.1745908409357071, -0.019042234867811203, 0.023605965077877045, 0.03920502960681915, -0.0383501835167408, 0.04200894013047218, 0.07878091186285019, -0.028004733845591545, 0.04983018338680267, 0.10540637373924255, 0.05178553983569145, -0.175580233335495, -0.00970486644655466, 0.1070457175374031, -0.13843640685081482, 0.0036835046485066414, 0.11055770516395569, -0.09782359004020691, 0.08730607479810715, 0.03572941944003105, -0.081732377409935, 0.0817737728357315, -0.027618110179901123, -0.11862999945878983, 0.1433086097240448, 0.06655023992061615, 0.13294360041618347, -0.01777755469083786, 0.06298130750656128, -0.0669177919626236, 0.03277917578816414, 0.05327528342604637, -0.015692802146077156, 0.04860610142350197, -0.025034116581082344, 0.06922081857919693, 0.24970752000808716, -0.052051953971385956, 0.03380917012691498, -0.02852759137749672, -0.037489477545022964, -0.2275393307209015, 0.02747930958867073, 0.033455245196819305, 0.10210836678743362, 0.10135309398174286, 0.015138257294893265, 0.1783701330423355, 0.0114907156676054, 0.044906746596097946, 0.10506467521190643, -0.2349415123462677, -0.019060583785176277, 0.07639070600271225, 0.0071138315834105015, 0.06295306980609894, -0.01663440838456154, 0.006022976245731115, 0.06362716853618622, 0.08779080957174301, 0.11884161084890366, -0.0144638167694211, -0.1023121178150177, 0.07499571889638901, -0.12796273827552795, -0.012516548857092857, 0.21111023426055908, -0.00817190296947956, 0.018485315144062042, 0.036512475460767746, -0.12262807786464691, -0.09382487088441849, 0.04282744973897934, -0.06663653999567032, -0.07427758723497391, 0.0035990606993436813, -0.0903259739279747, -0.0042200274765491486, -0.13902273774147034, -0.03358951956033707, -0.09609349071979523, 0.08242156356573105, 0.07181919366121292, 0.10561670362949371, -0.08466518670320511, 0.05688369274139404, -0.0010250975610688329, -0.05006027594208717, 0.01252636220306158, -0.04218710586428642, -0.10913051664829254, 0.004485323093831539, -0.04146834462881088, -0.16100414097309113, 0.04463081806898117, 0.05912986770272255, -0.005982306320220232, 0.007229064125567675, 0.09031366556882858, 0.020021256059408188, 0.06011185422539711, 0.03769468888640404, -0.15396495163440704, 0.07758324593305588, 0.041416484862565994, -0.10154280066490173, 0.019690271466970444, -0.014619872905313969, -0.13448113203048706, -0.02243490144610405, 0.05696564540266991, 0.010218264535069466, -0.13268601894378662, 0.00924841407686472, -0.0015505796764045954, -0.03261973336338997, 0.020690228790044785, -0.0351271778345108, -0.040322672575712204, -0.02274499274790287, -0.06723306328058243, 0.12692199647426605, 0.046507976949214935, -0.0358591191470623, -0.07019637525081635, -0.009542448446154594, -0.0565832294523716, -0.11645883321762085, -0.0865120068192482, -0.06872035562992096, 0.04578607156872749, -0.009890195913612843, 0.045380670577287674, -0.2603296935558319, -0.14172996580600739, 0.06851857900619507, 0.0909363180398941, -0.032407328486442566, -0.035355184227228165, -0.00024582247715443373, 0.004339858889579773, 0.006416942924261093, -0.034527167677879333, 0.039190810173749924, -0.011429885402321815, 0.017652569338679314, 0.08056189119815826, 0.13484732806682587, -0.11270344257354736, 0.026821576058864594, 0.002418386284261942, -0.025151612237095833, -0.024432776495814323, -0.03227613493800163, -0.04731869325041771, -0.00718824565410614, -0.09555716812610626, -0.0862213522195816, -0.030761132016777992, 
-0.006880965083837509, 0.1448175013065338, -0.014959215186536312, -0.07807312160730362, -0.036391716450452805, 0.0701204165816307, -0.13410480320453644, -0.08660818636417389, 0.19460099935531616, -0.03857986629009247, -0.009034917689859867, 0.08942068368196487, -0.03923870250582695, 0.10586004704236984, -0.1792643666267395, -0.03091621771454811, 0.07888142019510269, -0.09531233459711075, -0.19097420573234558, 0.04675154760479927, 0.1656729280948639, -0.08524897694587708, 0.02463393285870552, 0.046175360679626465, 0.16336669027805328, -0.09814808517694473, -0.029746361076831818, -0.021540256217122078, -0.09241349995136261, 0.01171606034040451, 0.027477720752358437, 0.055324457585811615, 0.015142262913286686, -0.06306348741054535, 0.030349217355251312, 0.13769617676734924, -0.07119736820459366, 0.011649778112769127, -0.11205201596021652, 0.03755888342857361, -0.08204279094934464, 0.01573634333908558, -0.09753139317035675, -0.020547974854707718, 0.01643824577331543, 0.07577181607484818, 0.013027772307395935, 0.1318485587835312, -0.002465766156092286, 0.01762566529214382, -0.0284001212567091, 0.015025129541754723, -0.07112734019756317, -0.014733994379639626, -0.012240206822752953, 0.005374961066991091, -0.013102034106850624, -0.04050911217927933, -0.10505567491054535, -0.13034605979919434, -0.014592363499104977, 0.01553260162472725, 0.0679815262556076, 0.0024054362438619137, 0.06269282847642899, -0.03676546365022659, 0.06512230634689331, -0.05457272753119469, -0.038153037428855896, 0.010227573104202747, 0.04560686647891998, -0.03890209272503853, -0.02094835229218006, -0.05270831286907196, 0.0927199199795723, 0.05956288054585457, -0.04396600276231766, -0.039866164326667786, 0.14432616531848907, -0.0022656365763396025, -0.005592028144747019, -0.12294580787420273, -0.02151419036090374, 0.17346186935901642, -0.01672792062163353, 0.12278138101100922, -0.06914007663726807, 0.06585165858268738, 0.08003511279821396, -0.0490327812731266, 0.05804912745952606, 0.05983711779117584, 0.19222430884838104, -0.07886046916246414, 0.01655612885951996, -0.10690641403198242, -0.1582254022359848, 0.15660718083381653, 0.002215887187048793, -0.07214361429214478, -0.04257392883300781, -0.030782608315348625, 0.02913387306034565, 0.08930289000272751, 0.00970467459410429, -0.0020292813424021006, 0.02704755775630474, -0.02536020055413246, 0.062406428158283234, -0.09877370297908783, 0.02183140628039837, -0.007128224708139896, 0.0045210192911326885, 0.10536831617355347, 0.07811248302459717, -0.07649222761392593, 0.06635274738073349, 0.01797434501349926, -0.10426203906536102, 0.01702856831252575, 0.030822379514575005, -0.05402375012636185, 0.18448276817798615, -0.07273939996957779, -0.317205011844635, -0.20644818246364594, 0.07391496002674103, 0.023476002737879753, 0.025867652148008347, 0.01058118510991335, -0.08922994881868362, -0.09797129034996033, -0.009874722920358181, 0.040184348821640015, -0.07008714228868484, 0.050526391714811325, 0.0943230465054512, 0.020447982475161552, -0.02369650825858116, -0.1138664036989212, 0.03759388253092766, -0.05416267365217209, -0.04285665228962898, 0.10836854577064514, -0.0827813595533371, 0.059023667126894, 0.1303764134645462, -0.05134500935673714, 0.047122035175561905, -0.005917912814766169, 0.11829134076833725, -0.018104836344718933, 0.06346611678600311, 0.18705031275749207, 0.055093828588724136, 0.0365581139922142, 0.0007150521269068122, 0.012461978010833263, -0.05604897066950798, 0.09070929139852524, -0.010751594789326191, -0.1283515989780426, -0.1930762678384781, 
-0.10854317247867584, -0.04596436768770218, 0.00753316143527627, 0.05986303463578224, 0.05655240640044212, 0.008867661468684673, 0.060562554746866226, 0.10798588395118713, 0.06175803765654564, -0.10873101651668549, 0.0937206894159317, 0.16264772415161133, 0.05994148179888725, 0.039356786757707596, -0.10242602974176407, -0.048330191522836685, 0.08888701349496841, 0.062268465757369995, 0.28040677309036255, -0.05457792803645134, 0.11728747934103012, 0.02783888205885887, 0.08108563721179962, 0.11708705127239227, 0.09364546835422516, -0.03379134461283684, -0.02931712009012699, -0.023957129567861557, 0.022473137825727463, 0.03407805785536766, 0.0065448409877717495, -0.06917104870080948, -0.02883956953883171, -0.04588010907173157, 0.14674726128578186, -0.024692850187420845, 0.10693832486867905, -0.00027883052825927734, -0.291249543428421, -0.08124177157878876, -0.0733339935541153, -0.059286825358867645, -0.06991861760616302, 0.06811606138944626, 0.17250783741474152, -0.04875074326992035, -0.005150691606104374, -0.05031862482428551, 0.056315116584300995, -0.100368931889534, -0.00007802804611856118, -0.02956552244722843, 0.06859475374221802, 0.025157852098345757, 0.07311946898698807, -0.153847336769104, 0.1617041379213333, -0.0264055784791708, 0.05481385439634323, 0.010713513009250164, -0.033900611102581024, -0.04469282180070877, 0.04901634156703949, 0.014384686946868896, 0.018884852528572083, -0.17681151628494263, -0.08319471776485443, 0.01780286431312561, 0.08877339959144592, -0.03746629133820534, 0.08583671599626541, 0.03720073774456978, 0.014428515918552876, 0.007469093427062035, 0.018745386973023415, 0.03949901461601257, -0.0679912343621254, -0.09727112948894501, -0.03197847306728363, 0.14927475154399872, -0.009763691574335098, -0.023258216679096222, -0.04926127940416336, -0.02966230921447277, 0.008857249282300472, -0.04284257814288139, -0.06627900153398514, -0.07738225162029266, 0.11514600366353989, 0.08959195762872696, -0.10659381747245789, 0.08520930260419846, 0.015626702457666397, 0.08540601283311844, -0.034978609532117844, -0.11101030558347702, 0.02964920364320278, -0.0781736969947815, -0.03302983194589615, 0.020484481006860733, 0.015209720470011234, 0.15811467170715332, -0.047807253897190094, 0.03255440667271614, -0.053977835923433304, -0.081588014960289, -0.07487276196479797, 0.0030048107728362083, 0.03440859913825989, 0.012089725583791733, -0.02884894795715809, 0.0236821286380291, -0.0692654550075531, 0.032496128231287, -0.060885217040777206, 0.12634263932704926, 0.03143087401986122, -0.05524350330233574, 0.01363141369074583, 0.22055914998054504, -0.056659698486328125, -0.16814079880714417, -0.0933198556303978, 0.07391321659088135, 0.05094955116510391, -0.06186001002788544, -0.23032774031162262, 0.13311465084552765, 0.015154492110013962, 0.0070924581959843636, -0.060916684567928314, -0.18008913099765778, -0.04533814266324043, 0.12074191868305206, 0.03589753061532974, 0.31924837827682495, -0.0438978336751461, 0.05814464017748833, -0.051460713148117065, -0.09768310934305191, 0.1872996687889099, -0.07749827206134796, 0.07752427458763123, -0.023140747100114822, 0.12540625035762787, -0.014681599102914333, -0.025097670033574104, 0.06454408913850784, 0.041185833513736725, 0.052158765494823456, 0.01034210342913866, 0.06389261782169342, 0.09127573668956757, -0.011843662708997726, 0.1278710663318634, -0.01965486817061901, 0.058331213891506195, -0.1213582307100296, -0.08957404643297195, -0.02409506030380726, 0.05747954547405243, 0.07918206602334976, -0.09077493101358414, -0.11163950711488724, 
0.008894631639122963, 0.0368112251162529, 0.012131867930293083, 0.043693624436855316, 0.07373229414224625, 0.003152919700369239, 0.10896620154380798, 0.04032072052359581, -0.0890657901763916, -0.026992419734597206, -0.003669701050966978, -0.021309329196810722, 0.1648237705230713, -0.16175858676433563, 0.0692332312464714, 0.12891016900539398, -0.059746503829956055, -0.0002335145982215181, 0.04649580270051956, -0.11190952360630035, 0.029677970334887505, 0.0840022936463356, -0.06882385164499283, -0.005523166619241238, -0.042686913162469864, 0.1401405781507492, 0.07045019418001175, 0.1548795849084854, 0.1114446148276329, -0.10856106132268906, -0.03067297302186489, 0.006537249311804771, -0.02922138199210167, -0.04924686625599861, 0.0654904693365097, 0.06308204680681229, 0.05650502070784569, -0.020221175625920296, 0.05852283164858818, 0.05744258314371109, -0.10749763250350952, 0.08254855871200562, 0.0366668775677681, -0.12982653081417084, -0.07156851142644882, 0.03550441563129425, 0.10560613870620728, -0.20031310617923737, -0.17315128445625305, -0.10822644084692001, -0.008594431914389133, 0.06738721579313278, 0.180136039853096, 0.073577880859375, -0.02920921891927719, -0.07534660398960114, 0.044281069189310074, -0.11573253571987152, 0.0542609840631485, 0.05687032267451286, 0.026282543316483498, -0.1131124347448349, 0.06201452016830444, 0.02611149474978447, 0.08791138976812363, -0.0594988577067852, -0.0748489573597908, -0.08032714575529099, 0.0747918039560318, -0.11756785213947296, 0.029185030609369278, -0.005524443928152323, -0.037692707031965256, 0.01282354723662138, 0.008745311759412289, -0.051987409591674805, 0.0624062679708004, -0.030272798612713814, 0.09474016726016998, 0.002249050885438919, -0.01860308088362217, -0.0012098881416022778, -0.0423925444483757, -0.007515480276197195, -0.014289576560258865, 0.08753084391355515, -0.04923059046268463, -0.02207096666097641, 0.05608057230710983, -0.0803922712802887, 0.018291229382157326, 0.05585632473230362, 0.07355328649282455, 0.010261432267725468, -0.06287992745637894, 0.08585794270038605, 0.022671258077025414, 0.07211253046989441, -0.04529033973813057, -0.03628677502274513, -0.06000812351703644, 0.047035735100507736, -0.08795768767595291, -0.011430799029767513, -0.10390851646661758, -0.010088932700455189, -0.006321408320218325, 0.15849529206752777, 0.06662097573280334, -0.08043352514505386, -0.04602598026394844, -0.0007851665141060948, 0.011841376312077045, 0.001605889410711825, 0.009620868600904942, 0.015586658380925655, -0.0584959015250206, 0.004083050414919853, -0.019032174721360207, 0.11811819672584534, -0.004167630802839994, -0.02815016359090805, -0.027906112372875214, -0.015703000128269196, -0.13495872914791107, -0.011007403954863548, 0.20430314540863037, 0.03572310134768486, 0.02216363698244095, -0.07494399696588516, 0.037025392055511475, 0.06780419498682022, 0.19900508224964142, 0.10102500021457672, -0.007590571418404579, 0.020914727821946144, 0.05813825875520706, -0.034381669014692307, -0.00292969006113708, -0.0823187381029129, -0.03208328038454056, -0.11811038106679916, 0.01104251854121685, 0.011258010752499104, 0.02734517864882946, 0.17485132813453674, -0.01637408509850502, 0.006187134422361851, -0.011078596115112305, -0.048317283391952515, -0.08639732003211975, -0.1817142814397812, -0.06778288632631302, -0.1942286193370819, 0.028013283386826515, -0.07297094911336899, -0.06882220506668091, 0.1069207414984703, 0.058547135442495346, -0.04822442680597305, 0.02809867635369301, 0.059900909662246704, -0.023722397163510323, 
-0.017995111644268036, -0.011727044358849525, -0.02344416454434395, 0.046544067561626434, 0.04592094570398331, -0.04056069999933243, 0.031362369656562805, 0.008227833546698093, 0.008107283152639866, 0.01205009687691927, 0.08814786374568939, 0.020966749638319016, 0.01245551835745573, -0.04643829166889191, -0.008064242079854012, -0.034526191651821136, 0.06700888276100159, 0.023486727848649025, -0.07971104234457016, 0.029775718227028847, 0.15925730764865875, -0.058002445846796036, -0.12510378658771515, -0.1930670291185379, 0.09898752719163895, 0.00042224672506563365, 0.04339013248682022, -0.003198012476786971, -0.007985478267073631, -0.156849667429924, 0.2082199901342392, 0.021431464701890945, 0.05809883773326874, -0.03509143367409706, -0.028172364458441734, -0.01743701845407486, -0.09775547683238983, 0.17235836386680603, 0.1362268477678299, 0.29886582493782043, -0.06358636915683746, -0.1058167815208435, -0.12080451101064682, 0.03775433823466301, -0.1358294039964676, -0.06698886305093765, -0.009967515245079994, -0.001580029376782477, -0.03262319788336754, 0.06074729189276695, -0.15876175463199615, -0.1414739191532135, 0.003271812340244651, 0.038650527596473694, -0.02349049784243107, -0.01516692154109478, 0.012428813613951206, -0.02799730934202671, 0.022238926962018013, -0.1003994569182396, 0.052906349301338196, 0.06990703195333481, 0.059929706156253815, -0.10493354499340057, -0.0004123502585571259, 0.11441842466592789, -0.004980756901204586, 0.013817095197737217, -0.050440456718206406, 0.13767677545547485, 0.10022933781147003, -0.006762501783668995, -0.06535141170024872, 0.106354720890522, -0.0034227899741381407, -0.0031460393220186234, 0.02739722840487957, 0.023385202512145042, -0.018490426242351532, 0.11019109189510345, -0.04275095462799072, -0.05489641800522804, -0.07625815272331238, 0.01992238499224186, 0.13366486132144928, -0.09110191464424133, 0.04864831641316414, -0.065587118268013, 0.08507168292999268, -0.022179050371050835, -0.03710286319255829, -0.04129350557923317, -0.07841823250055313, 0.08773829787969589, -0.006334621924906969, -0.008745891973376274, 0.021263817325234413, -0.11726544052362442, -0.00970851443707943, -0.0656818151473999, 0.03764098882675171, -0.14542724192142487, 0.02253718487918377, -0.05439091846346855, -0.040795326232910156, -0.03147115558385849, 0.05758339911699295, 0.07690723240375519, 0.010334022343158722, -0.062066324055194855, 0.06309133023023605, -0.07947622984647751, 0.011023514904081821, -0.14057573676109314, -0.12927067279815674 ]
null
null
transformers
# GPT-Neo-small for Vietnamese
First GPT for Vietnamese

## Model Description
GPT-Neo-vi-small is a transformer model designed using EleutherAI's replication of the GPT-3 architecture.

## Training data
GPT-Neo-vi-small was trained on the News datasets, a large-scale dataset created from news websites for the purpose of training this model.

### How to use
This example generates a different sequence each time it's run:

```py
from transformers import GPTNeoForCausalLM, GPT2Tokenizer

model = GPTNeoForCausalLM.from_pretrained("NlpHUST/gpt-neo-vi-small")
tokenizer = GPT2Tokenizer.from_pretrained("NlpHUST/gpt-neo-vi-small")

prompt = "Ngay sau Tết Nguyên đán Tân Sửu, hiện tượng giá đất tăng tại nhiều địa phương. Thị trường nhộn nhịp, tạo ra những cơn sóng sốt đất khó tin khiến bộ ngành, địa phương đưa cảnh báo."

input_ids = tokenizer(prompt, return_tensors="pt").input_ids
gen_tokens = model.generate(input_ids, do_sample=True, temperature=1.0, max_length=1024)
gen_text = tokenizer.batch_decode(gen_tokens)[0]
print(gen_text)
```

### Contact information
For personal communication related to this project, please contact Nha Nguyen Van ([email protected]).
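As a compact alternative to the snippet above, the following sketch runs the same kind of generation through the transformers `pipeline` API; it assumes the repository works with the `text-generation` pipeline declared in the model metadata, and the prompt is shortened from the card's example:

```py
# Sketch (assumption): same generation via the high-level pipeline API.
from transformers import pipeline

generator = pipeline("text-generation", model="NlpHUST/gpt-neo-vi-small")

# Prompt shortened from the card's example for brevity.
prompt = "Ngay sau Tết Nguyên đán Tân Sửu, hiện tượng giá đất tăng tại nhiều địa phương."
result = generator(prompt, do_sample=True, temperature=1.0, max_length=256)
print(result[0]["generated_text"])
```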
{"language": "vi", "tags": ["vi", "vietnamese", "text-generation", "gpt3", "lm", "nlp"], "datasets": ["vietnamese"], "widget": [{"text": "Vi\u1ec7t Nam l\u00e0 qu\u1ed1c gia c\u00f3"}], "pipeline_tag": "text-generation"}
text-generation
NlpHUST/gpt-neo-vi-small
[ "transformers", "pytorch", "gpt_neo", "text-generation", "vi", "vietnamese", "gpt3", "lm", "nlp", "dataset:vietnamese", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "vi" ]
TAGS #transformers #pytorch #gpt_neo #text-generation #vi #vietnamese #gpt3 #lm #nlp #dataset-vietnamese #autotrain_compatible #endpoints_compatible #region-us
# GPT-Neo-small for vietnamese First GPT for vietnamese ## Model Description GPT-Neo-vi-small is a transformer model designed using EleutherAI's replication of the GPT-3 architecture. ## Training data GPT-Neo-vi-smal was trained on the News datasets, a large scale dataset created by from News Website for the purpose of training this model. ### How to use his example generates a different sequence each time it's run: ### Contact information For personal communication related to this project, please contact Nha Nguyen Van (nha282@URL).
[ "# GPT-Neo-small for vietnamese\nFirst GPT for vietnamese", "## Model Description\nGPT-Neo-vi-small is a transformer model designed using EleutherAI's replication of the GPT-3 architecture.", "## Training data\nGPT-Neo-vi-smal was trained on the News datasets, a large scale dataset created by from News Website for the purpose of training this model.", "### How to use\nhis example generates a different sequence each time it's run:", "### Contact information\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ "TAGS\n#transformers #pytorch #gpt_neo #text-generation #vi #vietnamese #gpt3 #lm #nlp #dataset-vietnamese #autotrain_compatible #endpoints_compatible #region-us \n", "# GPT-Neo-small for vietnamese\nFirst GPT for vietnamese", "## Model Description\nGPT-Neo-vi-small is a transformer model designed using EleutherAI's replication of the GPT-3 architecture.", "## Training data\nGPT-Neo-vi-smal was trained on the News datasets, a large scale dataset created by from News Website for the purpose of training this model.", "### How to use\nhis example generates a different sequence each time it's run:", "### Contact information\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ 61, 18, 36, 41, 21, 25 ]
[ "passage: TAGS\n#transformers #pytorch #gpt_neo #text-generation #vi #vietnamese #gpt3 #lm #nlp #dataset-vietnamese #autotrain_compatible #endpoints_compatible #region-us \n# GPT-Neo-small for vietnamese\nFirst GPT for vietnamese## Model Description\nGPT-Neo-vi-small is a transformer model designed using EleutherAI's replication of the GPT-3 architecture.## Training data\nGPT-Neo-vi-smal was trained on the News datasets, a large scale dataset created by from News Website for the purpose of training this model.### How to use\nhis example generates a different sequence each time it's run:### Contact information\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ -0.01144811324775219, 0.06837336719036102, -0.00038015248719602823, 0.07340168952941895, 0.10145556181669235, -0.037332575768232346, 0.0593113973736763, 0.09565957635641098, -0.0983441174030304, -0.04910653457045555, 0.1449500471353531, 0.05160736292600632, 0.019582023844122887, 0.13798461854457855, 0.0835631936788559, -0.29196617007255554, 0.03167889267206192, 0.11255794018507004, -0.013443797826766968, 0.08186313509941101, 0.0705915167927742, -0.06170325726270676, 0.15549905598163605, 0.03763346001505852, -0.09655623883008957, 0.0239451602101326, -0.11716701835393906, -0.08244801312685013, 0.1155022531747818, -0.0069166189059615135, -0.028811462223529816, 0.014735736884176731, 0.0829920619726181, -0.02046731300652027, 0.015992071479558945, 0.04197127744555473, -0.030720120295882225, 0.027168909087777138, -0.028528448194265366, 0.07008787989616394, 0.2875434160232544, -0.13042482733726501, -0.0007464703521691263, -0.022724159061908722, -0.10667125880718231, -0.24057456851005554, -0.03467873856425285, 0.021095845848321915, 0.13525842130184174, 0.1267070174217224, -0.02263065055012703, 0.1969577968120575, -0.1240244060754776, 0.005515590310096741, 0.08981624245643616, -0.2848486602306366, -0.04599207639694214, 0.1467505246400833, 0.05074611306190491, 0.048940252512693405, -0.039935220032930374, 0.07107910513877869, 0.06539537012577057, 0.0877043604850769, 0.06882449984550476, -0.05573789030313492, -0.07983995974063873, 0.06828155368566513, -0.14996929466724396, 0.027488095685839653, 0.2369534969329834, -0.0027958503924310207, 0.03274897113442421, -0.025084707885980606, -0.004299186635762453, -0.03421402350068092, 0.01001317985355854, -0.1464715152978897, -0.027400318533182144, -0.0036763011012226343, 0.01129136886447668, -0.11098694801330566, -0.10699145495891571, -0.09346953779459, -0.1018841490149498, -0.08891310542821884, 0.03875086456537247, 0.02588805928826332, -0.07628844678401947, 0.08179675042629242, -0.12367691844701767, 0.017789453268051147, -0.045482534915208817, -0.05190946161746979, 0.016111386939883232, -0.025312907993793488, 0.03238702565431595, 0.0038501382805407047, 0.015234781429171562, -0.03438488394021988, 0.006157101131975651, -0.023654187098145485, 0.05177312344312668, 0.013137398287653923, 0.002376800635829568, 0.15198704600334167, -0.19032728672027588, 0.019009992480278015, 0.02350732870399952, -0.07285117357969284, -0.07430194318294525, 0.0133657930418849, -0.13319745659828186, -0.012378739193081856, 0.01974680833518505, -0.03802333399653435, -0.044315457344055176, 0.11915427446365356, 0.0129385432228446, -0.05308185890316963, 0.08890944719314575, -0.0022890125401318073, -0.014131210744380951, -0.06688864529132843, -0.0347423329949379, 0.11325064301490784, 0.06074725463986397, 0.02950618974864483, -0.07168525457382202, -0.017840450629591942, -0.04245101287961006, -0.07406215369701385, -0.05331851541996002, -0.03035474196076393, -0.004323958419263363, -0.020151525735855103, -0.03715578839182854, -0.09800060838460922, -0.2673204243183136, 0.0122604975476861, 0.03506210446357727, -0.039823874831199646, -0.06371685862541199, -0.12307129055261612, -0.014360432513058186, -0.019930917769670486, 0.021860316395759583, 0.04434364289045334, -0.03383413329720497, 0.03412555903196335, -0.023374121636152267, 0.15103086829185486, -0.08743232488632202, 0.07111426442861557, -0.0694996789097786, 0.0009564821957610548, -0.038540590554475784, 0.11461422592401505, -0.008772649802267551, -0.026181485503911972, -0.0789632722735405, -0.11964238435029984, 0.0037231259047985077, 
0.01070321537554264, -0.016513224691152573, 0.07828385382890701, -0.11204979568719864, -0.06926065683364868, 0.13009749352931976, -0.04879491776227951, -0.06878990679979324, 0.16658318042755127, 0.018166545778512955, 0.1469392627477646, 0.13120362162590027, 0.04572157561779022, 0.08242423832416534, -0.06584861129522324, 0.005111434031277895, 0.07342752069234848, -0.10076240450143814, -0.08419833332300186, 0.11512073874473572, 0.11701694875955582, -0.1533663272857666, 0.02402598224580288, -0.12470526993274689, 0.05227915197610855, -0.1283106803894043, -0.03965519368648529, 0.0551019087433815, -0.07143719494342804, 0.11702435463666916, -0.022174322977662086, 0.10800806432962418, 0.02438151277601719, -0.07822798192501068, 0.07998818159103394, 0.10056452453136444, -0.03742764890193939, -0.005819430574774742, -0.14631852507591248, -0.010144717060029507, -0.05324285477399826, 0.038467515259981155, -0.026130827143788338, 0.01206934079527855, 0.010305861942470074, 0.09348773956298828, 0.03942781686782837, 0.08398433774709702, 0.019763514399528503, 0.0351310633122921, -0.032719094306230545, 0.05335851386189461, -0.022406939417123795, 0.012754586525261402, -0.00014555671077687293, -0.04248006269335747, 0.060622233897447586, -0.004507857840508223, 0.048337072134017944, -0.1745106279850006, -0.024026844650506973, 0.03859236091375351, 0.010233231820166111, -0.0265195332467556, 0.06824685633182526, -0.005309690721333027, 0.09713298827409744, -0.0007151706959120929, -0.03827764093875885, 0.05318119376897812, -0.006875557359308004, -0.03515009954571724, 0.09874256700277328, 0.0017339882906526327, -0.009432148188352585, 0.09281757473945618, -0.10123169422149658, -0.04059695824980736, 0.12273003160953522, -0.03464316204190254, -0.0496244803071022, -0.06261525303125381, 0.0177009217441082, 0.16078995168209076, -0.032425496727228165, 0.13930600881576538, -0.051628660410642624, -0.0154116814956069, 0.018582843244075775, -0.03774222359061241, 0.013129083439707756, 0.10019508749246597, 0.13290388882160187, -0.09896715730428696, 0.1286374032497406, 0.043051742017269135, -0.042880769819021225, 0.13944502174854279, 0.08477532118558884, -0.01698877289891243, -0.059494223445653915, 0.0010004665236920118, -0.010872984305024147, -0.0018942535389214754, -0.16936102509498596, 0.007545010186731815, 0.030811632052063942, 0.04802289977669716, 0.06777234375476837, -0.09803436696529388, -0.07449494302272797, -0.009953350760042667, -0.0323658287525177, 0.05884872004389763, 0.10369373112916946, -0.048764634877443314, 0.0667206197977066, 0.07386844605207443, -0.00819086842238903, 0.03451252728700638, 0.0522017739713192, -0.05971304327249527, 0.19486664235591888, -0.07870030403137207, -0.4143986701965332, -0.09390896558761597, -0.10648689419031143, -0.04197016358375549, 0.017602063715457916, 0.03740941360592842, -0.24308650195598602, -0.03554452955722809, 0.000537657062523067, 0.08642145991325378, -0.0016376671846956015, 0.02001108229160309, 0.07027814537286758, -0.01590590737760067, -0.0912635400891304, -0.04770167917013168, -0.022436659783124924, -0.07387828081846237, -0.20066840946674347, 0.08001258969306946, -0.1432211697101593, 0.0009477776475250721, 0.10753989219665527, -0.01656796783208847, 0.06564424186944962, -0.011614350602030754, 0.18879522383213043, -0.08236972987651825, 0.023650309070944786, 0.1646421253681183, 0.09367211908102036, 0.03502979502081871, -0.00047786461072973907, 0.019183894619345665, -0.0693928524851799, 0.05725126713514328, 0.03914937749505043, -0.073793426156044, -0.15618064999580383, 
-0.13053098320960999, -0.05245744809508324, 0.048728253692388535, 0.042532991617918015, 0.0526832714676857, 0.10950449854135513, 0.07437893003225327, 0.07203295826911926, 0.21688735485076904, -0.043477702885866165, 0.05425581708550453, 0.1955071985721588, -0.0419028215110302, 0.011598686687648296, -0.05768735706806183, -0.12392280995845795, 0.12034547328948975, 0.011645713821053505, 0.1882905811071396, 0.005997767671942711, 0.02545807510614395, 0.0979849323630333, 0.07333502918481827, 0.055150553584098816, 0.029639676213264465, -0.09517472237348557, -0.03592521697282791, -0.04571866989135742, -0.07543011009693146, 0.09232746809720993, 0.031656328588724136, -0.07067455351352692, -0.04620233550667763, 0.04213167726993561, 0.10189017653465271, 0.018469691276550293, 0.14435511827468872, 0.09975915402173996, -0.31762248277664185, -0.030244557186961174, -0.04281140863895416, 0.07677172869443893, -0.049102671444416046, 0.05623317509889603, 0.11849755793809891, -0.15691885352134705, 0.06308964639902115, 0.021004391834139824, 0.08302411437034607, -0.11283458024263382, 0.03782053291797638, 0.0576813630759716, 0.006086353678256273, 0.029746001586318016, 0.12818259000778198, -0.2082700878381729, 0.20965655148029327, -0.04719938710331917, -0.013444795273244381, -0.0672212541103363, -0.04131750762462616, -0.028163498267531395, 0.15560004115104675, 0.14881718158721924, 0.05715635418891907, 0.09895680099725723, -0.06884347647428513, -0.13903860747814178, 0.04051622748374939, -0.05985967442393303, -0.0005116503452882171, -0.005762441083788872, -0.0026432895101606846, -0.0412369966506958, -0.0009515010751783848, 0.0017519379034638405, -0.08384472131729126, -0.05980755388736725, 0.024896841496229172, 0.07542029768228531, 0.04923389479517937, 0.021630389615893364, -0.09234587848186493, -0.048445381224155426, 0.17818571627140045, 0.05824120715260506, -0.05773645266890526, -0.04009352996945381, 0.031112562865018845, 0.03545450419187546, -0.057787973433732986, -0.004936655517667532, 0.03999006003141403, 0.033153824508190155, 0.04053909331560135, -0.08211960643529892, 0.0560867004096508, -0.039290715008974075, -0.06382496654987335, 0.024215666577219963, 0.046556323766708374, 0.14474183320999146, -0.01617857627570629, 0.054041702300310135, -0.03889615461230278, -0.04792839288711548, -0.13320288062095642, -0.017337163910269737, 0.15605631470680237, 0.032831717282533646, 0.0074532292783260345, -0.021916035562753677, -0.031789254397153854, -0.00878951232880354, -0.09571938216686249, 0.18762271106243134, 0.1061260998249054, -0.07302986830472946, 0.09275242686271667, 0.12704810500144958, -0.07621254771947861, -0.2406059354543686, -0.04822571948170662, 0.029463831335306168, 0.0008286628290079534, -0.06628275662660599, -0.1750878393650055, 0.1128065213561058, 0.07736431062221527, 0.02468244917690754, -0.11051853001117706, -0.2748074531555176, -0.09570010751485825, 0.13993285596370697, -0.025336258113384247, 0.22026202082633972, -0.007132188882678747, 0.015952005982398987, -0.04547896981239319, -0.10645012557506561, 0.24167434871196747, -0.02118063159286976, 0.06696364283561707, -0.06265681982040405, 0.09261514246463776, -0.004262154456228018, 0.0184345506131649, 0.12130393832921982, 0.015304800122976303, -0.01647298038005829, -0.09088142961263657, -0.03851791098713875, 0.1317913979291916, 0.019481895491480827, 0.13741588592529297, 0.06560703366994858, -0.000051946317398687825, -0.1493387669324875, -0.07615606486797333, -0.09434784948825836, -0.008557187393307686, 0.028405053541064262, -0.09037146717309952, 
-0.06475258618593216, 0.09107502549886703, 0.006099265068769455, -0.011627192609012127, -0.08082888275384903, 0.014165681786835194, 0.00876983068883419, -0.07488236576318741, 0.08903001248836517, -0.07924751192331314, 0.03910548612475395, -0.0325283408164978, -0.0034115477465093136, 0.10667251795530319, -0.11772304773330688, 0.003475307486951351, 0.1049085259437561, -0.03210040554404259, 0.11978516727685928, 0.05495676025748253, -0.07713180035352707, 0.02062717266380787, 0.10073481500148773, -0.12760894000530243, -0.16893553733825684, -0.09814739227294922, -0.0023963479325175285, 0.13206090033054352, 0.08922196179628372, 0.04389583319425583, -0.12934671342372894, -0.0604282021522522, 0.011516982689499855, -0.0015472507802769542, -0.10410037636756897, 0.0372239425778389, -0.04486160725355148, 0.02874484658241272, -0.07514747232198715, 0.015149935148656368, 0.12409152090549469, -0.05191267654299736, 0.03704570233821869, 0.13417066633701324, -0.1151655912399292, -0.056863270699977875, 0.11813536286354065, 0.1352299004793167, -0.0765823945403099, -0.07590360194444656, 0.009848971851170063, -0.06189199909567833, 0.10743670910596848, 0.07601522654294968, 0.07910463958978653, 0.04223288223147392, -0.026346800848841667, 0.0007225602166727185, -0.09750503301620483, 0.0035808170214295387, 0.043714337050914764, -0.003180169267579913, -0.14903883635997772, 0.0019710813648998737, 0.040459733456373215, 0.234984889626503, -0.08079122006893158, -0.0936269462108612, -0.10890420526266098, 0.03328125551342964, -0.07960604876279831, 0.021140528842806816, -0.07185311615467072, -0.014087754301726818, -0.03991960734128952, -0.04025135561823845, -0.06040807068347931, 0.02382160909473896, -0.05079526826739311, 0.04358687624335289, 0.027556972578167915, 0.034529298543930054, -0.06118505820631981, -0.01378533337265253, 0.05338575318455696, 0.012628603726625443, 0.12022063136100769, -0.04651149362325668, -0.029561040922999382, 0.04122837632894516, -0.038457319140434265, -0.011783028021454811, 0.06609847396612167, 0.05025552213191986, 0.09081439673900604, -0.05273120850324631, 0.031961459666490555, 0.012013791128993034, 0.06517326831817627, 0.0464291125535965, 0.09087818115949631, -0.050763703882694244, 0.02942904271185398, -0.1036306619644165, -0.03792828693985939, -0.04022815823554993, 0.05320394039154053, 0.006594947073608637, 0.087027408182621, 0.03711853548884392, -0.09018086642026901, 0.011284960433840752, -0.06269051879644394, -0.026244306936860085, -0.046913664788007736, -0.05608002096414566, 0.03237530589103699, -0.0408586822450161, 0.034131456166505814, 0.0260491780936718, 0.16351814568042755, 0.05217295140028, -0.0742906704545021, -0.030629035085439682, 0.055234551429748535, -0.06529539823532104, -0.03513449430465698, 0.10003054887056351, 0.03857268765568733, 0.0011121593415737152, -0.023336419835686684, 0.08997280895709991, -0.011681032367050648, 0.18477746844291687, 0.11761581152677536, -0.03919166699051857, 0.08494402468204498, -0.004446328151971102, 0.039002519100904465, 0.032880451530218124, -0.19993141293525696, -0.031408585608005524, -0.09568527340888977, 0.06560883671045303, -0.11045463383197784, -0.015769440680742264, 0.09918033331632614, -0.07830654829740524, 0.05839795619249344, -0.022104188799858093, -0.07590650022029877, -0.13576054573059082, -0.28522494435310364, -0.09633688628673553, -0.21276473999023438, -0.013677298091351986, -0.09677483141422272, -0.014191132970154285, 0.045306604355573654, 0.1447523981332779, -0.0537998229265213, 0.10528940707445145, -0.07828563451766968, 
-0.0465470552444458, 0.013942033052444458, -0.0693788006901741, 0.01530756801366806, -0.11247918009757996, 0.10299372673034668, 0.021867115050554276, 0.06958847492933273, 0.03941528499126434, 0.007531320210546255, -0.008880790323019028, 0.07489294558763504, -0.05549982190132141, -0.0077219558879733086, -0.056167084723711014, -0.0046966904774308205, 0.0024262897204607725, -0.0632491260766983, 0.010911795310676098, -0.09502344578504562, 0.035798266530036926, 0.1939917951822281, -0.006095637567341328, -0.16890351474285126, -0.13364356756210327, 0.21153223514556885, -0.04055650159716606, 0.025866571813821793, -0.011860951781272888, 0.05269269272685051, -0.1014634445309639, 0.3168959617614746, 0.21996153891086578, 0.016405198723077774, -0.036615848541259766, -0.030168935656547546, 0.013560544699430466, -0.016857918351888657, 0.11968052387237549, 0.13210782408714294, 0.22832252085208893, -0.09127349406480789, -0.10003028064966202, -0.1154525876045227, 0.019692547619342804, -0.0645524188876152, -0.0799933671951294, 0.11110592633485794, 0.02801056019961834, -0.024786556139588356, 0.09672873467206955, -0.16151739656925201, 0.07429492473602295, -0.0971176028251648, -0.1065644770860672, -0.13388743996620178, -0.0019763652235269547, -0.005935266148298979, -0.012302055954933167, 0.00639455346390605, -0.021169669926166534, 0.0315263569355011, 0.06438250094652176, 0.03985670581459999, -0.11681034415960312, -0.07059194892644882, 0.1535826027393341, 0.07701689004898071, 0.17079725861549377, -0.03066137060523033, 0.08853041380643845, 0.05636446177959442, -0.0451270155608654, -0.10225050896406174, 0.06319426745176315, -0.07456480711698532, 0.044671863317489624, 0.010336311534047127, -0.039220523089170456, -0.03300157189369202, 0.018935605883598328, -0.014547222293913364, -0.04611625894904137, -0.004426118917763233, 0.0337977260351181, 0.07134238630533218, -0.052244964987039566, 0.058764953166246414, -0.0909406766295433, 0.11579126864671707, 0.11994478851556778, -0.042334046214818954, 0.0308527909219265, -0.10021118819713593, 0.08059926331043243, -0.06117554381489754, -0.00019969175627920777, -0.01969888061285019, -0.13772547245025635, -0.023483922705054283, -0.09333466738462448, 0.06430023908615112, -0.12304063141345978, -0.01966126076877117, -0.06248724088072777, -0.016547031700611115, -0.03115096129477024, 0.06499872356653214, 0.05018163099884987, 0.04146508872509003, -0.06195324659347534, -0.0764598399400711, -0.08074408024549484, 0.02231081947684288, -0.11720699816942215, -0.10406016558408737 ]
null
null
transformers
# T5-EN-VI-BASE: Pretraining Text-To-Text Transfer Transformer for English-Vietnamese Translation

# Dataset
The *IWSLT'15 English-Vietnamese* data is used from the [Stanford NLP group](https://nlp.stanford.edu/projects/nmt/). For all experiments the corpus was split into training, development and test sets:

| Data set    | Sentences | Download |
| ----------- | --------- | -------- |
| Training    | 133,317   | via [GitHub](https://github.com/stefan-it/nmt-en-vi/raw/master/data/train-en-vi.tgz) or located in `data/train-en-vi.tgz` |
| Development | 1,553     | via [GitHub](https://github.com/stefan-it/nmt-en-vi/raw/master/data/dev-2012-en-vi.tgz) or located in `data/dev-2012-en-vi.tgz` |
| Test        | 1,268     | via [GitHub](https://github.com/stefan-it/nmt-en-vi/raw/master/data/test-2013-en-vi.tgz) or located in `data/test-2013-en-vi.tgz` |

## Results
The results on the test set:

| Model | BLEU (Beam Search) |
| ----- | ------------------ |
| [Luong & Manning (2015)](https://nlp.stanford.edu/pubs/luong-manning-iwslt15.pdf) | 23.30 |
| Sequence-to-sequence model with attention | 26.10 |
| Neural Phrase-based Machine Translation [Huang et al. (2017)](https://arxiv.org/abs/1706.05565) | 27.69 |
| Neural Phrase-based Machine Translation + LM [Huang et al. (2017)](https://arxiv.org/abs/1706.05565) | 28.07 |
| t5-en-vi-small (pretraining, without training data) | **28.46** (cased) / **29.23** (uncased) |
| t5-en-vi-small (fine-tuning with training data) | **32.38** (cased) / **33.19** (uncased) |
| t5-en-vi-base (pretraining, without training data) | **29.66** (cased) / **30.37** (uncased) |

#### Example Usage

```py
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

if torch.cuda.is_available():
    device = torch.device("cuda")
    print('There are %d GPU(s) available.' % torch.cuda.device_count())
    print('We will use the GPU:', torch.cuda.get_device_name(0))
else:
    print('No GPU available, using the CPU instead.')
    device = torch.device("cpu")

model = T5ForConditionalGeneration.from_pretrained("NlpHUST/t5-en-vi-small")
tokenizer = T5Tokenizer.from_pretrained("NlpHUST/t5-en-vi-small")
model.to(device)

src = "In school , we spent a lot of time studying the history of Kim Il-Sung , but we never learned much about the outside world , except that America , South Korea , Japan are the enemies ."
tokenized_text = tokenizer.encode(src, return_tensors="pt").to(device)
model.eval()
summary_ids = model.generate(
    tokenized_text,
    max_length=128,
    num_beams=5,
    repetition_penalty=2.5,
    length_penalty=1.0,
    early_stopping=True
)
output = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(output)
```

#### Output

```
Ở trường, chúng tôi dành nhiều thời gian để nghiên cứu về lịch sử Kim Il-Sung, nhưng chúng tôi chưa bao giờ học được nhiều về thế giới bên ngoài, ngoại trừ Mỹ, Hàn Quốc, Nhật Bản là kẻ thù.
```

### Contact information
For personal communication related to this project, please contact Nha Nguyen Van ([email protected]).
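The model metadata tags this repository as `text2text-generation`; the following is a compact sketch, assuming the base checkpoint works with the transformers `pipeline` API (with sentencepiece installed), using a source sentence shortened from the card's example:

```py
# Sketch (assumption): English-to-Vietnamese translation via the text2text-generation pipeline.
from transformers import pipeline

translator = pipeline("text2text-generation", model="NlpHUST/t5-en-vi-base")

# Shortened from the card's example sentence for brevity.
src = "In school, we spent a lot of time studying the history of Kim Il-Sung."
result = translator(src, max_length=128, num_beams=5, early_stopping=True)
print(result[0]["generated_text"])
```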
{}
text2text-generation
NlpHUST/t5-en-vi-base
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "arxiv:1706.05565", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[ "1706.05565" ]
[]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #arxiv-1706.05565 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
T5-EN-VI-BASE:Pretraining Text-To-Text Transfer Transformer for English Vietnamese Translation ============================================================================================== Dataset ======= The *IWSLT'15 English-Vietnamese* data is used from Stanford NLP group. For all experiments the corpus was split into training, development and test set: Data set: Training, Sentences: 133,317, Download: via GitHub or located in 'data/URL' Data set: Development, Sentences: 1,553, Download: via GitHub or located in 'data/URL' Data set: Test, Sentences: 1,268, Download: via GitHub or located in 'data/URL' Results ------- The results on test set. #### Example Using #### Output ### Contact information For personal communication related to this project, please contact Nha Nguyen Van (nha282@URL).
[ "#### Example Using", "#### Output", "### Contact information\n\n\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #arxiv-1706.05565 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "#### Example Using", "#### Output", "### Contact information\n\n\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ 59, 6, 4, 25 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #arxiv-1706.05565 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n#### Example Using#### Output### Contact information\n\n\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ -0.023408230394124985, 0.07068675011396408, -0.0033957071136683226, 0.02095269411802292, 0.11574886739253998, -0.05047590658068657, 0.12225483357906342, 0.0664532482624054, -0.07472224533557892, -0.013948457315564156, 0.19614312052726746, 0.1895805299282074, 0.034461408853530884, 0.09097513556480408, -0.06714332848787308, -0.23346930742263794, -0.014221815392374992, 0.10855530202388763, 0.013121544383466244, 0.11292808502912521, 0.09260193258523941, -0.0782371386885643, 0.15408526360988617, 0.02682683616876602, -0.14494821429252625, 0.02051866054534912, -0.0561843067407608, -0.10242078453302383, 0.13450641930103302, 0.0367245189845562, 0.09748905152082443, 0.06934603303670883, 0.014073913916945457, -0.08244580030441284, 0.00991995632648468, -0.01827252097427845, -0.05020861700177193, 0.0704522356390953, 0.036294832825660706, 0.0034427957143634558, 0.2776968777179718, -0.03648072108626366, -0.03928107023239136, 0.04101268947124481, -0.08459515869617462, -0.09397860616445541, 0.016449877992272377, 0.07787702232599258, 0.11345622688531876, 0.05917080491781235, 0.010777927003800869, 0.22561317682266235, -0.01846957951784134, 0.09547184407711029, 0.14047777652740479, -0.3262166380882263, -0.04196179658174515, 0.012236503884196281, 0.08285463601350784, 0.1034092828631401, -0.026991544291377068, 0.08064177632331848, 0.06139941141009331, 0.03655700758099556, 0.051427796483039856, -0.11286938190460205, -0.16397643089294434, 0.024534184485673904, -0.08699803799390793, 0.006934352684766054, 0.3141251802444458, 0.005938987247645855, 0.022702142596244812, 0.009489155374467373, -0.07037178426980972, -0.11045806854963303, -0.006651115138083696, -0.1235307976603508, -0.05696626007556915, -0.02092571184039116, -0.02710709534585476, -0.04758660867810249, -0.17328058183193207, -0.021881574764847755, -0.08529272675514221, -0.07727425545454025, -0.017363205552101135, 0.0461864173412323, -0.18077799677848816, 0.06866396963596344, 0.09404600411653519, -0.08936764299869537, 0.04894358664751053, -0.08115813881158829, 0.06900809705257416, -0.008967704139649868, 0.06068950891494751, -0.1548353135585785, 0.06299316883087158, -0.038436409085989, 0.029481912031769753, 0.038979899138212204, -0.07695738971233368, 0.05999933183193207, 0.04240000620484352, 0.07313072681427002, -0.05015838146209717, -0.02100927010178566, 0.04744594544172287, -0.08905893564224243, -0.0015179545152932405, -0.027364516630768776, -0.17909862101078033, -0.035362958908081055, 0.0747084841132164, 0.07289350777864456, -0.01946854777634144, 0.05093898996710777, -0.010669772513210773, -0.040092527866363525, 0.11531894654035568, -0.0032797788735479116, -0.05229298025369644, -0.04296945407986641, 0.012903254479169846, 0.14015066623687744, 0.04344819113612175, 0.04160354658961296, -0.10545600950717926, 0.036744069308042526, -0.06664153933525085, -0.03918229416012764, -0.03691722825169563, -0.03604208678007126, 0.03084850125014782, 0.012096731923520565, -0.0056779226288199425, -0.1644769012928009, -0.0940471887588501, 0.02608550898730755, 0.03562726080417633, -0.05424885451793671, -0.07878011465072632, -0.06237185373902321, -0.021934062242507935, 0.03912924602627754, -0.05657260864973068, 0.04144744947552681, -0.029811259359121323, 0.06636159867048264, -0.02016189508140087, 0.11479918658733368, -0.11927498131990433, 0.05343097448348999, -0.06538233160972595, 0.015434717759490013, 0.022710230201482773, 0.006976715289056301, -0.01076656486839056, 0.14148566126823425, -0.060776300728321075, -0.07983984798192978, -0.0573277585208416, -0.017400087788701057, 
0.010984598658978939, 0.16359993815422058, -0.048721637576818466, -0.046382706612348557, 0.14862459897994995, -0.04779147729277611, -0.18448978662490845, 0.18795762956142426, 0.039820849895477295, 0.10725953429937363, 0.08298851549625397, 0.070506252348423, 0.03324975073337555, -0.17644308507442474, 0.06403626501560211, 0.13407765328884125, -0.10365947335958481, -0.1489783525466919, 0.037549443542957306, 0.07464074343442917, -0.08721856772899628, 0.043844956904649734, 0.09497617185115814, 0.05534019321203232, -0.1093311533331871, -0.043976087123155594, 0.003562312573194504, -0.058502864092588425, 0.03696576878428459, 0.0322955921292305, 0.1422949880361557, -0.0028072334825992584, -0.012657531537115574, -0.01377484854310751, 0.017210854217410088, -0.010484575293958187, 0.02345460280776024, -0.04653764143586159, 0.02817305363714695, -0.000589092611335218, -0.0031702774576842785, -0.12854740023612976, 0.03124961629509926, 0.004417980555444956, 0.0978119745850563, 0.02009008824825287, 0.11062732338905334, 0.023020029067993164, 0.01695973426103592, 0.0028746025636792183, 0.030158786103129387, 0.14073893427848816, -0.011354818008840084, -0.07464126497507095, -0.03542882204055786, 0.07018924504518509, -0.029920514672994614, 0.00033735521719790995, -0.16956815123558044, -0.008233416825532913, 0.050409235060214996, 0.07284785807132721, 0.023539390414953232, 0.058275386691093445, -0.0004963524988852441, 0.0863507091999054, -0.07088388502597809, 0.018990730866789818, 0.10461892932653427, 0.001362024457193911, -0.09953758120536804, 0.16049028933048248, -0.10626008361577988, 0.22053749859333038, 0.09527458995580673, -0.25332504510879517, -0.023544959723949432, 0.0007047576364129782, -0.05615135654807091, -0.0022984622046351433, 0.020989924669265747, 0.005380786489695311, 0.06284943968057632, -0.007728463038802147, 0.19785891473293304, -0.0446990542113781, -0.030752619728446007, 0.02584444172680378, -0.07531356066465378, -0.0058810715563595295, 0.11987757682800293, 0.0845307931303978, -0.22756794095039368, 0.11413805186748505, 0.12061252444982529, -0.00731394998729229, 0.19776862859725952, 0.05458551272749901, -0.03538317605853081, -0.017298081889748573, 0.03421701490879059, -0.001867325627245009, -0.017665699124336243, -0.1749422401189804, -0.0590667761862278, 0.06449010223150253, -0.0006650462164543569, 0.05145254358649254, -0.08119900524616241, -0.030188634991645813, -0.02425743080675602, 0.027162758633494377, 0.02692030370235443, 0.12883765995502472, -0.020295405760407448, 0.12987670302391052, 0.021573727950453758, 0.0050021144561469555, 0.022649966180324554, 0.008160080760717392, -0.0788140520453453, 0.17485293745994568, -0.05018238723278046, -0.4428640604019165, -0.19566568732261658, -0.11513719707727432, -0.02096334472298622, 0.0039160228334367275, 0.08758505433797836, -0.1756654530763626, -0.044120270758867264, 0.023218918591737747, -0.01005576178431511, -0.10062888264656067, -0.013604178093373775, -0.005885764490813017, 0.05369340628385544, -0.07132531702518463, -0.07425972074270248, -0.05619064345955849, -0.04159979522228241, -0.049628276377916336, 0.10913390666246414, -0.12862932682037354, 0.0990062803030014, 0.10508115589618683, 0.029333394020795822, 0.050018660724163055, 0.008218699134886265, 0.14626722037792206, -0.05587178096175194, 0.02925807610154152, 0.19350434839725494, -0.03153684362769127, 0.10625746846199036, 0.0664152279496193, 0.023699074983596802, -0.0666135624051094, 0.06439460068941116, -0.020912569016218185, -0.0884091705083847, -0.21915698051452637, -0.12154827266931534, 
-0.11341829597949982, 0.12214909493923187, 0.00429312139749527, 0.028437258675694466, 0.18458783626556396, 0.05621163547039032, 0.06592012196779251, 0.10867928713560104, -0.07272180914878845, 0.12528502941131592, 0.1794336438179016, -0.01726415567100048, 0.07517621666193008, -0.08927690237760544, -0.1242934837937355, 0.0749942809343338, -0.011472699232399464, 0.10829660296440125, 0.06402347981929779, 0.11237968504428864, 0.042708463966846466, 0.15277045965194702, 0.12937524914741516, 0.07150036841630936, 0.0038062427192926407, -0.038552213460206985, -0.03592967242002487, -0.06513750553131104, -0.005898704752326012, 0.034025028347969055, -0.04553736746311188, -0.08414003998041153, 0.0015401176642626524, -0.012009863741695881, -0.020063690841197968, 0.10439835488796234, 0.08376028388738632, -0.2421543300151825, -0.025496603921055794, 0.032526999711990356, 0.0006734793423675001, -0.11882699280977249, 0.03863222151994705, 0.061905886977910995, -0.15664027631282806, 0.032182034105062485, -0.056660886853933334, 0.11791777610778809, -0.07461603730916977, 0.09085585176944733, -0.09424582123756409, -0.07155172526836395, 0.016951007768511772, 0.08626612275838852, -0.23352892696857452, 0.3171319365501404, -0.007383828517049551, -0.07433862239122391, -0.07474921643733978, -0.036108728498220444, -0.04476011544466019, 0.15992258489131927, 0.11945337057113647, 0.014624722301959991, 0.02015453577041626, -0.09844762086868286, -0.032462120056152344, 0.059582047164440155, 0.07925291359424591, 0.05063418298959732, -0.05029720440506935, 0.005994895007461309, -0.056381285190582275, -0.03793754428625107, -0.039678074419498444, -0.11720617860555649, -0.1080377921462059, 0.08334234356880188, 0.04702998325228691, 0.09339042752981186, 0.04982313513755798, -0.021847601979970932, 0.022840091958642006, 0.18178698420524597, 0.0004730799118988216, -0.050030842423439026, -0.07047363370656967, -0.039813462644815445, 0.046119336038827896, -0.09508081525564194, 0.029640423133969307, -0.07461044937372208, -0.024563180282711983, 0.007154578808695078, -0.13120603561401367, 0.06849528104066849, -0.07151806354522705, -0.0103726452216506, -0.013215562328696251, 0.13646964728832245, 0.05105287954211235, -0.005756995640695095, 0.011208768002688885, -0.0514376237988472, -0.11687251925468445, -0.10809198021888733, 0.010774761438369751, 0.02559158019721508, 0.13774754106998444, -0.1116982102394104, -0.06864961981773376, -0.049482837319374084, -0.0474432036280632, -0.08315856009721756, 0.2761571705341339, 0.1309206187725067, -0.04700597748160362, 0.1367570459842682, 0.18371134996414185, -0.0641193613409996, -0.22513683140277863, -0.1579296886920929, 0.0004352104151621461, 0.011308908462524414, -0.007801959291100502, -0.10717841982841492, 0.034203868359327316, 0.018270235508680344, -0.016011901199817657, 0.025082604959607124, -0.22766506671905518, -0.1133052185177803, 0.13866975903511047, 0.0014434404438361526, 0.2586384117603302, -0.11731735616922379, -0.04979950189590454, -0.09293881803750992, -0.28513801097869873, 0.19357915222644806, 0.0313437357544899, 0.027947774156928062, -0.03894218057394028, 0.1355622261762619, 0.022499730810523033, -0.03805616497993469, 0.09188760817050934, -0.04243568703532219, -0.018198495730757713, -0.0961422398686409, -0.1308072805404663, 0.014610160142183304, 0.0013886996312066913, 0.09667215496301651, -0.04023003950715065, 0.03059164434671402, -0.18339091539382935, -0.01891099475324154, -0.063365139067173, 0.03500267490744591, 0.01766437664628029, -0.07723115384578705, -0.02636558748781681, 
-0.04338263347744942, -0.01871444471180439, 0.007619089912623167, 0.14890576899051666, -0.05784677341580391, 0.14295704662799835, 0.2020089477300644, 0.18598335981369019, -0.11385372281074524, 0.14235256612300873, -0.07131169736385345, -0.05993398651480675, 0.07963637262582779, -0.1534385085105896, 0.029024416580796242, 0.11699521541595459, -0.024687353521585464, 0.0758601725101471, 0.03798453137278557, 0.012060931883752346, 0.03741266205906868, 0.10974108427762985, -0.17854684591293335, -0.04582129418849945, -0.09332199394702911, 0.049582500010728836, 0.09761066734790802, 0.12524206936359406, 0.09286615997552872, -0.052618060261011124, -0.05711628869175911, 0.011693608947098255, -0.01827559433877468, -0.06939448416233063, -0.0033974475227296352, 0.016951965168118477, 0.03414183482527733, -0.04927372559905052, -0.02318088710308075, 0.06550022214651108, -0.14290368556976318, -0.006344051565974951, 0.13994336128234863, -0.12953734397888184, -0.10023822635412216, 0.0594751350581646, 0.1203998401761055, -0.10888169705867767, -0.051941610872745514, -0.024757999926805496, -0.032478950917720795, 0.04544936120510101, 0.21111096441745758, 0.03267231211066246, 0.040812794119119644, -0.053801581263542175, 0.006370481103658676, -0.07296819239854813, 0.008083960972726345, 0.05644483491778374, 0.01208685152232647, -0.16184930503368378, 0.054912008345127106, 0.002589731477200985, 0.18569788336753845, -0.07848858833312988, -0.07121046632528305, -0.15336112678050995, 0.01365254633128643, -0.10767766833305359, -0.04516080021858215, -0.0925941988825798, -0.05178900808095932, -0.04515702277421951, -0.07987787574529648, -0.07584943622350693, -0.01309916377067566, -0.09613458812236786, 0.05741123482584953, -0.003625372890383005, 0.04660472646355629, -0.03268728032708168, -0.00935794785618782, 0.11468422412872314, 0.012630615383386612, 0.09662799537181854, 0.05270392447710037, -0.07938814163208008, 0.06771768629550934, -0.1496451199054718, 0.0258309468626976, 0.08118546009063721, 0.03913696110248566, 0.09186950325965881, 0.08695632219314575, -0.03409658372402191, 0.04118412733078003, 0.08774837851524353, 0.02225526049733162, 0.08130757510662079, -0.0896337628364563, 0.037266358733177185, -0.06883565336465836, -0.16963331401348114, -0.08449231088161469, 0.009943121112883091, -0.01007286086678505, 0.025665398687124252, 0.023736584931612015, -0.07959331572055817, 0.06431470811367035, -0.057621490210294724, 0.010850818827748299, 0.02891489677131176, -0.10980892926454544, -0.010319401510059834, -0.1615913212299347, 0.01802363432943821, -0.014888051897287369, 0.11551139503717422, 0.0028291677590459585, -0.0003710002056322992, 0.011769890785217285, 0.04835449904203415, -0.10106522589921951, -0.036233529448509216, 0.1218779981136322, 0.05078470706939697, -0.01250676903873682, -0.0889769122004509, 0.06614642590284348, -0.05166313797235489, 0.088371641933918, 0.11158007383346558, -0.018097760155797005, -0.034223299473524094, 0.005977503024041653, -0.0016225456492975354, 0.06532225012779236, -0.14594607055187225, -0.16047024726867676, -0.05895111337304115, 0.07339957356452942, -0.08397198468446732, 0.0836428701877594, 0.1754460483789444, 0.0005149524076841772, 0.009374774061143398, -0.02758149243891239, -0.0843711718916893, -0.15526893734931946, -0.16374795138835907, -0.0868540108203888, -0.1294088214635849, 0.0020835865288972855, -0.04830917716026306, 0.09073097258806229, -0.0032113261986523867, 0.08160428702831268, -0.050342023372650146, 0.1077846959233284, 0.08447831869125366, -0.09272945672273636, 0.06749210506677628, 
-0.016751449555158615, 0.032757747918367386, -0.13853155076503754, 0.12186941504478455, -0.07595264166593552, 0.03426149860024452, -0.013566801324486732, 0.01912110485136509, -0.05505656078457832, -0.01474613044410944, -0.10123258084058762, -0.07665591686964035, -0.036730166524648666, 0.06425336748361588, -0.004216256085783243, 0.04751845821738243, -0.012095608748495579, -0.031003151088953018, 0.010610687546432018, 0.20398123562335968, -0.06557485461235046, -0.05762935429811478, -0.07933878153562546, 0.24696563184261322, -0.05133785679936409, 0.039287641644477844, -0.022862425073981285, 0.06880102306604385, -0.06289147585630417, 0.35069864988327026, 0.23369984328746796, -0.00956777948886156, 0.004776140674948692, -0.013349993154406548, 0.045288946479558945, 0.002856326522305608, 0.1281060427427292, 0.14903254806995392, 0.348494291305542, -0.08118399232625961, -0.05410337075591087, -0.08346834033727646, 0.010457729920744896, -0.13193947076797485, 0.033751390874385834, 0.08044590055942535, -0.038456182926893234, 0.0018942216411232948, 0.1213209331035614, -0.19593320786952972, 0.0660720095038414, -0.1190790981054306, -0.13323131203651428, -0.08514793962240219, 0.04000118747353554, 0.0807938352227211, -0.006996529642492533, 0.05000530183315277, -0.049726251512765884, -0.04665020480751991, 0.12261346727609634, 0.04174806550145149, -0.13778957724571228, 0.019098125398159027, 0.12433037161827087, -0.11593063920736313, -0.020294979214668274, -0.024929091334342957, 0.0929790735244751, 0.09488634765148163, -0.018279869109392166, -0.02911723218858242, 0.031483206897974014, 0.004918338265269995, -0.02279411070048809, 0.035004664212465286, 0.028109386563301086, 0.028648216277360916, -0.019956421107053757, -0.024159062653779984, -0.11721546202898026, 0.02153988741338253, 0.028349071741104126, 0.10656914114952087, -0.024995334446430206, 0.07741041481494904, -0.05570793151855469, 0.08072680979967117, 0.06282715499401093, -0.021249176934361458, -0.00010318762360839173, -0.06339485943317413, 0.004285438451915979, -0.008805806748569012, -0.12927305698394775, -0.05411912128329277, -0.14554201066493988, -0.039306171238422394, 0.015532507561147213, 0.023847641423344612, -0.18580631911754608, 0.029608706012368202, -0.05687909945845604, 0.038041189312934875, -0.12862248718738556, 0.012346722185611725, 0.04764091223478317, -0.03494802117347717, 0.00003430946162552573, -0.03021465800702572, 0.016526497900485992, 0.049051299691200256, -0.09218073636293411, -0.06443444639444351 ]
null
null
transformers
# T5-EN-VI-SMALL: Pretraining Text-To-Text Transfer Transformer for English-Vietnamese Translation

# Dataset

The *IWSLT'15 English-Vietnamese* data is used from [Stanford NLP group](https://nlp.stanford.edu/projects/nmt/).
For all experiments the corpus was split into training, development and test sets:

| Data set    | Sentences | Download
| ----------- | --------- | ---------------------------------------------------------------------------------------------------------------------------------
| Training    | 133,317   | via [GitHub](https://github.com/stefan-it/nmt-en-vi/raw/master/data/train-en-vi.tgz) or located in `data/train-en-vi.tgz`
| Development | 1,553     | via [GitHub](https://github.com/stefan-it/nmt-en-vi/raw/master/data/dev-2012-en-vi.tgz) or located in `data/dev-2012-en-vi.tgz`
| Test        | 1,268     | via [GitHub](https://github.com/stefan-it/nmt-en-vi/raw/master/data/test-2013-en-vi.tgz) or located in `data/test-2013-en-vi.tgz`

## Results

The results on the test set:

| Model | BLEU (Beam Search)
| ----------------------------------------------------------------------------------------------------- | ------------------
| [Luong & Manning (2015)](https://nlp.stanford.edu/pubs/luong-manning-iwslt15.pdf) | 23.30
| Sequence-to-sequence model with attention | 26.10
| Neural Phrase-based Machine Translation [Huang et al. (2017)](https://arxiv.org/abs/1706.05565) | 27.69
| Neural Phrase-based Machine Translation + LM [Huang et al. (2017)](https://arxiv.org/abs/1706.05565) | 28.07
| t5-en-vi-small (pretraining, without training data) | **28.46** (cased) / **29.23** (uncased)
| t5-en-vi-small (fine-tuning with training data) | **32.38** (cased) / **33.19** (uncased)

#### Example Using

```python
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

if torch.cuda.is_available():
    device = torch.device("cuda")
    print('There are %d GPU(s) available.' % torch.cuda.device_count())
    print('We will use the GPU:', torch.cuda.get_device_name(0))
else:
    print('No GPU available, using the CPU instead.')
    device = torch.device("cpu")

model = T5ForConditionalGeneration.from_pretrained("NlpHUST/t5-en-vi-small")
tokenizer = T5Tokenizer.from_pretrained("NlpHUST/t5-en-vi-small")
model.to(device)

src = "In school , we spent a lot of time studying the history of Kim Il-Sung , but we never learned much about the outside world , except that America , South Korea , Japan are the enemies ."
tokenized_text = tokenizer.encode(src, return_tensors="pt").to(device)
model.eval()
summary_ids = model.generate(
    tokenized_text,
    max_length=128,
    num_beams=5,
    repetition_penalty=2.5,
    length_penalty=1.0,
    early_stopping=True
)
output = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(output)
```

#### Output

```
Ở trường, chúng tôi dành nhiều thời gian để nghiên cứu về lịch sử Kim Il-Sung, nhưng chúng tôi chưa bao giờ học được nhiều về thế giới bên ngoài, ngoại trừ Mỹ, Hàn Quốc, Nhật Bản là kẻ thù.
```

### Contact information

For personal communication related to this project, please contact Nha Nguyen Van ([email protected]).
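As a small addition (not part of the original card), the same checkpoint can translate several sentences per `generate()` call by letting the tokenizer pad a batch; the two English sentences below are placeholder inputs.

```python
# Minimal batched-inference sketch; the example sentences are placeholders.
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = T5ForConditionalGeneration.from_pretrained("NlpHUST/t5-en-vi-small").to(device)
tokenizer = T5Tokenizer.from_pretrained("NlpHUST/t5-en-vi-small")
model.eval()

sentences = [
    "The weather in Hanoi is very hot today .",
    "I would like to book a table for two people .",
]

# Pad the batch so both sequences fit in one tensor, then decode every hypothesis.
batch = tokenizer(sentences, return_tensors="pt", padding=True).to(device)
with torch.no_grad():
    ids = model.generate(**batch, max_length=128, num_beams=5, early_stopping=True)
for translation in tokenizer.batch_decode(ids, skip_special_tokens=True):
    print(translation)
```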
{}
text2text-generation
NlpHUST/t5-en-vi-small
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "arxiv:1706.05565", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[ "1706.05565" ]
[]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #arxiv-1706.05565 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
T5-EN-VI-SMALL:Pretraining Text-To-Text Transfer Transformer for English Vietnamese Translation =============================================================================================== Dataset ======= The *IWSLT'15 English-Vietnamese* data is used from Stanford NLP group. For all experiments the corpus was split into training, development and test set: Data set: Training, Sentences: 133,317, Download: via GitHub or located in 'data/URL' Data set: Development, Sentences: 1,553, Download: via GitHub or located in 'data/URL' Data set: Test, Sentences: 1,268, Download: via GitHub or located in 'data/URL' Results ------- The results on test set. #### Example Using #### Output ### Contact information For personal communication related to this project, please contact Nha Nguyen Van (nha282@URL).
[ "#### Example Using", "#### Output", "### Contact information\n\n\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #arxiv-1706.05565 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "#### Example Using", "#### Output", "### Contact information\n\n\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ 59, 6, 4, 25 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #arxiv-1706.05565 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n#### Example Using#### Output### Contact information\n\n\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ -0.023408230394124985, 0.07068675011396408, -0.0033957071136683226, 0.02095269411802292, 0.11574886739253998, -0.05047590658068657, 0.12225483357906342, 0.0664532482624054, -0.07472224533557892, -0.013948457315564156, 0.19614312052726746, 0.1895805299282074, 0.034461408853530884, 0.09097513556480408, -0.06714332848787308, -0.23346930742263794, -0.014221815392374992, 0.10855530202388763, 0.013121544383466244, 0.11292808502912521, 0.09260193258523941, -0.0782371386885643, 0.15408526360988617, 0.02682683616876602, -0.14494821429252625, 0.02051866054534912, -0.0561843067407608, -0.10242078453302383, 0.13450641930103302, 0.0367245189845562, 0.09748905152082443, 0.06934603303670883, 0.014073913916945457, -0.08244580030441284, 0.00991995632648468, -0.01827252097427845, -0.05020861700177193, 0.0704522356390953, 0.036294832825660706, 0.0034427957143634558, 0.2776968777179718, -0.03648072108626366, -0.03928107023239136, 0.04101268947124481, -0.08459515869617462, -0.09397860616445541, 0.016449877992272377, 0.07787702232599258, 0.11345622688531876, 0.05917080491781235, 0.010777927003800869, 0.22561317682266235, -0.01846957951784134, 0.09547184407711029, 0.14047777652740479, -0.3262166380882263, -0.04196179658174515, 0.012236503884196281, 0.08285463601350784, 0.1034092828631401, -0.026991544291377068, 0.08064177632331848, 0.06139941141009331, 0.03655700758099556, 0.051427796483039856, -0.11286938190460205, -0.16397643089294434, 0.024534184485673904, -0.08699803799390793, 0.006934352684766054, 0.3141251802444458, 0.005938987247645855, 0.022702142596244812, 0.009489155374467373, -0.07037178426980972, -0.11045806854963303, -0.006651115138083696, -0.1235307976603508, -0.05696626007556915, -0.02092571184039116, -0.02710709534585476, -0.04758660867810249, -0.17328058183193207, -0.021881574764847755, -0.08529272675514221, -0.07727425545454025, -0.017363205552101135, 0.0461864173412323, -0.18077799677848816, 0.06866396963596344, 0.09404600411653519, -0.08936764299869537, 0.04894358664751053, -0.08115813881158829, 0.06900809705257416, -0.008967704139649868, 0.06068950891494751, -0.1548353135585785, 0.06299316883087158, -0.038436409085989, 0.029481912031769753, 0.038979899138212204, -0.07695738971233368, 0.05999933183193207, 0.04240000620484352, 0.07313072681427002, -0.05015838146209717, -0.02100927010178566, 0.04744594544172287, -0.08905893564224243, -0.0015179545152932405, -0.027364516630768776, -0.17909862101078033, -0.035362958908081055, 0.0747084841132164, 0.07289350777864456, -0.01946854777634144, 0.05093898996710777, -0.010669772513210773, -0.040092527866363525, 0.11531894654035568, -0.0032797788735479116, -0.05229298025369644, -0.04296945407986641, 0.012903254479169846, 0.14015066623687744, 0.04344819113612175, 0.04160354658961296, -0.10545600950717926, 0.036744069308042526, -0.06664153933525085, -0.03918229416012764, -0.03691722825169563, -0.03604208678007126, 0.03084850125014782, 0.012096731923520565, -0.0056779226288199425, -0.1644769012928009, -0.0940471887588501, 0.02608550898730755, 0.03562726080417633, -0.05424885451793671, -0.07878011465072632, -0.06237185373902321, -0.021934062242507935, 0.03912924602627754, -0.05657260864973068, 0.04144744947552681, -0.029811259359121323, 0.06636159867048264, -0.02016189508140087, 0.11479918658733368, -0.11927498131990433, 0.05343097448348999, -0.06538233160972595, 0.015434717759490013, 0.022710230201482773, 0.006976715289056301, -0.01076656486839056, 0.14148566126823425, -0.060776300728321075, -0.07983984798192978, -0.0573277585208416, -0.017400087788701057, 
0.010984598658978939, 0.16359993815422058, -0.048721637576818466, -0.046382706612348557, 0.14862459897994995, -0.04779147729277611, -0.18448978662490845, 0.18795762956142426, 0.039820849895477295, 0.10725953429937363, 0.08298851549625397, 0.070506252348423, 0.03324975073337555, -0.17644308507442474, 0.06403626501560211, 0.13407765328884125, -0.10365947335958481, -0.1489783525466919, 0.037549443542957306, 0.07464074343442917, -0.08721856772899628, 0.043844956904649734, 0.09497617185115814, 0.05534019321203232, -0.1093311533331871, -0.043976087123155594, 0.003562312573194504, -0.058502864092588425, 0.03696576878428459, 0.0322955921292305, 0.1422949880361557, -0.0028072334825992584, -0.012657531537115574, -0.01377484854310751, 0.017210854217410088, -0.010484575293958187, 0.02345460280776024, -0.04653764143586159, 0.02817305363714695, -0.000589092611335218, -0.0031702774576842785, -0.12854740023612976, 0.03124961629509926, 0.004417980555444956, 0.0978119745850563, 0.02009008824825287, 0.11062732338905334, 0.023020029067993164, 0.01695973426103592, 0.0028746025636792183, 0.030158786103129387, 0.14073893427848816, -0.011354818008840084, -0.07464126497507095, -0.03542882204055786, 0.07018924504518509, -0.029920514672994614, 0.00033735521719790995, -0.16956815123558044, -0.008233416825532913, 0.050409235060214996, 0.07284785807132721, 0.023539390414953232, 0.058275386691093445, -0.0004963524988852441, 0.0863507091999054, -0.07088388502597809, 0.018990730866789818, 0.10461892932653427, 0.001362024457193911, -0.09953758120536804, 0.16049028933048248, -0.10626008361577988, 0.22053749859333038, 0.09527458995580673, -0.25332504510879517, -0.023544959723949432, 0.0007047576364129782, -0.05615135654807091, -0.0022984622046351433, 0.020989924669265747, 0.005380786489695311, 0.06284943968057632, -0.007728463038802147, 0.19785891473293304, -0.0446990542113781, -0.030752619728446007, 0.02584444172680378, -0.07531356066465378, -0.0058810715563595295, 0.11987757682800293, 0.0845307931303978, -0.22756794095039368, 0.11413805186748505, 0.12061252444982529, -0.00731394998729229, 0.19776862859725952, 0.05458551272749901, -0.03538317605853081, -0.017298081889748573, 0.03421701490879059, -0.001867325627245009, -0.017665699124336243, -0.1749422401189804, -0.0590667761862278, 0.06449010223150253, -0.0006650462164543569, 0.05145254358649254, -0.08119900524616241, -0.030188634991645813, -0.02425743080675602, 0.027162758633494377, 0.02692030370235443, 0.12883765995502472, -0.020295405760407448, 0.12987670302391052, 0.021573727950453758, 0.0050021144561469555, 0.022649966180324554, 0.008160080760717392, -0.0788140520453453, 0.17485293745994568, -0.05018238723278046, -0.4428640604019165, -0.19566568732261658, -0.11513719707727432, -0.02096334472298622, 0.0039160228334367275, 0.08758505433797836, -0.1756654530763626, -0.044120270758867264, 0.023218918591737747, -0.01005576178431511, -0.10062888264656067, -0.013604178093373775, -0.005885764490813017, 0.05369340628385544, -0.07132531702518463, -0.07425972074270248, -0.05619064345955849, -0.04159979522228241, -0.049628276377916336, 0.10913390666246414, -0.12862932682037354, 0.0990062803030014, 0.10508115589618683, 0.029333394020795822, 0.050018660724163055, 0.008218699134886265, 0.14626722037792206, -0.05587178096175194, 0.02925807610154152, 0.19350434839725494, -0.03153684362769127, 0.10625746846199036, 0.0664152279496193, 0.023699074983596802, -0.0666135624051094, 0.06439460068941116, -0.020912569016218185, -0.0884091705083847, -0.21915698051452637, -0.12154827266931534, 
-0.11341829597949982, 0.12214909493923187, 0.00429312139749527, 0.028437258675694466, 0.18458783626556396, 0.05621163547039032, 0.06592012196779251, 0.10867928713560104, -0.07272180914878845, 0.12528502941131592, 0.1794336438179016, -0.01726415567100048, 0.07517621666193008, -0.08927690237760544, -0.1242934837937355, 0.0749942809343338, -0.011472699232399464, 0.10829660296440125, 0.06402347981929779, 0.11237968504428864, 0.042708463966846466, 0.15277045965194702, 0.12937524914741516, 0.07150036841630936, 0.0038062427192926407, -0.038552213460206985, -0.03592967242002487, -0.06513750553131104, -0.005898704752326012, 0.034025028347969055, -0.04553736746311188, -0.08414003998041153, 0.0015401176642626524, -0.012009863741695881, -0.020063690841197968, 0.10439835488796234, 0.08376028388738632, -0.2421543300151825, -0.025496603921055794, 0.032526999711990356, 0.0006734793423675001, -0.11882699280977249, 0.03863222151994705, 0.061905886977910995, -0.15664027631282806, 0.032182034105062485, -0.056660886853933334, 0.11791777610778809, -0.07461603730916977, 0.09085585176944733, -0.09424582123756409, -0.07155172526836395, 0.016951007768511772, 0.08626612275838852, -0.23352892696857452, 0.3171319365501404, -0.007383828517049551, -0.07433862239122391, -0.07474921643733978, -0.036108728498220444, -0.04476011544466019, 0.15992258489131927, 0.11945337057113647, 0.014624722301959991, 0.02015453577041626, -0.09844762086868286, -0.032462120056152344, 0.059582047164440155, 0.07925291359424591, 0.05063418298959732, -0.05029720440506935, 0.005994895007461309, -0.056381285190582275, -0.03793754428625107, -0.039678074419498444, -0.11720617860555649, -0.1080377921462059, 0.08334234356880188, 0.04702998325228691, 0.09339042752981186, 0.04982313513755798, -0.021847601979970932, 0.022840091958642006, 0.18178698420524597, 0.0004730799118988216, -0.050030842423439026, -0.07047363370656967, -0.039813462644815445, 0.046119336038827896, -0.09508081525564194, 0.029640423133969307, -0.07461044937372208, -0.024563180282711983, 0.007154578808695078, -0.13120603561401367, 0.06849528104066849, -0.07151806354522705, -0.0103726452216506, -0.013215562328696251, 0.13646964728832245, 0.05105287954211235, -0.005756995640695095, 0.011208768002688885, -0.0514376237988472, -0.11687251925468445, -0.10809198021888733, 0.010774761438369751, 0.02559158019721508, 0.13774754106998444, -0.1116982102394104, -0.06864961981773376, -0.049482837319374084, -0.0474432036280632, -0.08315856009721756, 0.2761571705341339, 0.1309206187725067, -0.04700597748160362, 0.1367570459842682, 0.18371134996414185, -0.0641193613409996, -0.22513683140277863, -0.1579296886920929, 0.0004352104151621461, 0.011308908462524414, -0.007801959291100502, -0.10717841982841492, 0.034203868359327316, 0.018270235508680344, -0.016011901199817657, 0.025082604959607124, -0.22766506671905518, -0.1133052185177803, 0.13866975903511047, 0.0014434404438361526, 0.2586384117603302, -0.11731735616922379, -0.04979950189590454, -0.09293881803750992, -0.28513801097869873, 0.19357915222644806, 0.0313437357544899, 0.027947774156928062, -0.03894218057394028, 0.1355622261762619, 0.022499730810523033, -0.03805616497993469, 0.09188760817050934, -0.04243568703532219, -0.018198495730757713, -0.0961422398686409, -0.1308072805404663, 0.014610160142183304, 0.0013886996312066913, 0.09667215496301651, -0.04023003950715065, 0.03059164434671402, -0.18339091539382935, -0.01891099475324154, -0.063365139067173, 0.03500267490744591, 0.01766437664628029, -0.07723115384578705, -0.02636558748781681, 
-0.04338263347744942, -0.01871444471180439, 0.007619089912623167, 0.14890576899051666, -0.05784677341580391, 0.14295704662799835, 0.2020089477300644, 0.18598335981369019, -0.11385372281074524, 0.14235256612300873, -0.07131169736385345, -0.05993398651480675, 0.07963637262582779, -0.1534385085105896, 0.029024416580796242, 0.11699521541595459, -0.024687353521585464, 0.0758601725101471, 0.03798453137278557, 0.012060931883752346, 0.03741266205906868, 0.10974108427762985, -0.17854684591293335, -0.04582129418849945, -0.09332199394702911, 0.049582500010728836, 0.09761066734790802, 0.12524206936359406, 0.09286615997552872, -0.052618060261011124, -0.05711628869175911, 0.011693608947098255, -0.01827559433877468, -0.06939448416233063, -0.0033974475227296352, 0.016951965168118477, 0.03414183482527733, -0.04927372559905052, -0.02318088710308075, 0.06550022214651108, -0.14290368556976318, -0.006344051565974951, 0.13994336128234863, -0.12953734397888184, -0.10023822635412216, 0.0594751350581646, 0.1203998401761055, -0.10888169705867767, -0.051941610872745514, -0.024757999926805496, -0.032478950917720795, 0.04544936120510101, 0.21111096441745758, 0.03267231211066246, 0.040812794119119644, -0.053801581263542175, 0.006370481103658676, -0.07296819239854813, 0.008083960972726345, 0.05644483491778374, 0.01208685152232647, -0.16184930503368378, 0.054912008345127106, 0.002589731477200985, 0.18569788336753845, -0.07848858833312988, -0.07121046632528305, -0.15336112678050995, 0.01365254633128643, -0.10767766833305359, -0.04516080021858215, -0.0925941988825798, -0.05178900808095932, -0.04515702277421951, -0.07987787574529648, -0.07584943622350693, -0.01309916377067566, -0.09613458812236786, 0.05741123482584953, -0.003625372890383005, 0.04660472646355629, -0.03268728032708168, -0.00935794785618782, 0.11468422412872314, 0.012630615383386612, 0.09662799537181854, 0.05270392447710037, -0.07938814163208008, 0.06771768629550934, -0.1496451199054718, 0.0258309468626976, 0.08118546009063721, 0.03913696110248566, 0.09186950325965881, 0.08695632219314575, -0.03409658372402191, 0.04118412733078003, 0.08774837851524353, 0.02225526049733162, 0.08130757510662079, -0.0896337628364563, 0.037266358733177185, -0.06883565336465836, -0.16963331401348114, -0.08449231088161469, 0.009943121112883091, -0.01007286086678505, 0.025665398687124252, 0.023736584931612015, -0.07959331572055817, 0.06431470811367035, -0.057621490210294724, 0.010850818827748299, 0.02891489677131176, -0.10980892926454544, -0.010319401510059834, -0.1615913212299347, 0.01802363432943821, -0.014888051897287369, 0.11551139503717422, 0.0028291677590459585, -0.0003710002056322992, 0.011769890785217285, 0.04835449904203415, -0.10106522589921951, -0.036233529448509216, 0.1218779981136322, 0.05078470706939697, -0.01250676903873682, -0.0889769122004509, 0.06614642590284348, -0.05166313797235489, 0.088371641933918, 0.11158007383346558, -0.018097760155797005, -0.034223299473524094, 0.005977503024041653, -0.0016225456492975354, 0.06532225012779236, -0.14594607055187225, -0.16047024726867676, -0.05895111337304115, 0.07339957356452942, -0.08397198468446732, 0.0836428701877594, 0.1754460483789444, 0.0005149524076841772, 0.009374774061143398, -0.02758149243891239, -0.0843711718916893, -0.15526893734931946, -0.16374795138835907, -0.0868540108203888, -0.1294088214635849, 0.0020835865288972855, -0.04830917716026306, 0.09073097258806229, -0.0032113261986523867, 0.08160428702831268, -0.050342023372650146, 0.1077846959233284, 0.08447831869125366, -0.09272945672273636, 0.06749210506677628, 
-0.016751449555158615, 0.032757747918367386, -0.13853155076503754, 0.12186941504478455, -0.07595264166593552, 0.03426149860024452, -0.013566801324486732, 0.01912110485136509, -0.05505656078457832, -0.01474613044410944, -0.10123258084058762, -0.07665591686964035, -0.036730166524648666, 0.06425336748361588, -0.004216256085783243, 0.04751845821738243, -0.012095608748495579, -0.031003151088953018, 0.010610687546432018, 0.20398123562335968, -0.06557485461235046, -0.05762935429811478, -0.07933878153562546, 0.24696563184261322, -0.05133785679936409, 0.039287641644477844, -0.022862425073981285, 0.06880102306604385, -0.06289147585630417, 0.35069864988327026, 0.23369984328746796, -0.00956777948886156, 0.004776140674948692, -0.013349993154406548, 0.045288946479558945, 0.002856326522305608, 0.1281060427427292, 0.14903254806995392, 0.348494291305542, -0.08118399232625961, -0.05410337075591087, -0.08346834033727646, 0.010457729920744896, -0.13193947076797485, 0.033751390874385834, 0.08044590055942535, -0.038456182926893234, 0.0018942216411232948, 0.1213209331035614, -0.19593320786952972, 0.0660720095038414, -0.1190790981054306, -0.13323131203651428, -0.08514793962240219, 0.04000118747353554, 0.0807938352227211, -0.006996529642492533, 0.05000530183315277, -0.049726251512765884, -0.04665020480751991, 0.12261346727609634, 0.04174806550145149, -0.13778957724571228, 0.019098125398159027, 0.12433037161827087, -0.11593063920736313, -0.020294979214668274, -0.024929091334342957, 0.0929790735244751, 0.09488634765148163, -0.018279869109392166, -0.02911723218858242, 0.031483206897974014, 0.004918338265269995, -0.02279411070048809, 0.035004664212465286, 0.028109386563301086, 0.028648216277360916, -0.019956421107053757, -0.024159062653779984, -0.11721546202898026, 0.02153988741338253, 0.028349071741104126, 0.10656914114952087, -0.024995334446430206, 0.07741041481494904, -0.05570793151855469, 0.08072680979967117, 0.06282715499401093, -0.021249176934361458, -0.00010318762360839173, -0.06339485943317413, 0.004285438451915979, -0.008805806748569012, -0.12927305698394775, -0.05411912128329277, -0.14554201066493988, -0.039306171238422394, 0.015532507561147213, 0.023847641423344612, -0.18580631911754608, 0.029608706012368202, -0.05687909945845604, 0.038041189312934875, -0.12862248718738556, 0.012346722185611725, 0.04764091223478317, -0.03494802117347717, 0.00003430946162552573, -0.03021465800702572, 0.016526497900485992, 0.049051299691200256, -0.09218073636293411, -0.06443444639444351 ]
null
null
transformers
# T5-SMALL-SUMMARIZATION: Pretraining Text-To-Text Transfer Transformer for Vietnamese Text Summarization

#### Example Using

```python
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

if torch.cuda.is_available():
    device = torch.device("cuda")
    print('There are %d GPU(s) available.' % torch.cuda.device_count())
    print('We will use the GPU:', torch.cuda.get_device_name(0))
else:
    print('No GPU available, using the CPU instead.')
    device = torch.device("cpu")

model = T5ForConditionalGeneration.from_pretrained("NlpHUST/t5-small-vi-summarization")
tokenizer = T5Tokenizer.from_pretrained("NlpHUST/t5-small-vi-summarization")
model.to(device)

src = "Theo BHXH Việt Nam, nhiều doanh nghiệp vẫn chỉ đóng BHXH cho người lao động theo mức lương. \
Dù quy định từ 1/1/2018, tiền lương tháng đóng BHXH gồm mức lương và thêm khoản bổ sung khác. \
BHXH Việt Nam vừa có báo cáo về tình hình thực hiện chính sách BHXH thời gian qua. \
Theo đó, tình trạng nợ, trốn đóng BHXH, BHTN vẫn xảy ra ở hầu hết các tỉnh, thành. \
Thống kê tới ngày 31/12/2020, tổng số nợ BHXH, BHYT, BHTN là hơn 13.500 tỷ đồng, \
chiếm 3,35 % số phải thu, trong đó: Số nợ BHXH bắt buộc là hơn 8.600 tỷ đồng, \
nợ BHTN là 335 tỷ đồng. Liên quan tới tiền lương đóng BHXH, báo cáo của \
BHXH Việt Nam cho thấy: Nhiều doanh nghiệp vẫn chủ yếu xây dựng thang, \
bảng lương để đóng BHXH bằng mức thấp nhất. Tức là bằng mức lương tối \
thiểu vùng, cộng thêm 7 % đối với lao động đã qua đào tạo nghề và cộng \
thêm 5 % hoặc 7 % đối với lao động làm nghề hoặc công việc nặng nhọc, \
độc hại, nguy hiểm, đặc biệt nặng nhọc độc hại và nguy hiểm. Đối với \
lao động giữ chức vụ, khoảng 80 % doanh nghiệp đã xây dựng thang, \
bảng lương cụ thể theo chức danh. Đơn cử như với chức vụ giám đốc \
sản xuất, giám đốc điều hành, trưởng phòng. Còn lại các doanh nghiệp \
xây dựng đối với lao động giữ chức vụ theo thang lương, bảng lương \
chuyên môn nghiệp vụ và bảng phụ cấp chức vụ, phụ cấp trách nhiệm. \
Thống kê của BHXH Việt Nam cũng cho thấy, đa số doanh nghiệp đã đăng \
ký đóng BHXH cho người lao động theo mức lương mà không có khoản bổ \
sung khác. Mặc dù quy định từ ngày 1/1/2018, tiền lương tháng đóng BHXH \
gồm mức lương và thêm khoản bổ sung khác."

tokenized_text = tokenizer.encode(src, return_tensors="pt").to(device)
model.eval()
summary_ids = model.generate(
    tokenized_text,
    max_length=256,
    num_beams=5,
    repetition_penalty=2.5,
    length_penalty=1.0,
    early_stopping=True
)
output = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(output)
```

#### Output

```
Nhiều doanh nghiệp vẫn chủ yếu xây dựng thang, bảng lương để đóng BHXH bằng mức thấp nhất.
Dù quy định từ 1/1/2018, tiền lương tháng đóng BHXH gồm mức lương và thêm khoản bổ sung khác.
Thống kê của BHXH Việt Nam cho thấy, nhiều doanh nghiệp vẫn chỉ đóng BHXH
cho người lao động theo mức lương mà không có khoản bổ sung khác.
```

### Contact information

For personal communication related to this project, please contact Nha Nguyen Van ([email protected]).
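For quick experiments, the same checkpoint can also be driven through the high-level `pipeline` API instead of calling `generate()` by hand. This is an editorial sketch rather than part of the original card; the `article` string is a placeholder for your own Vietnamese text.

```python
# Hedged sketch using the text2text-generation pipeline; `article` is a placeholder.
from transformers import pipeline

summarizer = pipeline(
    "text2text-generation",
    model="NlpHUST/t5-small-vi-summarization",
    tokenizer="NlpHUST/t5-small-vi-summarization",
)

article = "..."  # paste a long Vietnamese news article here
result = summarizer(article, max_length=256, num_beams=5,
                    repetition_penalty=2.5, early_stopping=True)
print(result[0]["generated_text"])
```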
{}
text2text-generation
NlpHUST/t5-small-vi-summarization
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
# T5-SMALL-SUMMARIZATION :Pretraining Text-To-Text Transfer Transformer for Vietnamese Text Summarization #### Example Using #### Output ### Contact information For personal communication related to this project, please contact Nha Nguyen Van (nha282@URL).
[ "# T5-SMALL-SUMMARIZATION :Pretraining Text-To-Text Transfer Transformer for Vietnamese Text Summarization", "#### Example Using", "#### Output", "### Contact information\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n", "# T5-SMALL-SUMMARIZATION :Pretraining Text-To-Text Transfer Transformer for Vietnamese Text Summarization", "#### Example Using", "#### Output", "### Contact information\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ 55, 28, 6, 4, 25 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# T5-SMALL-SUMMARIZATION :Pretraining Text-To-Text Transfer Transformer for Vietnamese Text Summarization#### Example Using#### Output### Contact information\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ 0.02544875256717205, -0.019709670916199684, -0.0030074650421738625, -0.0062948111444711685, 0.11887108534574509, -0.07628954946994781, 0.09180842339992523, 0.06566150486469269, -0.15152643620967865, -0.052806317806243896, 0.10270989686250687, 0.1032433807849884, 0.02807651087641716, 0.082453653216362, -0.041551534086465836, -0.298403799533844, 0.006411374546587467, 0.12605871260166168, -0.0750458613038063, 0.13485710322856903, 0.07206134498119354, -0.05003572255373001, 0.1537282019853592, 0.010748541913926601, -0.07074136286973953, 0.0501444973051548, -0.07209499180316925, -0.11388259381055832, 0.09781528264284134, 0.020068086683750153, 0.05530849099159241, 0.07202399522066116, 0.015217510983347893, -0.08250290155410767, 0.014832143671810627, 0.02162410132586956, -0.05801888555288315, 0.03371181711554527, -0.04301270842552185, 0.020552467554807663, 0.34572896361351013, -0.13228437304496765, 0.0019882533233612776, 0.03292228654026985, -0.06380613148212433, -0.05647002533078194, 0.029471762478351593, 0.038076676428318024, 0.12307138741016388, 0.09055477380752563, -0.026030942797660828, 0.25280603766441345, -0.04369807988405228, 0.1023760735988617, 0.10237947851419449, -0.32139530777931213, -0.04071790352463722, -0.006617109756916761, 0.15955691039562225, 0.15261058509349823, -0.03759227320551872, 0.09540574997663498, 0.057730697095394135, 0.03885814920067787, 0.06994479894638062, -0.11107207834720612, -0.12088999897241592, 0.026055818423628807, -0.12942279875278473, 0.02981874905526638, 0.31423741579055786, 0.010553739964962006, 0.023148858919739723, 0.022994723170995712, -0.08930140733718872, -0.13265644013881683, -0.026770496740937233, -0.1396951973438263, -0.04184107854962349, -0.02052837237715721, -0.05694734305143356, -0.07777269184589386, -0.15590988099575043, -0.01402871310710907, -0.12367638200521469, -0.10525651276111603, 0.008092950098216534, -0.006832209415733814, -0.16413582861423492, 0.05726997181773186, 0.11703282594680786, -0.06373640149831772, 0.06429225206375122, -0.04987948387861252, -0.004555319435894489, -0.000561405555345118, 0.055649254471063614, -0.1987503618001938, 0.017420118674635887, -0.08331577479839325, -0.016826290637254715, 0.043077509850263596, -0.11571789532899857, 0.03422199934720993, 0.004368902184069157, 0.08201562613248825, -0.10798440128564835, 0.0083199767395854, -0.0019459971226751804, -0.09126510471105576, 0.004324935842305422, -0.012262385338544846, -0.16188256442546844, -0.014348285272717476, 0.07493803650140762, 0.04581134021282196, -0.04546649381518364, 0.09413902461528778, 0.03357579559087753, -0.07518626004457474, 0.1574081927537918, -0.011276674456894398, -0.04348510876297951, -0.05064331740140915, 0.023993046954274178, 0.14406754076480865, 0.10781414806842804, 0.03523282706737518, -0.11579857766628265, 0.008279233239591122, -0.030514990910887718, -0.02925271913409233, -0.011168225668370724, -0.08073785156011581, 0.0165896974503994, 0.03433676064014435, -0.06691788882017136, -0.14602677524089813, -0.08401467651128769, 0.04816628247499466, 0.0007988623110577464, -0.037536222487688065, -0.052698779851198196, -0.0805395320057869, -0.060597896575927734, 0.02938614785671234, -0.02241278626024723, 0.07701980322599411, -0.01525954995304346, 0.06946786493062973, -0.04128504544496536, 0.14942285418510437, -0.12963099777698517, 0.06188809499144554, -0.055301349610090256, 0.002000855514779687, -0.06389754265546799, 0.06196342781186104, -0.0021547088399529457, 0.11514481902122498, -0.04126351699233055, -0.09839535504579544, -0.093457892537117, 
0.003555579576641321, -0.023730775341391563, 0.09957531094551086, -0.14065952599048615, -0.06214231625199318, 0.14052441716194153, -0.0754418894648552, -0.13567683100700378, 0.19862249493598938, 0.04445991292595863, 0.11675729602575302, 0.08012313395738602, 0.102378249168396, 0.06903757899999619, -0.14905014634132385, 0.06406670808792114, 0.12452729791402817, -0.15052855014801025, -0.11695654690265656, 0.04869344085454941, 0.13749252259731293, -0.060047853738069534, 0.002858062507584691, 0.11711669713258743, 0.06855381280183792, -0.0988537073135376, -0.03639883175492287, 0.012694727629423141, -0.05336827412247658, 0.09256063401699066, 0.019302576780319214, 0.13516075909137726, -0.0058732242323458195, -0.05917087942361832, 0.138004869222641, 0.019603075459599495, -0.02722313441336155, 0.07410945743322372, -0.0897546038031578, -0.05860305204987526, 0.053308431059122086, 0.013384728692471981, -0.10162636637687683, 0.09539153426885605, 0.0010341282468289137, 0.1512364000082016, 0.050201524049043655, 0.10895243287086487, 0.014325587078928947, 0.011838279664516449, -0.044430363923311234, 0.059555262327194214, 0.14735355973243713, 0.024463046342134476, -0.054058048874139786, -0.03477134928107262, 0.10761244595050812, -0.012616096064448357, 0.00734308548271656, -0.12715943157672882, 0.014835367910563946, 0.12493619322776794, 0.08636317402124405, 0.0004406433436088264, 0.09486706554889679, 0.01573038473725319, 0.1048758327960968, -0.053480364382267, -0.0008445249404758215, 0.07124445587396622, 0.004781750030815601, -0.11963790655136108, 0.21836620569229126, -0.1296418309211731, 0.1350550651550293, 0.12868621945381165, -0.18593627214431763, -0.024729138240218163, 0.06532986462116241, -0.03950456902384758, -0.024366924539208412, 0.029911864548921585, -0.010205648839473724, 0.06851772964000702, -0.0027075365651398897, 0.22944167256355286, -0.045141685754060745, -0.04061247408390045, 0.015142843127250671, -0.03515518829226494, 0.002726897830143571, 0.1276504099369049, 0.01688368432223797, -0.25306302309036255, 0.0996192917227745, 0.09546519815921783, 0.012872632592916489, 0.21739643812179565, 0.0196671225130558, 0.01397621352225542, -0.04604367911815643, 0.05399647355079651, 0.008000914007425308, -0.03774547204375267, -0.19410888850688934, -0.049342066049575806, 0.02780849114060402, 0.028466809540987015, 0.06728419661521912, -0.05714692920446396, -0.010067562572658062, -0.007620565593242645, 0.0017637761775404215, 0.035196371376514435, 0.13175301253795624, -0.001175779034383595, 0.09967172890901566, 0.0018107600044459105, 0.056405868381261826, -0.0045857843942940235, 0.009894097223877907, -0.0662296935915947, 0.14845305681228638, -0.08458701521158218, -0.4946208894252777, -0.15238377451896667, -0.05581450089812279, 0.0033822704572230577, -0.021439451724290848, 0.11485201120376587, -0.20889200270175934, -0.015863321721553802, -0.032015420496463776, -0.010021212510764599, -0.06728743761777878, 0.007491917349398136, -0.04782307147979736, 0.03671266883611679, -0.05725116655230522, -0.07448659837245941, -0.033664826303720474, -0.045623987913131714, -0.07799306511878967, 0.09997682273387909, -0.18311433494091034, 0.04666299745440483, 0.14129501581192017, -0.028819195926189423, 0.03947686776518822, -0.020935142412781715, 0.04685155302286148, -0.06865806132555008, 0.07750653475522995, 0.14139941334724426, 0.0416259802877903, 0.08583927154541016, 0.04004601389169693, -0.035218413919210434, -0.024917591363191605, 0.12047113478183746, 0.028760213404893875, -0.08062434196472168, -0.19802823662757874, 
-0.14916886389255524, -0.10943592339754105, 0.1178920790553093, -0.044579945504665375, 0.03503446280956268, 0.1505173295736313, 0.0008824702817946672, 0.04231955111026764, 0.15232662856578827, -0.02713363617658615, 0.09043345600366592, 0.2213897854089737, -0.05685276538133621, 0.0858030691742897, -0.11465977877378464, -0.11994569003582001, 0.10103646665811539, -0.059787821024656296, 0.0842282623052597, 0.028665078803896904, 0.10414993762969971, 0.06398158520460129, 0.0995851531624794, 0.14359620213508606, 0.04902512580156326, -0.021966150030493736, -0.006914027500897646, -0.04113544151186943, -0.07556018978357315, 0.06612861156463623, 0.002572587225586176, -0.049201324582099915, -0.06700939685106277, 0.00659923953935504, 0.06829635053873062, 0.020171180367469788, 0.1032748892903328, 0.06346097588539124, -0.1819019466638565, 0.004221874754875898, 0.0171318668872118, -0.0027775310445576906, -0.07883120328187943, 0.05300344154238701, 0.13980591297149658, -0.1140456348657608, 0.08531869202852249, -0.027788210660219193, 0.10456410050392151, -0.048359621316194534, 0.10021054744720459, -0.036535006016492844, -0.07000257074832916, -0.011839386075735092, 0.0678095743060112, -0.22564388811588287, 0.2538924217224121, -0.026120716705918312, -0.09186732023954391, -0.03152162954211235, -0.06534488499164581, -0.030234338715672493, 0.1717856228351593, 0.05785501375794411, 0.029825255274772644, 0.04045422375202179, -0.0460188165307045, -0.05079128220677376, 0.03398348391056061, 0.0525842010974884, 0.11996079981327057, -0.06767775863409042, -0.01265040971338749, -0.07320640236139297, -0.007503460627049208, 0.0011754221050068736, -0.12655384838581085, -0.09141087532043457, 0.04483266547322273, 0.07146423310041428, 0.07983075082302094, 0.03840745612978935, -0.04085654765367508, 0.04353882372379303, 0.1839199811220169, 0.024951836094260216, -0.020709095522761345, -0.04311620071530342, -0.044770367443561554, 0.05014461278915405, -0.07183714956045151, 0.009592347778379917, -0.03857605531811714, -0.014977019280195236, -0.014795118011534214, -0.10911940783262253, 0.08845410495996475, -0.01303547527641058, 0.009441692382097244, 0.017331240698695183, 0.14737926423549652, 0.0584908165037632, 0.007148632779717445, -0.0024853136856108904, -0.08285357058048248, -0.05145835503935814, -0.10529313236474991, -0.0683450773358345, 0.008263212628662586, 0.06815891712903976, -0.045823872089385986, -0.10201165825128555, -0.10748700797557831, -0.040382519364356995, -0.08162546902894974, 0.20961013436317444, 0.03712762892246246, -0.04392411559820175, 0.1451617181301117, 0.12356255203485489, -0.06308367848396301, -0.21903932094573975, -0.13471835851669312, -0.013321058824658394, 0.013027235865592957, -0.00020553430658765137, -0.05594763532280922, -0.004605437163263559, -0.008568432182073593, 0.016829729080200195, -0.09307670593261719, -0.22299732267856598, -0.142009437084198, 0.11221425235271454, -0.031190166249871254, 0.1856822818517685, -0.10925429314374924, -0.03633766993880272, -0.09566576033830643, -0.19409634172916412, 0.20744971930980682, -0.06959280371665955, 0.045068755745887756, -0.012865043245255947, 0.08450659364461899, 0.005957045126706362, 0.0017665241612121463, 0.06763319671154022, -0.022168008610606194, -0.019243283197283745, -0.10569071024656296, -0.11701435595750809, 0.12954585254192352, 0.011845611967146397, 0.09924677014350891, -0.09028162807226181, 0.024980565533041954, -0.15305912494659424, -0.01591723971068859, -0.057518962770700455, -0.0073426468297839165, 0.01587197184562683, -0.05779552832245827, 
-0.05044018104672432, -0.0022878271993249655, 0.006760527845472097, -0.004010800737887621, 0.08485084772109985, -0.0449509397149086, 0.06706772744655609, 0.2161264717578888, 0.1710413098335266, -0.09380613267421722, 0.19721104204654694, -0.0566224567592144, -0.033146440982818604, 0.06722836941480637, -0.1716289222240448, 0.03297073394060135, 0.09970922023057938, -0.03135679289698601, 0.10699500888586044, 0.025903183966875076, 0.013258334249258041, 0.038569312542676926, 0.10311242938041687, -0.10731837153434753, -0.12747032940387726, -0.11973083764314651, 0.03277900069952011, 0.11632180213928223, 0.11067020893096924, 0.08385202288627625, -0.078559510409832, -0.02357722818851471, 0.03716062381863594, -0.024629585444927216, -0.08979488164186478, 0.008714263327419758, -0.050372980535030365, 0.06715066730976105, -0.04290777072310448, -0.0111679183319211, 0.112549789249897, -0.16070771217346191, 0.015230237506330013, 0.19963470101356506, -0.13958096504211426, -0.0922299474477768, 0.06919945776462555, 0.11407338082790375, -0.03176702558994293, -0.0417502224445343, -0.01069454476237297, -0.07425172626972198, 0.051505234092473984, 0.25987523794174194, 0.03174401447176933, 0.052124060690402985, -0.07068578898906708, -0.0024102036841213703, -0.07207479327917099, 0.034335434436798096, 0.11139529943466187, -0.00497445510700345, -0.1426296830177307, 0.08291154354810715, 0.053523220121860504, 0.18565009534358978, -0.09036771953105927, -0.09535388648509979, -0.1331961452960968, 0.01455269567668438, -0.1656758189201355, -0.032747626304626465, -0.06717176735401154, -0.06552350521087646, -0.03854718059301376, -0.07918436825275421, -0.07768028229475021, -0.012247019447386265, -0.03863232210278511, 0.04347939416766167, -0.002123309066519141, 0.05181513726711273, -0.04849463701248169, -0.022621935233473778, 0.09022006392478943, 0.00723838759586215, 0.06946313381195068, 0.009530828334391117, -0.0934550017118454, 0.057573091238737106, -0.0639040619134903, -0.01771712303161621, 0.09319791942834854, 0.041345708072185516, 0.0770503357052803, 0.07731249183416367, -0.025256292894482613, 0.048142921179533005, 0.13743191957473755, 0.02789636142551899, 0.09499498456716537, -0.06632643193006516, 0.0714409351348877, -0.13482584059238434, -0.10138456523418427, -0.07407746464014053, 0.030963363125920296, -0.019186049699783325, 0.038069356232881546, 0.018132418394088745, -0.11795719712972641, 0.024265147745609283, -0.07799936085939407, -0.005917014554142952, -0.0011788050178438425, -0.10363506525754929, 0.04482811316847801, -0.13150857388973236, 0.019045114517211914, 0.002871996955946088, 0.07006990164518356, 0.018367119133472443, 0.00459406990557909, 0.04049118235707283, 0.017485681921243668, -0.1285957396030426, -0.019764412194490433, 0.14957916736602783, 0.05438940227031708, -0.02937011420726776, -0.07693701982498169, 0.04362289980053902, -0.0321337953209877, 0.13821348547935486, 0.054692141711711884, 0.03554418310523033, -0.07502779364585876, 0.030011113733053207, -0.0010613200720399618, 0.11280468106269836, -0.15576831996440887, -0.05517027899622917, -0.0955962985754013, 0.0830506905913353, -0.11336931586265564, 0.0011522721033543348, 0.1362103521823883, -0.0350898914039135, 0.03176786005496979, 0.011701470240950584, -0.09785814583301544, -0.11964358389377594, -0.16274435818195343, -0.08085550367832184, -0.16910912096500397, -0.028882889077067375, -0.033760201185941696, 0.0664813444018364, -0.023925703018903732, 0.1314925104379654, -0.017116058617830276, 0.11231312155723572, 0.009273139759898186, -0.08313505351543427, 
0.09272456169128418, -0.03478092700242996, 0.02228950709104538, -0.12141327559947968, 0.13276775181293488, -0.0048127006739377975, 0.0383906364440918, -0.023389937356114388, 0.04412810131907463, -0.04412632808089256, 0.036465954035520554, -0.08266885578632355, -0.06022772565484047, -0.030899664387106895, 0.03045634925365448, 0.017207127064466476, 0.03741070628166199, 0.03533194586634636, -0.08582150936126709, 0.01979834772646427, 0.20556649565696716, -0.06630431115627289, -0.1702825129032135, -0.07750233262777328, 0.23404984176158905, -0.019570689648389816, 0.03232063353061676, -0.01780848018825054, 0.04952584207057953, -0.1335066556930542, 0.30883070826530457, 0.23835670948028564, 0.01503764372318983, 0.0036890849005430937, -0.0203199815005064, 0.035631414502859116, 0.033097896724939346, 0.08500246703624725, 0.1623489260673523, 0.2827654480934143, -0.05657828599214554, -0.09753958135843277, -0.08852029591798782, -0.038976315408945084, -0.09262318909168243, -0.0013881566701456904, 0.0901704654097557, -0.02418399788439274, 0.032060280442237854, 0.09845825284719467, -0.1955171525478363, 0.008899549022316933, -0.12827618420124054, -0.1399722695350647, -0.05712302401661873, 0.02092619054019451, 0.10517434030771255, 0.020337510854005814, 0.0018058198038488626, -0.020720582455396652, -0.025660978630185127, 0.08791018277406693, 0.05234621465206146, -0.12920349836349487, 0.04296749830245972, 0.11399126797914505, -0.16727985441684723, -0.0887143462896347, -0.05827336385846138, 0.03937165439128876, 0.07757795602083206, 0.0015402385033667088, 0.02981833927333355, 0.07207509875297546, -0.058404527604579926, 0.006777584552764893, 0.03835742175579071, -0.00010861838381970301, 0.009113834239542484, 0.03989947587251663, -0.04027838259935379, -0.1446274071931839, 0.017849812284111977, -0.029827574267983437, 0.08382527530193329, -0.03254013881087303, 0.0667971670627594, -0.03896302729845047, 0.07883045077323914, 0.09152107685804367, -0.011339736171066761, 0.06942389160394669, -0.07949505001306534, -0.04343917965888977, -0.061037346720695496, -0.13232338428497314, -0.01958024874329567, -0.13139359652996063, -0.05080370232462883, -0.026113396510481834, 0.047309260815382004, -0.2572498023509979, 0.019811559468507767, -0.07073362171649933, 0.03487538918852806, -0.09800168871879578, 0.0569206103682518, 0.06004447489976883, 0.0028779509011656046, -0.018060052767395973, -0.05054597929120064, 0.008414654061198235, 0.08039256185293198, -0.10986841470003128, -0.03381868079304695 ]
null
null
transformers
---
language:
- vi
tags:
- t5
- seq2seq
---

# Machine translation for Vietnamese

## Model Description

T5-vi-en-base is a Transformer model for Vietnamese-to-English machine translation, designed using the T5 architecture.

## Training data

T5-vi-en-base was trained on 4M sentence pairs (English, Vietnamese).

### How to use

```py
from transformers import T5ForConditionalGeneration, T5Tokenizer
import torch

if torch.cuda.is_available():
    device = torch.device("cuda")
    print('There are %d GPU(s) available.' % torch.cuda.device_count())
    print('We will use the GPU:', torch.cuda.get_device_name(0))
else:
    print('No GPU available, using the CPU instead.')
    device = torch.device("cpu")

model = T5ForConditionalGeneration.from_pretrained("NlpHUST/t5-vi-en-base")
tokenizer = T5Tokenizer.from_pretrained("NlpHUST/t5-vi-en-base")
model.to(device)

src = "Theo lãnh đạo Sở Y tế, 3 người này không có triệu chứng sốt, ho, khó thở, đã được lấy mẫu xét nghiệm và cách ly tập trung."
tokenized_text = tokenizer.encode(src, return_tensors="pt").to(device)

model.eval()
summary_ids = model.generate(
    tokenized_text,
    max_length=256,
    num_beams=5,
    repetition_penalty=2.5,
    length_penalty=1.0,
    early_stopping=True
)
output = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(output)
# According to the head of the Department of Health, the three people had no symptoms
# of fever, cough, shortness of breath, were taken samples for testing and concentrated quarantine.
```
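The snippet above translates one sentence at a time. As a complement, here is a minimal, illustrative sketch of batching several Vietnamese inputs through the same checkpoint; the helper name `translate_batch`, the decoding settings, and the example sentences are assumptions for illustration and are not part of the original card.

```py
# Illustrative batch-translation sketch (not from the original card).
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
tokenizer = T5Tokenizer.from_pretrained("NlpHUST/t5-vi-en-base")
model = T5ForConditionalGeneration.from_pretrained("NlpHUST/t5-vi-en-base").to(device)
model.eval()

def translate_batch(sentences, max_length=256, num_beams=5):
    # Tokenize all inputs together; padding lets them share one tensor.
    batch = tokenizer(sentences, return_tensors="pt", padding=True, truncation=True).to(device)
    with torch.no_grad():
        generated = model.generate(
            **batch,
            max_length=max_length,
            num_beams=num_beams,
            early_stopping=True,
        )
    # Decode the beam-search outputs back to plain text.
    return tokenizer.batch_decode(generated, skip_special_tokens=True)

print(translate_batch([
    "Hôm nay trời đẹp.",
    "Tôi là sinh viên trường Bách Khoa Hà Nội.",
]))
```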
{}
text2text-generation
NlpHUST/t5-vi-en-base
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
--- language: - vi tags: - t5 - seq2seq # Machine translation for Vietnamese ## Model Description T5-vi-en-base is a Transformer model for Vietnamese-to-English machine translation, designed using the T5 architecture. ## Training data T5-vi-en-base was trained on 4M sentence pairs (English, Vietnamese) ### How to use
[ "# Machine translation for vietnamese", "## Model Description\nT5-vi-en-base is a transformer model for vietnamese machine translation designed using T5 architecture.", "## Training data\nT5-vi-en-base was trained on 4M sentence pairs (english,vietnamese)", "### How to use" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Machine translation for vietnamese", "## Model Description\nT5-vi-en-base is a transformer model for vietnamese machine translation designed using T5 architecture.", "## Training data\nT5-vi-en-base was trained on 4M sentence pairs (english,vietnamese)", "### How to use" ]
[ 51, 6, 27, 27, 5 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Machine translation for vietnamese## Model Description\nT5-vi-en-base is a transformer model for vietnamese machine translation designed using T5 architecture.## Training data\nT5-vi-en-base was trained on 4M sentence pairs (english,vietnamese)### How to use" ]
[ 0.0020906359422951937, -0.01239816378802061, -0.0016462268540635705, 0.029247639700770378, 0.13560867309570312, -0.005602451041340828, 0.0638425350189209, 0.10930366814136505, -0.1454210728406906, -0.07975590974092484, 0.08869897574186325, 0.11548222601413727, 0.028006426990032196, 0.12684746086597443, 0.007732506841421127, -0.35290005803108215, 0.0527070127427578, 0.10271134227514267, -0.042616404592990875, 0.1335485428571701, 0.12214111536741257, -0.020059673115611076, 0.1607549786567688, 0.005355284083634615, -0.16263623535633087, 0.08548403531312943, -0.008877900429069996, -0.1033535897731781, 0.13096390664577484, 0.05677085742354393, 0.032454464584589005, 0.02173900045454502, 0.05739639326930046, -0.031858041882514954, 0.011549417860805988, 0.029015742242336273, -0.09528306126594543, 0.03658723086118698, -0.03167177364230156, 0.02314165234565735, 0.28992998600006104, -0.10284481197595596, 0.03846487030386925, 0.025853591039776802, -0.08082409203052521, -0.08765911310911179, 0.08353009819984436, 0.07039542496204376, 0.1242244690656662, 0.09529309719800949, -0.028037091717123985, 0.17089374363422394, -0.12511597573757172, 0.119138702750206, 0.053414586931467056, -0.3470900058746338, -0.04905429854989052, 0.09691634029150009, 0.10661886632442474, 0.09677040576934814, -0.04161245748400688, 0.039975762367248535, 0.05176396667957306, 0.05355973541736603, 0.03264220058917999, -0.10895782709121704, -0.04927389696240425, 0.036961887031793594, -0.12004532665014267, 0.07780542969703674, 0.2990697920322418, 0.015874627977609634, 0.01869312860071659, 0.015492411330342293, -0.09694080054759979, -0.03304046392440796, -0.029876679182052612, -0.14979562163352966, -0.010081144981086254, 0.047478120774030685, 0.04725828021764755, -0.10342211276292801, -0.1033216118812561, -0.06726738065481186, -0.14045308530330658, -0.0877644270658493, 0.034151799976825714, -0.021990470588207245, -0.1480671763420105, 0.06019686535000801, -0.10829952359199524, -0.014439230784773827, 0.024413762614130974, -0.07453299313783646, -0.05589529126882553, -0.021908406168222427, 0.0062635233625769615, -0.11550506949424744, 0.029321394860744476, -0.0631232038140297, 0.03784476965665817, 0.04497019946575165, -0.08004192262887955, 0.0192569550126791, -0.02331911399960518, 0.16727960109710693, -0.17925243079662323, -0.03763572871685028, 0.007569404318928719, -0.06643777340650558, -0.11115793883800507, 0.02867690846323967, -0.18999584019184113, -0.05637076124548912, 0.10680250823497772, -0.005926685873419046, -0.07070179283618927, 0.18795305490493774, 0.023258598521351814, -0.09263211488723755, 0.012740708887577057, -0.07336597889661789, -0.02975444868206978, -0.027897050604224205, -0.021043280139565468, 0.16441282629966736, 0.09831588715314865, 0.03119966760277748, -0.1659563183784485, -0.10351736098527908, -0.0409066304564476, -0.03600820526480675, -0.052385102957487106, -0.1188327744603157, -0.015230252407491207, 0.002758408896625042, 0.0020576512906700373, -0.14572633802890778, -0.1245008185505867, 0.03385816141963005, 0.06864466518163681, -0.0543476901948452, -0.047285743057727814, -0.14049646258354187, -0.03502307087182999, 0.01748644933104515, 0.0077739013358950615, 0.03595671430230141, 0.0140537079423666, 0.04369957372546196, -0.038911446928977966, 0.11151212453842163, -0.13978925347328186, 0.07105736434459686, -0.09230164438486099, -0.03328403830528259, -0.16285456717014313, 0.09486426413059235, 0.04897183179855347, 0.06233912706375122, -0.09829635918140411, -0.08960355073213577, -0.024417875334620476, 0.03848665952682495, 
-0.053638845682144165, 0.10530698299407959, -0.19318132102489471, -0.03450877591967583, 0.1351214498281479, -0.06024543568491936, -0.12253910303115845, 0.1464305967092514, 0.016720836982131004, 0.21264691650867462, 0.10209129005670547, 0.09099684655666351, 0.024743888527154922, -0.07134722173213959, 0.1897556483745575, 0.09068907797336578, -0.13980481028556824, -0.001422543078660965, 0.05906975641846657, 0.08031526207923889, -0.18887726962566376, 0.013125612400472164, -0.06452704966068268, 0.03771582618355751, -0.055153097957372665, -0.05359775200486183, 0.018513787537813187, -0.05984925478696823, 0.1594725102186203, -0.002126171486452222, 0.1428292691707611, -0.02418516017496586, -0.06447187811136246, 0.14183251559734344, 0.04975641518831253, -0.04375552386045456, 0.02866975963115692, -0.15031103789806366, -0.04993080720305443, 0.015613089315593243, 0.07694243639707565, -0.10129699110984802, 0.05802224203944206, -0.006208518519997597, 0.19721852242946625, 0.08315473794937134, 0.12617845833301544, 0.03933650255203247, 0.015241174958646297, -0.052719589322805405, -0.007671255152672529, 0.07448925822973251, 0.0728563666343689, -0.015096706338226795, -0.15792429447174072, 0.06311187148094177, 0.003539013909175992, -0.010636375285685062, -0.13557645678520203, -0.00640168646350503, 0.0664789080619812, 0.07225026190280914, 0.00722206337377429, 0.11538732796907425, -0.01861494965851307, 0.05336138978600502, -0.028282405808568, 0.0045373206958174706, 0.09253662079572678, -0.020387595519423485, -0.08545620739459991, 0.2246183454990387, -0.09071209281682968, 0.1422475129365921, 0.16874265670776367, -0.1957465261220932, -0.07112451642751694, 0.0513446182012558, 0.0038018629420548677, -0.04260561987757683, 0.07161527127027512, -0.05711902305483818, 0.09365758299827576, -0.06391841173171997, 0.19281619787216187, -0.0856713056564331, 0.006214067805558443, 0.022099344059824944, -0.03844933956861496, 0.003239640034735203, 0.09005972743034363, 0.04807010665535927, -0.34864315390586853, 0.10080163925886154, 0.13101933896541595, -0.015823829919099808, 0.2373356819152832, 0.03679320216178894, 0.00431863684207201, -0.044060494750738144, 0.057207606732845306, -0.008728821761906147, -0.008622486144304276, -0.3160514831542969, -0.04145819693803787, 0.013627446256577969, 0.05006004124879837, 0.0431034192442894, -0.08054325729608536, -0.008387978188693523, -0.010289985686540604, -0.01682012900710106, 0.03488723933696747, 0.08708103746175766, -0.00041137452353723347, 0.10574354976415634, 0.04554033651947975, -0.0464688204228878, 0.05716861039400101, 0.014174913987517357, -0.14679184556007385, 0.18026012182235718, -0.1504075527191162, -0.3610794246196747, -0.06321142613887787, -0.11998485773801804, 0.004254547879099846, -0.009243247099220753, 0.07044357061386108, -0.18697069585323334, -0.03552903234958649, -0.004408255685120821, 0.05759628117084503, -0.049216922372579575, 0.01974642463028431, -0.02846689149737358, 0.06611620634794235, -0.01704064942896366, -0.04994642734527588, -0.02813117392361164, -0.03109961375594139, -0.10807367414236069, 0.0507471077144146, -0.2587454319000244, -0.007328956387937069, 0.17171534895896912, -0.05764540284872055, 0.06802576035261154, -0.05355166643857956, 0.1746029555797577, -0.04418599233031273, 0.10975009948015213, 0.12532763183116913, 0.028990034013986588, 0.014414191246032715, 0.11453768610954285, -0.018864864483475685, -0.03706611692905426, 0.11504636704921722, 0.017128221690654755, -0.05583604797720909, -0.1865408569574356, -0.08247334510087967, -0.09793423116207123, 
0.048902757465839386, 0.05505608767271042, 0.04148978367447853, 0.15820527076721191, 0.03482815995812416, 0.0668463408946991, 0.2330324351787567, 0.06939397007226944, 0.04947841167449951, 0.1688978374004364, -0.05901385098695755, 0.045264340937137604, -0.04948580637574196, -0.08814625442028046, 0.0776490643620491, 0.015713099390268326, 0.04844963550567627, 0.06099110469222069, 0.04238832741975784, 0.06701400130987167, 0.14317405223846436, 0.08610199391841888, 0.12337078899145126, -0.1084260419011116, -0.029873019084334373, -0.03452145680785179, -0.08408211171627045, 0.06432564556598663, 0.0780319795012474, -0.014068391174077988, -0.020080910995602608, -0.05536339059472084, 0.0880696177482605, 0.06523328274488449, 0.10562147200107574, 0.05456065386533737, -0.21683616936206818, -0.004190865438431501, -0.012806222774088383, -0.022502947598695755, -0.0641842633485794, 0.09772378206253052, 0.08131884783506393, -0.16316881775856018, 0.05198078975081444, -0.007765337359160185, 0.09121712297201157, 0.028868064284324646, 0.08246003836393356, 0.0570792481303215, -0.019462313503026962, 0.03598133474588394, 0.0942704901099205, -0.3755738437175751, 0.18786685168743134, -0.01896810345351696, -0.03745020553469658, -0.06887359917163849, -0.05122257396578789, -0.0007716431864537299, 0.2367834746837616, 0.1259107142686844, 0.035429105162620544, 0.048483435064554214, -0.05268530920147896, -0.026653459295630455, 0.01692778617143631, 0.031317442655563354, 0.02906561642885208, -0.012259067967534065, -0.06012531369924545, -0.06281662732362747, 0.01568162813782692, 0.10282818228006363, -0.13672395050525665, -0.0653180181980133, 0.006702284328639507, 0.008986026979982853, 0.13884800672531128, 0.0071094282902777195, -0.06783532351255417, 0.011578955687582493, 0.2204013168811798, -0.0336119681596756, -0.02444623038172722, -0.10403794795274734, -0.006579630076885223, 0.027822155505418777, -0.07171139121055603, 0.047333814203739166, -0.04988634213805199, -0.01922474429011345, 0.053236331790685654, -0.15705187618732452, 0.18657225370407104, -0.058278217911720276, -0.009881637059152126, -0.0184742771089077, 0.07147359102964401, 0.009704947471618652, 0.02489674836397171, 0.06771524250507355, -0.09647712111473083, 0.003348700702190399, -0.07864466309547424, -0.0870518833398819, 0.04414389282464981, -0.02962295524775982, -0.015037219040095806, -0.17211933434009552, -0.16150043904781342, -0.03788101673126221, -0.06881920248270035, 0.2182183414697647, -0.0723494291305542, -0.03655797615647316, 0.08209645003080368, 0.16352525353431702, -0.07253675162792206, -0.22748881578445435, -0.07865220308303833, -0.00006647482223343104, 0.05165470018982887, -0.054135583341121674, -0.10463858395814896, 0.02269718050956726, -0.06262315064668655, 0.04250463843345642, -0.20097699761390686, -0.23185209929943085, -0.14891405403614044, 0.15329967439174652, 0.0691930279135704, 0.24515429139137268, -0.04466281831264496, -0.03178149089217186, -0.004628969822078943, -0.15545925498008728, 0.1908164918422699, -0.1361917108297348, 0.046242937445640564, 0.039036914706230164, 0.027778737246990204, 0.007602152414619923, 0.04550665244460106, 0.0035270031075924635, 0.014007274061441422, -0.039171140640974045, -0.10143344849348068, -0.14703664183616638, 0.06909030675888062, 0.033215608447790146, 0.14064833521842957, -0.0043187858536839485, 0.03607029840350151, -0.11010770499706268, -0.03559894859790802, -0.08444748818874359, -0.04794532433152199, -0.008831564337015152, -0.10146410018205643, 0.03178846091032028, 0.06283795088529587, 0.02024082839488983, 
-0.029135923832654953, 0.04713260382413864, -0.04896416887640953, 0.028295015916228294, 0.14900672435760498, 0.18969696760177612, -0.08143943548202515, 0.1558205932378769, -0.0201286468654871, -0.020234795287251472, 0.11432400345802307, -0.15183444321155548, 0.06256794184446335, 0.1016889438033104, -0.03880157321691513, 0.1672249734401703, 0.04268442839384079, -0.0373414009809494, -0.006322704721242189, 0.10199706256389618, -0.12085561454296112, -0.1332922726869583, -0.13942836225032806, -0.049723681062459946, 0.12804743647575378, 0.07385416328907013, 0.1014469563961029, -0.10688669979572296, -0.03390595316886902, 0.001532090245746076, -0.022785186767578125, -0.1382257491350174, 0.06502017378807068, -0.001212611678056419, 0.041949111968278885, -0.0800635889172554, 0.021966535598039627, 0.0934799537062645, -0.13373680412769318, 0.07011689245700836, 0.24960298836231232, -0.16264991462230682, -0.08869173377752304, 0.12720887362957, 0.18785274028778076, -0.05865466222167015, -0.07668199390172958, -0.04449567198753357, -0.13822074234485626, 0.07801882922649384, 0.20554938912391663, 0.06945442408323288, 0.06863025575876236, -0.07510486245155334, -0.05531959608197212, -0.02947874739766121, 0.05241299048066139, 0.059358127415180206, -0.026666440069675446, -0.12093811482191086, 0.020550930872559547, 0.048995357006788254, 0.1853983998298645, -0.10101402550935745, -0.06785573065280914, -0.1575610488653183, 0.019052891060709953, -0.17597271502017975, -0.017922790721058846, -0.058887023478746414, -0.02332201786339283, 0.0008213448454625905, -0.09213226288557053, -0.037490103393793106, 0.02483028918504715, -0.05799395591020584, 0.03521512448787689, 0.004544640425592661, 0.05385246500372887, -0.011713829822838306, -0.00627961615100503, 0.05027676001191139, 0.0046081263571977615, 0.1088406965136528, -0.0030537866987288, -0.10398602485656738, 0.12803231179714203, -0.08373136818408966, -0.022017771378159523, 0.04936610907316208, 0.03667968139052391, 0.1216416209936142, -0.02085752598941326, 0.016810674220323563, 0.06299588084220886, 0.11672132462263107, 0.017383627593517303, 0.1234826073050499, -0.0632585659623146, -0.018625333905220032, -0.11275424063205719, -0.10462600737810135, -0.07693503797054291, 0.0699491798877716, 0.00905787292867899, -0.018794892355799675, 0.0562894232571125, -0.10134273767471313, 0.010476975701749325, -0.07125744223594666, -0.0006395286764018238, -0.02300664409995079, -0.06090333312749863, -0.044655073434114456, -0.10400371998548508, 0.003038522321730852, -0.03158317133784294, 0.021407322958111763, 0.044400282204151154, 0.04960377886891365, 0.03200916573405266, -0.022444196045398712, -0.062296777963638306, 0.03302035108208656, 0.17531003057956696, -0.010965343564748764, -0.005316728726029396, -0.14103388786315918, 0.05082898586988449, -0.03396456688642502, 0.13153517246246338, 0.03432414308190346, 0.006999365985393524, 0.02897671051323414, 0.09819476306438446, 0.05718507990241051, 0.1221185252070427, -0.15491993725299835, 0.05975114554166794, -0.07194957882165909, 0.04911702498793602, -0.09423717856407166, 0.03196993097662926, 0.20944058895111084, -0.07442844659090042, 0.04357621818780899, -0.009213107638061047, -0.1091143935918808, -0.12633097171783447, -0.26566416025161743, -0.10039548575878143, -0.2257087081670761, -0.03857622295618057, -0.09304997324943542, 0.025109805166721344, -0.09153188765048981, 0.16112011671066284, -0.06753551959991455, 0.1454879194498062, 0.01107731182128191, -0.103550486266613, 0.0912775918841362, -0.01827903836965561, 0.04661891236901283, 
-0.02587108314037323, 0.07132227718830109, -0.024142490699887276, 0.001731785130687058, -0.02244236133992672, 0.007015647832304239, -0.06475856900215149, 0.01700856350362301, -0.031014494597911835, -0.03176489099860191, -0.0258790235966444, -0.03668855503201485, -0.03273811191320419, 0.04216369241476059, 0.035241562873125076, -0.10969915241003036, 0.022076299414038658, 0.1847081035375595, -0.04389917477965355, -0.194809690117836, -0.19723102450370789, 0.2452753484249115, 0.016373397782444954, 0.08240097761154175, 0.010866512544453144, 0.021795276552438736, -0.08640782535076141, 0.3677677512168884, 0.2217591553926468, -0.03585878759622574, -0.01596887595951557, 0.030994100496172905, 0.026051396504044533, 0.0009370727930217981, 0.13480232656002045, 0.14034777879714966, 0.16737255454063416, -0.04035032168030739, -0.016103360801935196, -0.07043873518705368, -0.013756434433162212, -0.01939631626009941, -0.009467442519962788, 0.1536729484796524, -0.032797425985336304, 0.007006716914474964, 0.10075171291828156, -0.13795188069343567, 0.04775647073984146, -0.08473044633865356, -0.08397034555673599, -0.08295314013957977, -0.05830701068043709, 0.10483170300722122, 0.06662031263113022, 0.013481862843036652, -0.014925871975719929, -0.007744963280856609, 0.08080428093671799, 0.055120572447776794, -0.1372203826904297, -0.038637351244688034, 0.11778545379638672, -0.08679276704788208, 0.024571120738983154, -0.04263531044125557, 0.015681257471442223, 0.07414627075195312, 0.042019058018922806, -0.013835076242685318, 0.09566886723041534, -0.056314509361982346, 0.07109294086694717, 0.04698482155799866, 0.011975778266787529, 0.012090790085494518, 0.09585385769605637, -0.01655920036137104, -0.12752197682857513, 0.032739777117967606, -0.03285720944404602, -0.015198295935988426, -0.0422627218067646, 0.10747354477643967, -0.07683981209993362, 0.07476921379566193, 0.17102497816085815, -0.02108834870159626, 0.06266749650239944, -0.12167249619960785, 0.0447830855846405, -0.048248160630464554, -0.093160480260849, -0.04929322376847267, -0.1213388666510582, -0.023011695593595505, -0.048022083938121796, 0.021776540204882622, -0.17559970915317535, -0.0538957305252552, -0.09919653832912445, 0.0317460298538208, -0.17145362496376038, 0.11413247883319855, 0.15332256257534027, 0.05285578966140747, -0.010513057932257652, -0.13305604457855225, -0.030357303097844124, 0.09161216765642166, -0.14955197274684906, -0.10289251804351807 ]
null
null
transformers
---
language:
- vi
tags:
- t5
- seq2seq
---

# Machine translation for Vietnamese

## Model Description

T5-vi-en-small is a Transformer model for Vietnamese-to-English machine translation, designed using the T5 architecture.

## Training data

T5-vi-en-small was trained on 4M sentence pairs (English, Vietnamese).

### How to use

```py
from transformers import T5ForConditionalGeneration, T5Tokenizer
import torch

if torch.cuda.is_available():
    device = torch.device("cuda")
    print('There are %d GPU(s) available.' % torch.cuda.device_count())
    print('We will use the GPU:', torch.cuda.get_device_name(0))
else:
    print('No GPU available, using the CPU instead.')
    device = torch.device("cpu")

model = T5ForConditionalGeneration.from_pretrained("NlpHUST/t5-vi-en-small")
tokenizer = T5Tokenizer.from_pretrained("NlpHUST/t5-vi-en-small")
model.to(device)

src = "Indonesia phỏng đoán nguyên nhân tàu ngầm chở 53 người mất tích bí ẩn"
tokenized_text = tokenizer.encode(src, return_tensors="pt").to(device)

model.eval()
summary_ids = model.generate(
    tokenized_text,
    max_length=256,
    num_beams=5,
    repetition_penalty=2.5,
    length_penalty=1.0,
    early_stopping=True
)
output = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(output)
# Indonesia anticipates the cause of the submarine transporting 53 mysterious missing persons
```
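The manual `generate()` call above can also, under standard transformers conventions, be wrapped in the generic text2text-generation pipeline. This is an illustrative sketch rather than the card's prescribed usage; the `device` argument shown is an assumption (use a GPU index such as 0 if one is available).

```py
# Illustrative sketch (not from the original card): the same checkpoint driven
# through the generic text2text-generation pipeline.
from transformers import pipeline

translator = pipeline(
    "text2text-generation",
    model="NlpHUST/t5-vi-en-small",
    device=-1,  # assumption: run on CPU; set to a GPU index to use CUDA
)

result = translator(
    "Indonesia phỏng đoán nguyên nhân tàu ngầm chở 53 người mất tích bí ẩn",
    max_length=256,
    num_beams=5,
)
print(result[0]["generated_text"])
```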
{}
text2text-generation
NlpHUST/t5-vi-en-small
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
--- language: - vi tags: - t5 - seq2seq # Machine translation for Vietnamese ## Model Description T5-vi-en-small is a Transformer model for Vietnamese-to-English machine translation, designed using the T5 architecture. ## Training data T5-vi-en-small was trained on 4M sentence pairs (English, Vietnamese) ### How to use
[ "# Machine translation for vietnamese", "## Model Description\nT5-vi-en-small is a transformer model for vietnamese machine translation designed using T5 architecture.", "## Training data\nT5-vi-en-small was trained on 4M sentence pairs (english,vietnamese)", "### How to use" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Machine translation for vietnamese", "## Model Description\nT5-vi-en-small is a transformer model for vietnamese machine translation designed using T5 architecture.", "## Training data\nT5-vi-en-small was trained on 4M sentence pairs (english,vietnamese)", "### How to use" ]
[ 51, 6, 28, 28, 5 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Machine translation for vietnamese## Model Description\nT5-vi-en-small is a transformer model for vietnamese machine translation designed using T5 architecture.## Training data\nT5-vi-en-small was trained on 4M sentence pairs (english,vietnamese)### How to use" ]
[ 0.008574135601520538, -0.08062019944190979, -0.0017001981614157557, 0.05024758726358414, 0.1366226077079773, -0.028410041704773903, 0.08438236266374588, 0.12893132865428925, -0.1653287708759308, -0.06160255894064903, 0.05842418968677521, 0.09559664875268936, 0.035449542105197906, 0.1167658120393753, 0.004661085549741983, -0.35406693816185, 0.020726313814520836, 0.09769818931818008, -0.04541788995265961, 0.11402489244937897, 0.14654386043548584, -0.05597035214304924, 0.16359943151474, 0.0181129053235054, -0.13863243162631989, 0.05245482176542282, 0.0010137143544852734, -0.11775720864534378, 0.11293753236532211, 0.05244068428874016, -0.004088822286576033, 0.02376982755959034, 0.036565449088811874, 0.025518734008073807, 0.00442092539742589, 0.01480183657258749, -0.04993297904729843, 0.017998119816184044, 0.005243116058409214, 0.07432785630226135, 0.2675057053565979, -0.1538761556148529, 0.023242367431521416, 0.034728407859802246, -0.03811310604214668, -0.1323108822107315, 0.07206808030605316, 0.044528476893901825, 0.09978703409433365, 0.09048975259065628, -0.013343170285224915, 0.22519586980342865, -0.20234635472297668, 0.09786295890808105, 0.08149249851703644, -0.3152488172054291, -0.04232579097151756, 0.09126383811235428, 0.0929332971572876, 0.12546557188034058, -0.0476703941822052, 0.034075889736413956, 0.03496196120977402, 0.06805761158466339, 0.03640829399228096, -0.08651149272918701, -0.08718717098236084, 0.0351867750287056, -0.11745079606771469, 0.07293368875980377, 0.30591318011283875, -0.011221598833799362, 0.024348748847842216, -0.04287530854344368, -0.06372452527284622, -0.026562193408608437, -0.05814554542303085, -0.1731296181678772, 0.02343899756669998, 0.030808035284280777, 0.042209286242723465, -0.08538973331451416, -0.12721601128578186, -0.047655265778303146, -0.14236043393611908, -0.10455330461263657, 0.026521727442741394, -0.03450017422437668, -0.22585678100585938, -0.0008792838198132813, -0.07184713333845139, 0.009878624230623245, 0.013558190315961838, -0.0644015297293663, 0.018527112901210785, -0.03327722102403641, 0.004697537515312433, -0.12337706238031387, 0.051014434546232224, -0.09923220425844193, 0.024042846634984016, 0.003798015648499131, -0.02978428825736046, 0.010098842903971672, -0.0341782383620739, 0.14645478129386902, -0.16595831513404846, -0.029250534251332283, 0.026505010202527046, -0.05769751965999603, -0.09875141829252243, 0.04245894029736519, -0.18891510367393494, -0.07505209743976593, 0.1003180593252182, 0.016582893207669258, -0.056501735001802444, 0.161919504404068, 0.030969716608524323, -0.09406545758247375, -0.010971160605549812, -0.06968256831169128, -0.013900619000196457, -0.0025365573819726706, -0.022345000877976418, 0.21918366849422455, 0.03546098992228508, -0.004176494665443897, -0.1765938550233841, -0.09627404808998108, -0.03730256110429764, -0.04022616893053055, -0.019924737513065338, -0.13099108636379242, -0.0018760261591523886, 0.012773956172168255, 0.005592815577983856, -0.16949647665023804, -0.07372835278511047, 0.04468610882759094, 0.016302261501550674, -0.04704822227358818, -0.06021184101700783, -0.11630264669656754, -0.07416607439517975, 0.023800626397132874, 0.025859957560896873, 0.04340723901987076, 0.005801971536129713, 0.042607784271240234, -0.04642399773001671, 0.12264876067638397, -0.10520274937152863, 0.06593345105648041, -0.09245702624320984, -0.07184150815010071, -0.10679719597101212, 0.07399506121873856, 0.07197532802820206, 0.04778274893760681, -0.098072350025177, -0.10570936650037766, -0.02072857692837715, 0.0436021126806736, 
-0.07128634303808212, 0.10750488936901093, -0.21262289583683014, -0.06420408934354782, 0.14323580265045166, -0.05878115072846413, -0.11862227320671082, 0.17176564037799835, 0.04999339580535889, 0.22477909922599792, 0.12411462515592575, 0.08633562177419662, 0.0756581649184227, -0.038564831018447876, 0.1474774330854416, 0.11894882470369339, -0.14253363013267517, -0.03566225990653038, 0.08052708953619003, 0.07912717759609222, -0.15555433928966522, 0.0042191497050225735, -0.04999617114663124, 0.04000356048345566, -0.055999867618083954, -0.041724734008312225, 0.025002941489219666, -0.06787557154893875, 0.15313607454299927, 0.007875428535044193, 0.17422357201576233, -0.038990944623947144, -0.03163621202111244, 0.14623147249221802, 0.053300000727176666, -0.018330655992031097, 0.0019879229366779327, -0.18327596783638, -0.029474610462784767, 0.0041283294558525085, 0.07116127759218216, -0.06837071478366852, 0.08602428436279297, 0.012258617207407951, 0.16332513093948364, 0.04733419790863991, 0.18521347641944885, 0.06540179252624512, -0.02392754703760147, -0.03784075751900673, 0.03284250944852829, 0.07680229842662811, 0.06028333678841591, -0.018711967393755913, -0.13945047557353973, 0.06945492327213287, 0.017046386376023293, -0.011822610162198544, -0.15129902958869934, -0.0357009693980217, 0.11453002691268921, 0.03196660056710243, 0.027531156316399574, 0.1352062076330185, -0.007648984435945749, 0.041664790362119675, -0.020085083320736885, -0.001771765761077404, 0.10269340872764587, -0.01920473389327526, -0.05705530196428299, 0.22341091930866241, -0.10038118809461594, 0.12599211931228638, 0.15991242229938507, -0.1549500972032547, -0.05262409523129463, 0.06850025057792664, 0.0016474988078698516, -0.02874683402478695, 0.054620515555143356, -0.07122267037630081, 0.043875351548194885, -0.07799773663282394, 0.19099265336990356, -0.10890890657901764, -0.010691407136619091, 0.03355996310710907, -0.022245531901717186, 0.018642069771885872, 0.11620823293924332, 0.034803424030542374, -0.3164811432361603, 0.10018708556890488, 0.133541077375412, -0.0068558696657419205, 0.24245724081993103, 0.03213859349489212, -0.005654382519423962, -0.014398063533008099, 0.06866674870252609, 0.025363273918628693, -0.049521978944540024, -0.26970043778419495, -0.027179865166544914, 0.04556083306670189, 0.05208178982138634, 0.0331852100789547, -0.058805909007787704, 0.002800936345010996, 0.018027767539024353, -0.02256275899708271, 0.07457789033651352, 0.09804277867078781, 0.0283204335719347, 0.13184592127799988, 0.06104661524295807, -0.02820843830704689, 0.06650945544242859, 0.027634404599666595, -0.1416722983121872, 0.17105893790721893, -0.11932678520679474, -0.37920716404914856, -0.04039270058274269, -0.04737558588385582, 0.03932330384850502, -0.01076384074985981, 0.074428029358387, -0.2164584994316101, -0.03555650636553764, -0.017004605382680893, 0.07885827124118805, -0.05421115458011627, 0.01874479278922081, -0.036871299147605896, 0.09313900023698807, -0.02509813942015171, -0.06259028613567352, -0.024919530376791954, -0.01494180504232645, -0.11948420852422714, 0.05001247301697731, -0.2283090353012085, -0.05337497219443321, 0.16424080729484558, -0.06427512317895889, 0.057377222925424576, -0.07028456777334213, 0.1503601223230362, -0.028955558314919472, 0.10805337876081467, 0.1923377364873886, 0.07170998305082321, -0.008714078925549984, 0.08339519053697586, -0.02179482765495777, -0.03248506411910057, 0.09659216552972794, 0.009300513193011284, -0.049633149057626724, -0.17773021757602692, -0.09495799988508224, -0.10673411935567856, 
0.09556401520967484, 0.05166521295905113, 0.05301741883158684, 0.10933992266654968, 0.03273261711001396, 0.059319812804460526, 0.25331494212150574, 0.05007757619023323, 0.03552147001028061, 0.1919727623462677, -0.023857880383729935, 0.03056972846388817, -0.06734780222177505, -0.08463557064533234, 0.1118667796254158, -0.05864633992314339, 0.009025262668728828, 0.0327434279024601, 0.09497665613889694, 0.060037966817617416, 0.0887337252497673, 0.10810867697000504, 0.12563100457191467, -0.0980992466211319, -0.04014765843749046, -0.040069155395030975, -0.09848256409168243, 0.06206181272864342, 0.05186852440237999, 0.013983982615172863, -0.00842286553233862, -0.09545529633760452, 0.1384017914533615, 0.058671556413173676, 0.1321966052055359, 0.06831631064414978, -0.22512932121753693, -0.051380909979343414, -0.026791153475642204, -0.03284744918346405, -0.04554497450590134, 0.09871179610490799, 0.15029433369636536, -0.1665395349264145, 0.005226559471338987, 0.009676286019384861, 0.07160928845405579, 0.019482580944895744, 0.09634233266115189, 0.03147413209080696, 0.004312371835112572, 0.06952488422393799, 0.09138243645429611, -0.36647480726242065, 0.13503409922122955, -0.022169461473822594, -0.017672156915068626, -0.10412351042032242, -0.05421782657504082, 0.010134993121027946, 0.16165906190872192, 0.10868970304727554, 0.0476069375872612, 0.08395663648843765, -0.04302404820919037, -0.042360108345746994, 0.023241566494107246, 0.03696438670158386, 0.037221118807792664, -0.016042469069361687, -0.1063501164317131, -0.054151035845279694, 0.008690710179507732, 0.07113215327262878, -0.11173490434885025, -0.06343379616737366, 0.01767074689269066, 0.05346717685461044, 0.061842698603868484, 0.0054747071117162704, -0.07927858829498291, -0.00530980946496129, 0.19299714267253876, -0.034730102866888046, -0.028514066711068153, -0.08609835058450699, 0.0029143020510673523, 0.01618865132331848, -0.07333511114120483, 0.04973316192626953, -0.03381878137588501, 0.009897896088659763, 0.0667615681886673, -0.1382797360420227, 0.16419364511966705, -0.0791432335972786, -0.06119896471500397, 0.016571758314967155, 0.05614274740219116, -0.01097785122692585, 0.04253879562020302, 0.07858376204967499, -0.11085066199302673, 0.00829965341836214, -0.06216994300484657, -0.1115468293428421, 0.09704215079545975, -0.05536576732993126, -0.011772728525102139, -0.15300624072551727, -0.1569625437259674, 0.0064448220655322075, -0.08936449140310287, 0.2640235722064972, -0.0012826635502278805, -0.05288104712963104, 0.10109439492225647, 0.14870323240756989, -0.03562357649207115, -0.2551906406879425, -0.06630673259496689, -0.01651621237397194, 0.03706676885485649, -0.01020754687488079, -0.06638330221176147, 0.05843467637896538, -0.06334709376096725, 0.042539410293102264, -0.2063874900341034, -0.30445799231529236, -0.12877574563026428, 0.10800767689943314, 0.048097286373376846, 0.23568743467330933, -0.035852622240781784, -0.02466028928756714, 0.0003914769331458956, -0.11741170287132263, 0.1985408067703247, -0.1306234747171402, 0.06382817775011063, 0.0057558841072022915, 0.010684315115213394, 0.026638276875019073, 0.04630829393863678, -0.003613436594605446, -0.035369329154491425, -0.062099847942590714, -0.11294997483491898, -0.16240724921226501, 0.06751088052988052, 0.03244704380631447, 0.1264578253030777, -0.032945092767477036, 0.029783932492136955, -0.1140885278582573, -0.04210405796766281, -0.10788501054048538, -0.03079257905483246, -0.007357526570558548, -0.09497856348752975, -0.01257407572120428, 0.06929396837949753, 0.03523214906454086, 
-0.02609005942940712, -0.012323067523539066, -0.04276828467845917, 0.004881543572992086, 0.0673980638384819, 0.2555849850177765, -0.11559829860925674, 0.14212819933891296, -0.025449860841035843, -0.0028318327385932207, 0.1145450547337532, -0.11644981056451797, 0.07440578937530518, 0.09279992431402206, -0.02419549599289894, 0.11891704052686691, 0.051183417439460754, -0.012072899378836155, -0.008231029845774174, 0.09222366660833359, -0.0769108310341835, -0.17789503931999207, -0.12970750033855438, 0.007579270284622908, 0.10148629546165466, 0.03602391481399536, 0.10392212122678757, -0.11665817350149155, -0.04614034295082092, -0.009997881948947906, -0.0129539268091321, -0.11223532259464264, 0.07553371787071228, -0.018673427402973175, 0.050230350345373154, -0.07465936243534088, 0.021386252716183662, 0.10825105756521225, -0.1794562190771103, 0.07924770563840866, 0.2653117775917053, -0.1528560370206833, -0.0915103629231453, 0.21497951447963715, 0.14264170825481415, -0.017531784251332283, -0.07442226260900497, -0.04112202674150467, -0.15423548221588135, 0.08492329716682434, 0.1556256264448166, 0.056838374584913254, 0.06653342396020889, -0.06552786380052567, -0.04913949593901634, -0.059713490307331085, 0.049332961440086365, 0.042220570147037506, -0.03876034915447235, -0.12930314242839813, 0.07002246379852295, 0.012110439129173756, 0.1848679482936859, -0.09166233986616135, -0.032952506095170975, -0.16840925812721252, 0.005398681852966547, -0.17303703725337982, 0.03477250784635544, -0.04881057143211365, -0.03069259598851204, -0.0014186397893354297, -0.038365378975868225, -0.04849071055650711, 0.02407074347138405, -0.07002411782741547, 0.03312427178025246, 0.005177687853574753, 0.09861715137958527, 0.034187156707048416, 0.017917798832058907, 0.033169180154800415, 0.013003692030906677, 0.13453112542629242, -0.030236385762691498, -0.11308985203504562, 0.09053081274032593, -0.07840248197317123, -0.07081551849842072, 0.04342275112867355, 0.04064706712961197, 0.13862529397010803, -0.029002364724874496, 0.03858531638979912, 0.08425047248601913, 0.10886289924383163, 0.02463643252849579, 0.14106714725494385, -0.04718094319105148, 0.027763305231928825, -0.12170342355966568, -0.07868563383817673, -0.08303963392972946, 0.04483168199658394, -0.01984131522476673, -0.0454578772187233, 0.07227903604507446, -0.1122390478849411, 0.0015315068885684013, -0.07167269289493561, -0.00963508989661932, -0.060551807284355164, -0.06369677186012268, -0.06894567608833313, -0.05917632207274437, 0.02639998123049736, -0.025330685079097748, -0.009657745249569416, 0.07250011712312698, 0.011493703350424767, 0.042487405240535736, -0.04656018316745758, -0.06085047498345375, 0.01716121844947338, 0.1712426394224167, -0.005799293052405119, -0.02158145233988762, -0.11190240830183029, 0.028174368664622307, -0.014360862784087658, 0.17915071547031403, 0.12050515413284302, 0.06628557294607162, -0.025886867195367813, 0.1101444810628891, 0.03051983192563057, 0.08835197240114212, -0.19731037318706512, 0.04207197204232216, -0.058553606271743774, 0.07140297442674637, -0.1255129724740982, 0.005188731476664543, 0.2314174324274063, -0.049278855323791504, 0.04224954545497894, -0.08290520310401917, -0.07914942502975464, -0.1383717954158783, -0.25304460525512695, -0.11642790585756302, -0.1864434778690338, -0.06116413697600365, -0.07867830246686935, 0.012763690203428268, -0.08019772917032242, 0.17324693500995636, -0.08387404680252075, 0.13039745390415192, -0.040887489914894104, -0.09679029881954193, 0.08954835683107376, -0.047214485704898834, 
0.0455588772892952, -0.04183715209364891, 0.04843037575483322, -0.036909762769937515, 0.01213245838880539, -0.00961865484714508, 0.007436951156705618, -0.07527555525302887, -0.016747454181313515, -0.045074671506881714, -0.03481161221861839, -0.01805427484214306, -0.06029113009572029, -0.02629433386027813, 0.01437259092926979, 0.04329738765954971, -0.12827807664871216, 0.021774860098958015, 0.17663456499576569, -0.04017110913991928, -0.24896426498889923, -0.15927250683307648, 0.23061172664165497, -0.007991177029907703, 0.06721358001232147, 0.0152790118008852, 0.017704283818602562, -0.11036936938762665, 0.3963228166103363, 0.2338629812002182, -0.039162296801805496, 0.001985053066164255, 0.02274131402373314, 0.017574019730091095, -0.04956108331680298, 0.16711828112602234, 0.11761073023080826, 0.20649048686027527, -0.037605512887239456, 0.012670498341321945, -0.06255774945020676, 0.0138848340138793, -0.05896763503551483, 0.028894959017634392, 0.14796875417232513, -0.0020137852989137173, 0.031466759741306305, 0.07456107437610626, -0.11592452973127365, 0.06944936513900757, -0.08324865996837616, -0.08887062966823578, -0.08041366189718246, -0.051630597561597824, 0.08624537289142609, 0.06519635766744614, 0.024045271798968315, -0.02137819118797779, -0.010287539102137089, -0.015627358108758926, 0.0388261154294014, -0.14425644278526306, 0.005191568285226822, 0.14691877365112305, -0.0817156508564949, 0.06130637601017952, -0.03695933148264885, 0.014628883451223373, 0.09889624267816544, 0.03560350835323334, -0.06474318355321884, 0.07216428220272064, -0.05202082544565201, 0.06391006708145142, 0.07410205900669098, 0.03335936740040779, -0.0001447189861210063, 0.0964156910777092, -0.006845434196293354, -0.13877825438976288, 0.023420287296175957, 0.04370028153061867, 0.03213221952319145, -0.056959930807352066, 0.08977572619915009, -0.06702962517738342, 0.06468034535646439, 0.16373954713344574, -0.026578841730952263, 0.027283750474452972, -0.12389518320560455, 0.07505635172128677, -0.03398871794342995, -0.07312487810850143, -0.0713159590959549, -0.13506385684013367, -0.027829522266983986, -0.0719616636633873, 0.0515921488404274, -0.15555009245872498, -0.041474759578704834, -0.10529236495494843, 0.025488341227173805, -0.18790303170681, 0.09292639046907425, 0.16747929155826569, 0.04679381847381592, -0.010377515107393265, -0.11247088760137558, -0.0499960221350193, 0.09937955439090729, -0.15999816358089447, -0.09321080148220062 ]
null
null
transformers
# BERT for Vietnamese, trained on more than 20 GB of news data

Applied to the sentiment analysis task using [AIViVN's comments dataset](https://www.aivivn.com/contests/6), the model achieved 0.90268 on the public leaderboard (the winner's score was 0.90087).

Bert4news is used in [ViNLP](https://github.com/bino282/ViNLP), a Vietnamese toolkit for word segmentation and named entity recognition.

We use word-level SentencePiece, basic BERT tokenization, and the same configuration as BERT base, with lowercase = False.

You can download the trained model:

- [tensorflow](https://drive.google.com/file/d/1X-sRDYf7moS_h61J3L79NkMVGHP-P-k5/view?usp=sharing)
- [pytorch](https://drive.google.com/file/d/11aFSTpYIurn-oI2XpAmcCTccB_AonMOu/view?usp=sharing)

Use with huggingface/transformers:

```python
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("NlpHUST/vibert4news-base-cased")
bert_model = BertModel.from_pretrained("NlpHUST/vibert4news-base-cased")

line = "Tôi là sinh viên trường Bách Khoa Hà Nội ."

# Encode the sentence and build its attention mask by hand.
input_id = tokenizer.encode(line, add_special_tokens=True)
att_mask = [int(token_id > 0) for token_id in input_id]
input_ids = torch.tensor([input_id])
att_masks = torch.tensor([att_mask])

with torch.no_grad():
    features = bert_model(input_ids, att_masks)

print(features)
```

# Vietnamese toolkit with BERT

ViNLP is an annotation system for Vietnamese. It uses the pretrained [Bert4news](https://github.com/bino282/bert4news/) model, fine-tuned for Vietnamese NLP tasks such as word segmentation and named entity recognition (NER), and achieves high accuracy.

### Installation

```bash
git clone https://github.com/bino282/ViNLP.git
cd ViNLP
python setup.py develop build
```

### Test Segmentation

The model achieved an F1 score of 0.984 on the VLSP 2013 dataset.

|Model | F1 |
|--------|-----------|
| **BertVnTokenizer** | 98.40 |
| **DongDu** | 96.90 |
| **JvnSegmenter-Maxent** | 97.00 |
| **JvnSegmenter-CRFs** | 97.06 |
| **VnTokenizer** | 97.33 |
| **UETSegmenter** | 97.87 |
| **VnCoreNLP (i.e. RDRsegmenter)** | 97.90 |

```python
from ViNLP import BertVnTokenizer

tokenizer = BertVnTokenizer()
sentences = tokenizer.split(["Tổng thống Donald Trump ký sắc lệnh cấm mọi giao dịch của Mỹ với ByteDance và Tecent - chủ sở hữu của 2 ứng dụng phổ biến TikTok và WeChat sau 45 ngày nữa."])
print(sentences[0])
```

```
Tổng_thống Donald_Trump ký sắc_lệnh cấm mọi giao_dịch của Mỹ với ByteDance và Tecent - chủ_sở_hữu của 2 ứng_dụng phổ_biến TikTok và WeChat sau 45 ngày nữa .
```

### Test Named Entity Recognition

The model achieved an F1 score of 0.786 on VLSP 2018 for all named entities, including nested entities.

|Model | F1 |
|--------|-----------|
| **BertVnNer** | 78.60 |
| **VNER Attentive Neural Network** | 77.52 |
| **vietner CRF (ngrams + word shapes + cluster + w2v)** | 76.63 |
| **ZA-NER BiLSTM** | 74.70 |

```python
from ViNLP import BertVnNer

bert_ner_model = BertVnNer()
sentence = "Theo SCMP, báo cáo của CSIS với tên gọi Định hình Tương lai Chính sách của Mỹ với Trung Quốc cũng cho thấy sự ủng hộ tương đối rộng rãi của các chuyên gia về việc cấm Huawei, tập đoàn viễn thông khổng lồ của Trung Quốc"
entities = bert_ner_model.annotate([sentence])
print(entities)
```

```
[{'ORGANIZATION': ['SCMP', 'CSIS', 'Huawei'], 'LOCATION': ['Mỹ', 'Trung Quốc']}]
```

Run training with the base config:

```bash
python train_pytorch.py \
    --model_path=bert4news.pytorch \
    --max_len=200 \
    --batch_size=16 \
    --epochs=6 \
    --lr=2e-5
```

### Contact information

For personal communication related to this project, please contact Nha Nguyen Van ([email protected]).
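The usage snippet in this card builds the attention mask by hand. As a complementary, illustrative sketch (not the card's prescribed method), the tokenizer can also produce `input_ids` and `attention_mask` directly, and the token embeddings can be mean-pooled into a single sentence vector; the pooling recipe below is a common convention, not something specified by the authors.

```python
# Illustrative sketch: let the tokenizer build the tensors, then mean-pool
# the last hidden state into one sentence embedding.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("NlpHUST/vibert4news-base-cased")
model = BertModel.from_pretrained("NlpHUST/vibert4news-base-cased")
model.eval()

inputs = tokenizer("Tôi là sinh viên trường Bách Khoa Hà Nội .", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Zero out padding positions (none in this single sentence, but this
# generalizes to padded batches), then average the token embeddings.
mask = inputs["attention_mask"].unsqueeze(-1)
sentence_embedding = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embedding.shape)  # torch.Size([1, 768]) for a BERT-base-sized model
```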
{"language": "vn"}
fill-mask
NlpHUST/vibert4news-base-cased
[ "transformers", "pytorch", "safetensors", "fill-mask", "vn", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "vn" ]
TAGS #transformers #pytorch #safetensors #fill-mask #vn #autotrain_compatible #endpoints_compatible #region-us
BERT for Vietnamese, trained on more than 20 GB of news data
=============================================================

Applied to the sentiment analysis task using AIViVN's comments dataset, the model achieved 0.90268 on the public leaderboard (the winner's score was 0.90087).

Bert4news is used in ViNLP, a Vietnamese toolkit for word segmentation and named entity recognition (URL)

We use word-level SentencePiece, basic BERT tokenization, and the same configuration as BERT base, with lowercase = False.

You can download the trained model:

* tensorflow.
* pytorch.

Use with huggingface/transformers

Vietnamese toolkit with BERT
============================

ViNLP is an annotation system for Vietnamese. It uses the pretrained Bert4news model, fine-tuned for Vietnamese NLP tasks such as word segmentation and named entity recognition (NER), and achieves high accuracy.

### Installation

### Test Segmentation

The model achieved an F1 score of 0.984 on the VLSP 2013 dataset

### Test Named Entity Recognition

The model achieved an F1 score of 0.786 on VLSP 2018 for all named entities, including nested entities

Run training with the base config

### Contact information

For personal communication related to this project, please contact Nha Nguyen Van (nha282@URL).
[ "### Installation", "### Test Segmentation\n\n\nThe model achieved F1 score : 0.984 on VLSP 2013 dataset", "### Test Named Entity Recognition\n\n\nThe model achieved F1 score VLSP 2018 for all named entities including nested entities : 0.786\n\n\n\nRun training with base config", "### Contact information\n\n\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ "TAGS\n#transformers #pytorch #safetensors #fill-mask #vn #autotrain_compatible #endpoints_compatible #region-us \n", "### Installation", "### Test Segmentation\n\n\nThe model achieved F1 score : 0.984 on VLSP 2013 dataset", "### Test Named Entity Recognition\n\n\nThe model achieved F1 score VLSP 2018 for all named entities including nested entities : 0.786\n\n\n\nRun training with base config", "### Contact information\n\n\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ 41, 3, 24, 42, 25 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #fill-mask #vn #autotrain_compatible #endpoints_compatible #region-us \n### Installation### Test Segmentation\n\n\nThe model achieved F1 score : 0.984 on VLSP 2013 dataset### Test Named Entity Recognition\n\n\nThe model achieved F1 score VLSP 2018 for all named entities including nested entities : 0.786\n\n\n\nRun training with base config### Contact information\n\n\nFor personal communication related to this project, please contact Nha Nguyen Van (nha282@URL)." ]
[ -0.20788300037384033, -0.06922602653503418, -0.00037343695294111967, 0.07903939485549927, 0.13424116373062134, -0.027859201654791832, 0.015645164996385574, 0.03289613127708435, 0.03090282715857029, -0.006183276418596506, 0.1801779568195343, 0.12330250442028046, 0.028068656101822853, 0.17913658916950226, 0.009309849701821804, -0.08055417984724045, 0.03529270365834236, 0.03425028920173645, -0.10749781876802444, 0.12302907556295395, 0.10534761846065521, -0.09736365079879761, 0.1302584409713745, 0.04943539947271347, -0.1343517303466797, 0.04209354519844055, 0.043227214366197586, -0.07675904780626297, 0.14705944061279297, 0.03110661171376705, 0.10012327134609222, 0.04283573105931282, 0.13052840530872345, -0.05211326852440834, 0.0035983906127512455, 0.009245422668755054, -0.01248561218380928, 0.02723691612482071, -0.02313804067671299, 0.052737943828105927, 0.12686289846897125, -0.02265915833413601, 0.02339704893529415, 0.04829053953289986, -0.020567461848258972, -0.18142826855182648, -0.05242243409156799, 0.07974487543106079, 0.10539768636226654, 0.05664774402976036, 0.011253301985561848, 0.3119064271450043, -0.08393719047307968, 0.09493796527385712, 0.03644810616970062, -0.32012155652046204, -0.05591888725757599, 0.17573989927768707, 0.008135069161653519, -0.09582830965518951, -0.007954797707498074, 0.09696663171052933, 0.13267919421195984, 0.046988409012556076, 0.04674023389816284, -0.006537773180752993, -0.09389309585094452, 0.023529283702373505, -0.12806811928749084, -0.0808313861489296, 0.1971842497587204, 0.02679496258497238, -0.09044255316257477, -0.01033249031752348, -0.03912685438990593, -0.1903226226568222, 0.022276870906352997, -0.12732289731502533, -0.058954618871212006, -0.06842336803674698, -0.027267904952168465, 0.0466739721596241, -0.14102838933467865, -0.1817966103553772, -0.06106004863977432, 0.08939515799283981, 0.05906802788376808, 0.07774532586336136, -0.09173382818698883, 0.1141415685415268, -0.0450521856546402, -0.08830248564481735, -0.020422687754034996, -0.09934534877538681, -0.05233503505587578, -0.001499934121966362, 0.013420085422694683, -0.05506467446684837, 0.05170759558677673, -0.0448509156703949, -0.01787741482257843, -0.03281895071268082, 0.056800104677677155, 0.004298270680010319, -0.04032619670033455, 0.1910635232925415, -0.05095290020108223, 0.06599660217761993, 0.04282817989587784, -0.04053382948040962, -0.014807055704295635, 0.040599171072244644, -0.042129043489694595, -0.00980459712445736, 0.0843982920050621, 0.0731397494673729, -0.0576864518225193, 0.054202351719141006, -0.03931528329849243, 0.04934623837471008, 0.07556717842817307, -0.0384768508374691, -0.045805685222148895, -0.025559792295098305, -0.06277889013290405, -0.0425967238843441, 0.07652736455202103, -0.029488107189536095, 0.0030455621890723705, -0.009479671716690063, -0.04965450242161751, -0.05681522190570831, -0.08591926097869873, -0.10106895118951797, 0.010766039602458477, -0.003860285272821784, -0.005109245888888836, -0.19050481915473938, -0.21616633236408234, -0.02439194545149803, 0.010132521390914917, -0.024712368845939636, -0.05654201656579971, 0.01872587762773037, -0.005178709048777819, -0.001365089206956327, -0.003137286752462387, 0.08339548856019974, 0.00017712183762341738, 0.05298365280032158, 0.14605839550495148, 0.11250609159469604, -0.016895882785320282, 0.031019849702715874, -0.09399163722991943, 0.004033256787806749, 0.10062660276889801, -0.02902802638709545, -0.1516878753900528, 0.11515530943870544, -0.02898789942264557, -0.09597879648208618, -0.045897386968135834, 
0.02795858308672905, 0.07418253272771835, 0.16905033588409424, -0.18197453022003174, -0.029049623757600784, 0.12818394601345062, -0.12540936470031738, -0.1779688596725464, 0.11406020075082779, 0.039550065994262695, 0.009872164577245712, 0.07008733600378036, -0.04462390020489693, 0.12915360927581787, -0.2767563462257385, 0.0040395017713308334, 0.10619690269231796, -0.10420835018157959, -0.17773933708667755, 0.08022750914096832, 0.07547874003648758, -0.19948995113372803, 0.08622437715530396, -0.08952592313289642, 0.05031381547451019, -0.1402585804462433, -0.09716704487800598, -0.09190455079078674, -0.11208921670913696, 0.10962670296430588, 0.041360098868608475, 0.11391283571720123, -0.08301340788602829, -0.09948199987411499, -0.014352519065141678, 0.1312694549560547, 0.04599506035447121, -0.0330638661980629, -0.11944524198770523, 0.1743025928735733, -0.036631353199481964, -0.0345476008951664, -0.08140311390161514, -0.05000261962413788, 0.00997016578912735, 0.041830118745565414, -0.02587222307920456, 0.1224680244922638, 0.039522357285022736, 0.03502778336405754, 0.029889145866036415, -0.00704735703766346, -0.034564077854156494, 0.013368505984544754, -0.018162544816732407, -0.10525265336036682, -0.041394900530576706, -0.04358644038438797, 0.07138029485940933, -0.15863902866840363, 0.0213006604462862, 0.019482189789414406, 0.11949069052934647, -0.015063301660120487, 0.06013009697198868, -0.016219450160861015, 0.07751946896314621, 0.013462376780807972, -0.013224748894572258, 0.035427629947662354, -0.010166903026401997, -0.0041790613904595375, 0.07930415868759155, 0.06487154960632324, 0.11118277907371521, 0.11410024017095566, -0.06995242834091187, -0.05840366706252098, 0.15647666156291962, -0.0776108056306839, 0.04711078107357025, -0.10124748945236206, 0.03242558240890503, -0.018028778955340385, -0.05561533942818642, 0.12104152143001556, -0.056211382150650024, -0.048838961869478226, 0.0260375514626503, -0.06064153462648392, -0.05766167491674423, 0.11287049204111099, 0.08341353386640549, -0.028220247477293015, 0.06349088996648788, 0.06277777254581451, -0.0831383615732193, 0.16823351383209229, -0.007287678308784962, -0.05463257431983948, -0.0326116643846035, 0.02518438547849655, -0.03851267322897911, 0.13601204752922058, -0.17837144434452057, -0.037316445261240005, 0.035617321729660034, -0.026336852461099625, 0.06057592108845711, -0.10897432267665863, -0.025324391201138496, 0.04054036736488342, -0.02327418141067028, 0.019426969811320305, 0.15195992588996887, -0.0618705116212368, 0.09674462676048279, -0.06493356078863144, -0.12846064567565918, 0.008486447855830193, 0.008933528326451778, -0.1110791563987732, 0.1442289799451828, -0.02215445414185524, -0.28427818417549133, -0.1660093069076538, 0.10897666215896606, -0.07913411408662796, -0.0009789521573111415, 0.04066140949726105, -0.16732589900493622, -0.040819257497787476, 0.025979140773415565, -0.017595428973436356, 0.032640863209962845, 0.08071702718734741, 0.044232457876205444, 0.02808081917464733, 0.05025478079915047, -0.12366271018981934, -0.0340409129858017, -0.08981038630008698, -0.10807596147060394, 0.0787847712635994, 0.006012072786688805, 0.05962146073579788, 0.07401829957962036, -0.07609937340021133, 0.05957880988717079, -0.002163926837965846, 0.24788153171539307, -0.01644911989569664, -0.02402426302433014, 0.2055644690990448, 0.10158179700374603, -0.01902684196829796, 0.12937472760677338, -0.00037632256862707436, -0.08395744115114212, 0.04561794549226761, -0.04478150233626366, -0.06530480086803436, -0.22035090625286102, 
-0.1274530589580536, -0.0233683492988348, 0.008870728313922882, -0.05171513929963112, 0.025626933202147484, 0.0911746546626091, 0.06496042013168335, 0.050700247287750244, 0.029642771929502487, -0.18874013423919678, 0.027152443304657936, 0.17171740531921387, -0.030876584351062775, 0.10781262814998627, -0.07608325779438019, -0.02918057143688202, 0.036374036222696304, -0.06523529440164566, 0.19267305731773376, -0.006326470989733934, -0.11459608376026154, 0.1319902390241623, 0.2440151870250702, 0.07049711793661118, 0.08075100183486938, 0.014586813747882843, -0.062029629945755005, 0.07931559532880783, -0.053210239857435226, -0.053680527955293655, 0.012979187071323395, 0.008792072534561157, 0.030858008190989494, -0.11501314491033554, 0.07618916779756546, -0.0235010776668787, 0.08808885514736176, 0.1401917189359665, -0.23553860187530518, -0.10233254730701447, -0.02848711609840393, -0.007278876379132271, -0.0729556530714035, -0.009683609940111637, 0.09015136957168579, -0.11414871364831924, 0.019887885078787804, 0.018929162994027138, 0.05643976107239723, 0.07721790671348572, 0.019262943416833878, -0.01647292822599411, -0.06283397227525711, 0.0364203043282032, 0.10575566440820694, -0.23461614549160004, 0.3912923038005829, -0.021683061495423317, 0.04938946291804314, -0.0029062051326036453, 0.007456041406840086, -0.003116982290521264, 0.1401430070400238, 0.1401040405035019, 0.0016031719278544188, -0.05331883206963539, -0.15556679666042328, -0.013736037537455559, 0.04943925887346268, -0.08360032737255096, 0.08795663714408875, -0.003195477183908224, 0.01702803000807762, -0.04343122988939285, -0.023236658424139023, 0.12805193662643433, -0.2681313753128052, 0.07285439968109131, -0.041895635426044464, 0.058946575969457626, 0.039299361407756805, -0.08617168664932251, -0.12130802124738693, -0.10308607667684555, 0.0920293927192688, -0.0027730250731110573, -0.02731223963201046, -0.06735862046480179, 0.07048297673463821, 0.005415644496679306, -0.08458413183689117, 0.059191085398197174, -0.04924587160348892, 0.025411028414964676, 0.0009387794998474419, -0.03951538726687431, 0.02443554252386093, -0.10365742444992065, -0.10069594532251358, -0.018416978418827057, 0.15761171281337738, 0.025316717103123665, 0.03932984173297882, 0.0729551911354065, -0.03925958275794983, -0.015344616025686264, -0.10080107301473618, 0.05923197418451309, 0.07678982615470886, -0.05733468011021614, -0.06383892148733139, -0.015715397894382477, -0.2182641625404358, -0.007008823566138744, -0.07777803391218185, 0.15851998329162598, 0.23096293210983276, -0.11112889647483826, 0.051757946610450745, 0.09597834199666977, -0.04384111613035202, -0.23755842447280884, -0.045440346002578735, 0.06674207746982574, 0.13254104554653168, -0.0004999199300073087, -0.004643787629902363, 0.09570538252592087, 0.005165431648492813, -0.06371448189020157, -0.08767177909612656, -0.10068287700414658, -0.14502687752246857, 0.2508677840232849, 0.0661187544465065, 0.23282653093338013, 0.004098330158740282, -0.025985943153500557, -0.057678621262311935, -0.2195972204208374, 0.06651312112808228, -0.0392366461455822, 0.03409652039408684, -0.09327548742294312, 0.0586639940738678, -0.0027302089147269726, -0.03127282112836838, 0.13411687314510345, -0.027694815769791603, 0.06975697726011276, -0.01625671796500683, 0.029677998274564743, 0.017488349229097366, -0.014067071489989758, 0.16599015891551971, 0.026384031400084496, 0.11685340851545334, -0.2152358442544937, -0.008364393375813961, -0.08865434676408768, 0.136353999376297, -0.04758256673812866, -0.06265668570995331, 
-0.00532720098271966, 0.04061907157301903, -0.056492455303668976, -0.05189365893602371, -0.13734863698482513, -0.052627939730882645, 0.1258365511894226, 0.16420495510101318, 0.06772185117006302, -0.08314812183380127, 0.08288124203681946, 0.03221112862229347, -0.04492643103003502, 0.12164381146430969, -0.10182604193687439, 0.03397934138774872, 0.13310670852661133, -0.000008239684575528372, 0.0023265599738806486, -0.002370179397985339, -0.0674736350774765, -0.04115252569317818, 0.132558211684227, -0.09936492145061493, 0.0038876126054674387, -0.09911394119262695, -0.026331022381782532, 0.03355250880122185, 0.13999050855636597, 0.07556817680597305, -0.13124558329582214, -0.03234319016337395, -0.04203663766384125, -0.05762121453881264, -0.10694649815559387, 0.16265864670276642, 0.10297542810440063, 0.09185515344142914, -0.061433855444192886, -0.07357003539800644, -0.05722082033753395, 0.047631725668907166, -0.01814533770084381, -0.010599038563668728, -0.10307556390762329, -0.08573976159095764, 0.06724381446838379, 0.1452491283416748, -0.014464005827903748, -0.12048552930355072, -0.07932895421981812, -0.08730063587427139, 0.02584979124367237, 0.1862245798110962, 0.15360158681869507, 0.1052304059267044, 0.035317737609148026, -0.05041592940688133, -0.0457092747092247, 0.08503133058547974, 0.1143382340669632, -0.005050736013799906, -0.2473941445350647, 0.12463133782148361, -0.018795032054185867, 0.17799945175647736, -0.0983082726597786, -0.0520164854824543, -0.0684729591012001, 0.053566884249448776, -0.12653516232967377, -0.05903695523738861, 0.008625265210866928, -0.011716727167367935, 0.05779818445444107, -0.08738822489976883, -0.0943213626742363, 0.0899268314242363, -0.10724764317274094, 0.08157658576965332, 0.006052155513316393, 0.03570154681801796, -0.0767354890704155, 0.005888540763407946, 0.03946094214916229, -0.0028434726409614086, 0.018591383472085, -0.004715722054243088, 0.029157891869544983, 0.12176499515771866, -0.18117250502109528, 0.026457494124770164, 0.039305832237005234, 0.04214649274945259, 0.10930119454860687, -0.0459272526204586, 0.05990079045295715, 0.08208208531141281, 0.05487383157014847, 0.05892917141318321, 0.06053890287876129, -0.07165132462978363, -0.032485995441675186, -0.015407744795084, -0.04585526883602142, -0.027850773185491562, 0.00615279283374548, 0.035948313772678375, 0.0990695059299469, 0.14024369418621063, -0.10316886752843857, 0.005483502056449652, -0.17228829860687256, 0.014803500846028328, -0.047654516994953156, -0.006436658091843128, -0.10142155736684799, -0.0289333276450634, 0.049176137894392014, -0.0014024479314684868, 0.1398746520280838, 0.07665341347455978, -0.0018245329847559333, 0.029523691162467003, 0.03883565962314606, 0.009135447442531586, -0.035820819437503815, 0.09483153373003006, 0.016229834407567978, 0.11403962224721909, -0.010469365864992142, 0.01456781942397356, -0.026279009878635406, 0.023819344118237495, 0.1308983713388443, -0.013659129850566387, -0.05540582165122032, 0.06885522603988647, 0.0698939859867096, 0.014907479286193848, -0.09548858553171158, -0.21153412759304047, -0.1587226837873459, 0.060721904039382935, -0.07755447924137115, -0.02916942536830902, 0.11373572051525116, -0.08293365687131882, 0.017994798719882965, -0.040075164288282394, -0.08543898910284042, -0.20895470678806305, -0.10964059829711914, -0.10451829433441162, -0.2290891408920288, 0.08166784048080444, 0.001589450752362609, -0.06717846542596817, 0.028341025114059448, 0.06688011437654495, -0.06807791441679001, 0.11868704110383987, -0.025797877460718155, 
0.06330358237028122, 0.03518126532435417, -0.051905836910009384, -0.011054248549044132, -0.09633295983076096, 0.11903870850801468, -0.055552758276462555, 0.006024812813848257, -0.04444648325443268, -0.030280737206339836, -0.1304699331521988, 0.025102149695158005, 0.06477922201156616, -0.053624555468559265, -0.06154123321175575, -0.019894566386938095, 0.015587933361530304, 0.1179032027721405, 0.030217401683330536, 0.047983985394239426, 0.014451595023274422, 0.17001721262931824, -0.047414619475603104, -0.1362202763557434, -0.15250343084335327, 0.18218177556991577, 0.029628999531269073, 0.04558372497558594, 0.018209896981716156, -0.01727871038019657, -0.022435935214161873, 0.21057313680648804, 0.21528705954551697, 0.147637739777565, 0.03691260516643524, -0.05480385944247246, -0.0072564431466162205, -0.1082085371017456, 0.08962110430002213, 0.11118115484714508, 0.22281871736049652, -0.09740506112575531, -0.0486416220664978, -0.0977160632610321, -0.06670232117176056, -0.05760343000292778, -0.0059237671084702015, 0.009476902894675732, -0.044278133660554886, -0.036629997193813324, 0.0665627270936966, -0.05582503601908684, -0.009622522629797459, 0.10696175694465637, -0.027829911559820175, -0.08218749612569809, 0.0039059689734131098, 0.07413911074399948, 0.041125405579805374, 0.03301923722028732, -0.09742429852485657, 0.037080228328704834, 0.15261653065681458, 0.009055308997631073, -0.10267280787229538, -0.17839960753917694, 0.14386850595474243, 0.056268878281116486, 0.16225549578666687, -0.014994163997471333, 0.20917965471744537, 0.05349954962730408, -0.045576050877571106, -0.1004306897521019, 0.11585143208503723, 0.002364379819482565, -0.08116085082292557, 0.03162454068660736, 0.040534600615501404, 0.04275050759315491, 0.05309779942035675, -0.05786850303411484, -0.10443059355020523, 0.019381478428840637, -0.11007663607597351, 0.04957950860261917, -0.14101700484752655, 0.1532701700925827, -0.05519186332821846, 0.07029347866773605, 0.11644179373979568, -0.06042066216468811, 0.0011215033009648323, -0.12442609667778015, 0.08989414572715759, 0.016506055369973183, -0.05483780801296234, 0.005169144831597805, -0.1945965439081192, 0.08464235067367554, -0.021069614216685295, -0.05532419681549072, -0.08795834332704544, -0.04423706606030464, -0.07271834462881088, -0.08761981874704361, -0.016526393592357635, 0.06944873183965683, 0.1111190989613533, 0.037856485694646835, 0.00455834623426199, 0.007802998181432486, -0.021383823826909065, 0.030404293909668922, -0.09027441591024399, -0.11284803599119186 ]
null
null
transformers
# Hagrid DialoGPT medium model
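The card itself gives only a title, so here is a minimal chat-loop sketch as an illustration, assuming the standard DialoGPT conversational interface from the transformers library (the model id is taken from this record; the turn count and generation settings are arbitrary choices, not part of the original card):

```py
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("NoLawz/DialoGPT-medium-hagrid")
model = AutoModelForCausalLM.from_pretrained("NoLawz/DialoGPT-medium-hagrid")

chat_history_ids = None
for step in range(3):  # three chat turns; adjust as needed
    user_input = input(">> You: ")
    # encode the new user turn, terminated by the end-of-sequence token
    new_ids = tokenizer.encode(user_input + tokenizer.eos_token, return_tensors="pt")
    # append the new turn to the running conversation history
    bot_input_ids = new_ids if chat_history_ids is None else torch.cat([chat_history_ids, new_ids], dim=-1)
    chat_history_ids = model.generate(
        bot_input_ids,
        max_length=1000,
        pad_token_id=tokenizer.eos_token_id,
    )
    # decode only the newly generated bot tokens
    print("Hagrid:", tokenizer.decode(chat_history_ids[0, bot_input_ids.shape[-1]:], skip_special_tokens=True))
```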
{"tags": ["conversational"]}
text-generation
NoLawz/DialoGPT-medium-hagrid
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Hagrid DialoGPT medium model
[ "# Hagrid DialoGPT medium model" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Hagrid DialoGPT medium model" ]
[ 51, 9 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Hagrid DialoGPT medium model" ]
[ -0.02612447366118431, 0.03125859797000885, -0.005465014837682247, -0.017103498801589012, 0.12912800908088684, 0.011341667734086514, 0.16993126273155212, 0.1146710142493248, -0.0014888111036270857, -0.05214007943868637, 0.11353417485952377, 0.11798577755689621, 0.009491045027971268, 0.09862355887889862, -0.05189470574259758, -0.29999858140945435, 0.04623280093073845, 0.04126078635454178, -0.030713584274053574, 0.10574447363615036, 0.11975277960300446, -0.039536211639642715, 0.08672836422920227, 0.03306155651807785, -0.17841561138629913, 0.016369104385375977, -0.0031689326278865337, -0.12398496270179749, 0.11356732249259949, 0.0891374796628952, 0.012314372695982456, 0.005885031074285507, -0.042975351214408875, -0.1556183099746704, 0.032301247119903564, -0.02044345997273922, -0.04454407840967178, 0.040514882653951645, 0.03590012341737747, -0.07834040373563766, 0.16682739555835724, 0.06017531827092171, -0.02467423304915428, 0.05053458362817764, -0.1256013959646225, -0.056222401559352875, 0.01885753870010376, 0.06850068271160126, 0.04894033446907997, 0.09130051732063293, -0.04289005696773529, 0.04157521203160286, -0.0776783674955368, 0.09681543707847595, 0.10912396013736725, -0.31655243039131165, 0.00039882471901364625, 0.15689817070960999, 0.035054564476013184, 0.06984716653823853, -0.05650101229548454, 0.0993376076221466, 0.0226193368434906, 0.003749110270291567, -0.011549750342965126, -0.06795787066221237, -0.04780314117670059, 0.03222631290555, -0.09254670888185501, -0.013016230426728725, 0.2280910313129425, -0.06307430565357208, 0.08422332257032394, -0.10630134493112564, -0.09504716098308563, -0.010399957187473774, -0.04133929684758186, -0.011423241347074509, -0.07010722905397415, 0.0660477876663208, 0.015396373346447945, -0.1020154356956482, -0.12969084084033966, -0.008664920926094055, -0.15690557658672333, 0.17685595154762268, 0.0449749119579792, 0.0405304878950119, -0.1999003291130066, 0.10882304608821869, -0.012549849227070808, -0.09387105703353882, 0.028925348073244095, -0.071354441344738, -0.003379823174327612, 0.026545042172074318, -0.036619726568460464, -0.03265642374753952, 0.09628503024578094, 0.07061691582202911, -0.02992117591202259, 0.016917811706662178, 0.010669948533177376, 0.04811893776059151, 0.07017271220684052, 0.04648629575967789, -0.028681602329015732, -0.10436096787452698, 0.023310258984565735, -0.11490397155284882, 0.003884287551045418, -0.04879801347851753, -0.1744265854358673, -0.050108619034290314, 0.03572900965809822, 0.04234049841761589, -0.008300922811031342, 0.0975603312253952, 0.015533731319010258, -0.04917332902550697, 0.02803714945912361, -0.028048835694789886, -0.03500344231724739, -0.009757825173437595, 0.02557508274912834, 0.19340792298316956, 0.011731812730431557, 0.029621928930282593, -0.13022451102733612, 0.016784770414233208, -0.0493583083152771, 0.018975447863340378, -0.00009357032831758261, -0.039949625730514526, -0.011514173820614815, -0.04893605038523674, -0.009227287024259567, -0.14989599585533142, -0.16865010559558868, 0.00019023288041353226, 0.005199717357754707, -0.05036449432373047, -0.08204849064350128, -0.08832885324954987, -0.002446272410452366, 0.06357204914093018, -0.0698256641626358, -0.021596774458885193, -0.050902001559734344, 0.07421309500932693, -0.018902581185102463, 0.09059560298919678, -0.09793207049369812, 0.07725780457258224, -0.05148657038807869, -0.029448356479406357, -0.09487326443195343, 0.12818893790245056, -0.019098663702607155, 0.06997636705636978, -0.034874431788921356, -0.03446318209171295, -0.046101249754428864, 
0.08614461123943329, -0.034631334245204926, 0.2158610224723816, -0.06641311943531036, -0.11758945882320404, 0.27465933561325073, -0.06761948764324188, -0.0980028361082077, 0.17671000957489014, -0.002677186857908964, 0.07962682843208313, 0.130503311753273, 0.19045625627040863, 0.019827621057629585, -0.01761631853878498, 0.08476202934980392, 0.10946305096149445, -0.0630221739411354, -0.010987555608153343, 0.023101821541786194, -0.02177833393216133, -0.06987493485212326, 0.04219205677509308, 0.07961179316043854, 0.08314093202352524, -0.043640974909067154, -0.0233318954706192, -0.007174237631261349, 0.0027041202411055565, 0.10153929144144058, -0.046130694448947906, 0.1378287672996521, -0.024074042215943336, -0.05614786967635155, 0.011757968924939632, 0.04978416487574577, -0.06213855370879173, 0.04217028245329857, -0.05324627086520195, 0.0779825821518898, -0.04291015863418579, 0.06932507455348969, -0.1088004931807518, -0.005191020667552948, -0.06735944747924805, 0.15566495060920715, 0.06383359432220459, 0.15283456444740295, 0.06404906511306763, -0.03931128978729248, -0.02987639233469963, 0.04316677898168564, 0.17493557929992676, -0.020350705832242966, -0.08339252322912216, -0.09954355657100677, 0.08963549137115479, -0.04115881025791168, 0.12862272560596466, -0.08139139413833618, 0.006081463769078255, 0.005476512014865875, 0.11472019553184509, -0.02295585162937641, 0.048240624368190765, 0.005876144394278526, -0.028282763436436653, -0.06261551380157471, 0.012365439906716347, 0.07050828635692596, -0.011232484132051468, -0.10332606732845306, 0.24687831103801727, -0.1593921184539795, 0.1862850785255432, 0.16272911429405212, -0.2169223576784134, 0.01506828237324953, -0.15919126570224762, -0.010249003767967224, 0.016380049288272858, 0.042811088263988495, -0.03592045605182648, 0.19635795056819916, -0.02897677756845951, 0.17699503898620605, -0.03890443220734596, -0.048554956912994385, -0.02747524529695511, -0.08252647519111633, 0.01750326156616211, 0.09294488281011581, -0.013531433418393135, -0.16060185432434082, 0.1561095267534256, 0.0699194073677063, 0.013366454280912876, 0.22892454266548157, 0.039451707154512405, 0.005754795856773853, 0.053830135613679886, -0.004597462248057127, -0.06120077148079872, -0.05854574590921402, -0.26651453971862793, -0.06691331416368484, 0.07380498200654984, 0.0464882031083107, 0.12333641201257706, -0.07800780981779099, -0.007004427257925272, 0.0006397552788257599, -0.03482862934470177, 0.06483614444732666, 0.13473232090473175, 0.01647602953016758, 0.11091972887516022, -0.011494593694806099, -0.020231664180755615, 0.05007858946919441, 0.009254630655050278, -0.07956375181674957, 0.19160792231559753, -0.12916558980941772, -0.33709168434143066, -0.12238256633281708, -0.20383092761039734, -0.07239551097154617, 0.04640621691942215, 0.11576113104820251, -0.1467687487602234, -0.0338018462061882, 0.011372163891792297, 0.10749886184930801, -0.11968700587749481, -0.0017213758546859026, -0.041030630469322205, -0.013359371572732925, -0.11569638550281525, -0.0959797203540802, -0.048295777291059494, -0.04871366173028946, -0.06150653585791588, 0.1307898908853531, -0.11667875200510025, 0.021711960434913635, 0.22365443408489227, 0.06075911223888397, 0.07017737627029419, -0.03717781975865364, 0.19735965132713318, -0.1038370281457901, 0.004254621919244528, 0.1880931556224823, -0.00821672659367323, 0.06529366970062256, 0.10918246954679489, 0.0020518319215625525, -0.07277581095695496, 0.014852376654744148, -0.03042341023683548, -0.07227334380149841, -0.20591124892234802, -0.15151840448379517, 
-0.12733042240142822, 0.020282087847590446, 0.012595880776643753, 0.04580652341246605, 0.14037810266017914, 0.05912965163588524, -0.03638201951980591, 0.03741971030831337, 0.06945301592350006, 0.0803333967924118, 0.2836432158946991, -0.05873405188322067, 0.137521430850029, -0.024574514478445053, -0.16009758412837982, 0.08926939964294434, 0.0931706428527832, 0.11753413081169128, 0.03384804725646973, 0.09629165381193161, 0.04820074513554573, -0.01211460679769516, 0.13338565826416016, 0.03410148620605469, 0.017255570739507675, -0.03991256654262543, -0.0300237275660038, -0.04182286188006401, -0.00827382318675518, 0.05104263871908188, 0.05319385230541229, -0.16768884658813477, -0.008782483637332916, -0.028054602444171906, 0.07668640464544296, 0.05082622915506363, 0.06440326571464539, -0.1694207638502121, -0.020969262346625328, 0.07941029965877533, -0.047076232731342316, -0.12284094095230103, 0.06570057570934296, 0.02487858012318611, -0.1122736930847168, 0.06764931976795197, -0.020565509796142578, 0.09144812077283859, -0.07941773533821106, 0.07512824982404709, -0.12696602940559387, -0.016522422432899475, -0.0055769868195056915, 0.10790510475635529, -0.3162645101547241, 0.19858697056770325, -0.009438852779567242, -0.05645417422056198, -0.10646848380565643, -0.012319868430495262, 0.023127172142267227, 0.10385539382696152, 0.08569753170013428, 0.01748424395918846, 0.05966196209192276, 0.005487572867423296, -0.023945258930325508, 0.03325000777840614, 0.08943866193294525, -0.018156947568058968, -0.027219001203775406, -0.04088025540113449, -0.0006193870212882757, -0.021100511774420738, -0.05999293178319931, -0.0015417295508086681, -0.2155635505914688, 0.08702827244997025, 0.09409697353839874, 0.08464132994413376, 0.024831155315041542, -0.01970096491277218, -0.10837747156620026, 0.26775938272476196, -0.006587222218513489, -0.11574994027614594, -0.10969975590705872, -0.04454322159290314, 0.04590754956007004, -0.06251934170722961, 0.034659940749406815, -0.06499645113945007, 0.045130759477615356, -0.07721734046936035, -0.17784249782562256, 0.10872817039489746, -0.10070887207984924, -0.028753764927387238, -0.0022800748702138662, 0.2108772248029709, 0.017650438472628593, 0.006821594201028347, 0.0473637580871582, -0.005558047443628311, -0.14082880318164825, -0.09400993585586548, -0.020673973485827446, 0.11511050164699554, -0.026467476040124893, 0.05712289363145828, 0.004654386546462774, -0.07496540993452072, -0.022973254323005676, -0.04651409387588501, 0.3080841898918152, 0.1273382604122162, -0.04163241758942604, 0.19343113899230957, 0.16148591041564941, -0.05208834260702133, -0.26937538385391235, -0.12166254222393036, -0.07000868022441864, -0.014678578823804855, -0.10613017529249191, -0.14757221937179565, 0.09824340790510178, -0.015762049704790115, -0.021815620362758636, 0.10063193738460541, -0.2827250361442566, -0.10343768447637558, 0.18031081557273865, -0.043692462146282196, 0.42408841848373413, -0.1041746735572815, -0.07332572340965271, -0.09811225533485413, -0.18179526925086975, 0.1363883912563324, -0.029608052223920822, 0.12990787625312805, -0.02887525036931038, 0.19541585445404053, 0.044096916913986206, -0.01709926500916481, 0.10168944299221039, 0.06261534243822098, -0.04885629191994667, -0.07871998101472855, -0.0033940449357032776, -0.005362006835639477, -0.0009740076493471861, 0.05888915807008743, -0.0949724018573761, 0.01827722042798996, -0.1545950025320053, -0.0662769302725792, -0.08113081753253937, 0.03364650160074234, 0.036925412714481354, -0.07423581182956696, -0.013463303446769714, 
-0.06807049363851547, 0.005382141098380089, 0.007636787369847298, 0.13206233084201813, -0.11516054719686508, 0.15608246624469757, 0.07245038449764252, 0.12093348056077957, -0.15890364348888397, -0.0484667532145977, -0.0745636373758316, -0.04180702939629555, 0.08152702450752258, -0.11187256872653961, 0.020335933193564415, 0.11203034967184067, -0.021745599806308746, 0.09527356922626495, 0.08209311962127686, -0.012134767137467861, 0.010300069116055965, 0.08408541977405548, -0.22283735871315002, -0.05646595358848572, -0.07694323360919952, 0.014250436797738075, 0.07414720207452774, 0.1217079758644104, 0.21684630215168, -0.002260375302284956, -0.030157117173075676, 0.01308471616357565, 0.032477427273988724, -0.04488816112279892, 0.09428150951862335, -0.016192765906453133, 0.034459613263607025, -0.16795913875102997, 0.04027898982167244, -0.008567454293370247, -0.04966011270880699, 0.03055783174932003, 0.15925997495651245, -0.10796739906072617, -0.1148475706577301, -0.1030939593911171, 0.09547142684459686, -0.1567850559949875, -0.02727488987147808, -0.002387550426647067, -0.14789050817489624, 0.051701225340366364, 0.05736806243658066, 0.05669589340686798, 0.05124266818165779, -0.09396476298570633, -0.029758840799331665, -0.021868087351322174, -0.0016935411840677261, 0.06260049343109131, -0.002739581046625972, -0.04679397866129875, 0.0726933404803276, -0.03814932331442833, 0.11093692481517792, -0.09547725319862366, -0.09317389130592346, -0.14325807988643646, 0.03477432578802109, -0.16272740066051483, -0.06304043531417847, -0.11294326931238174, -0.037615176290273666, 0.0025377506390213966, -0.017376024276018143, -0.01723134145140648, -0.023043550550937653, -0.10417342931032181, 0.05389048531651497, -0.03137174993753433, 0.011212511919438839, -0.06479818373918533, 0.0318097323179245, 0.013377781957387924, -0.02744848094880581, 0.17905732989311218, 0.1335059106349945, -0.11533958464860916, 0.0836029201745987, -0.16072404384613037, -0.0873831957578659, 0.12092843651771545, -0.007265662308782339, 0.05544937029480934, 0.04323465749621391, 0.0030528916977345943, 0.05190630257129669, 0.07841961830854416, 0.05551479011774063, -0.0016163010150194168, -0.0669936016201973, 0.06105252727866173, -0.10836734622716904, -0.09529431909322739, -0.03349892795085907, -0.018764827400445938, 0.03538810834288597, 0.051277391612529755, 0.08112326264381409, -0.08153331279754639, 0.07238873094320297, -0.036201588809490204, 0.023547839373350143, 0.008892756886780262, -0.15834473073482513, -0.031751230359077454, -0.08294644206762314, 0.026636051014065742, -0.006693113595247269, 0.23773710429668427, 0.013230196200311184, -0.020279167219996452, 0.01297792699187994, 0.07531164586544037, -0.01276049017906189, -0.022953353822231293, 0.16191674768924713, 0.10029107332229614, -0.025701891630887985, -0.1034945473074913, 0.08088625222444534, 0.03897963464260101, 0.033458977937698364, 0.09765590727329254, -0.054870881140232086, -0.04381219297647476, 0.0958879217505455, -0.03351768106222153, -0.0061591570265591145, -0.14322935044765472, -0.08602304011583328, -0.05937374010682106, 0.06547808647155762, -0.05863840878009796, 0.10773856937885284, 0.1725088655948639, -0.019616402685642242, 0.018649600446224213, 0.011826636269688606, -0.06073998287320137, -0.1687115728855133, -0.1902121603488922, -0.06360720098018646, -0.16235381364822388, -0.004307079128921032, -0.11394775658845901, 0.06011056900024414, 0.047284845262765884, 0.11143720895051956, -0.040721457451581955, 0.06497430801391602, 0.07626333832740784, -0.12058745324611664, 
0.08281682431697845, -0.028255652636289597, 0.08861543238162994, -0.014578212052583694, 0.006621713750064373, -0.055296629667282104, 0.024136565625667572, 0.014497635886073112, 0.03144034743309021, -0.033983178436756134, 0.0002159841824322939, -0.12246459722518921, -0.06536736339330673, -0.06256567686796188, 0.08214560151100159, 0.02210577018558979, 0.14214350283145905, 0.02324095368385315, -0.06984039396047592, 0.014901523478329182, 0.21979856491088867, -0.0797201544046402, -0.11888709664344788, -0.05052179470658302, 0.22601380944252014, 0.010035770013928413, 0.10409059375524521, -0.043023861944675446, 0.0027768313884735107, -0.09709179401397705, 0.30701467394828796, 0.29878246784210205, -0.0909595638513565, 0.017606545239686966, -0.008124399930238724, 0.033031392842531204, 0.11775553226470947, 0.09392403811216354, 0.08706268668174744, 0.3392983675003052, -0.04187880456447601, -0.012666083872318268, -0.008366970345377922, -0.05249359458684921, -0.08926111459732056, 0.04832182824611664, 0.03556796908378601, -0.08057329803705215, 0.008386766538023949, 0.09035715460777283, -0.25125032663345337, 0.07002219557762146, -0.155099019408226, -0.15125742554664612, -0.07058722525835037, 0.006744207814335823, 0.08143384009599686, 0.040132757276296616, 0.0724765732884407, -0.0006183749064803123, -0.08010376989841461, 0.0933275818824768, 0.02482745423913002, -0.187064528465271, 0.009692001156508923, 0.08778635412454605, -0.015131836757063866, -0.08471367508172989, -0.01859470084309578, 0.05964980274438858, 0.0712030678987503, 0.04195099323987961, -0.012811120599508286, 0.025274094194173813, -0.027591535821557045, -0.07756587862968445, 0.035621412098407745, 0.05867896229028702, 0.027133207768201828, -0.11472445726394653, 0.07588011026382446, -0.15265920758247375, 0.017757082358002663, 0.031568944454193115, -0.003853451693430543, -0.019393520429730415, 0.014132614247500896, -0.05516384541988373, 0.07272326946258545, 0.06769019365310669, -0.015504332259297371, -0.014849268831312656, -0.028341779485344887, 0.0031265621073544025, -0.018393468111753464, -0.05070814490318298, -0.07227494567632675, -0.1716773957014084, -0.10604189336299896, 0.0663360208272934, 0.0014751972630620003, -0.18242409825325012, 0.0194186270236969, -0.13478463888168335, 0.07754436880350113, -0.12165701389312744, 0.10764607787132263, 0.0989023894071579, 0.036162812262773514, 0.0007287869229912758, -0.0017789136618375778, 0.02726641297340393, 0.07840649783611298, -0.13593891263008118, -0.0663413405418396 ]
null
null
transformers
# Harry Potter DialoGPT medium model
{"tags": ["conversational"]}
text-generation
NoLawz/DialoGPT-medium-harrypotter
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Harry Potter DialoGPT medium model
[ "# Harry Potter DialoGPT medium model" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Harry Potter DialoGPT medium model" ]
[ 51, 9 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Harry Potter DialoGPT medium model" ]
[ -0.0024977580178529024, 0.05409339815378189, -0.006310661323368549, 0.07305006682872772, 0.10804298520088196, 0.04573274403810501, 0.19380828738212585, 0.11601338535547256, -0.013694056309759617, -0.05868169665336609, 0.10053768754005432, 0.19540292024612427, 0.002394932322204113, 0.03318803757429123, -0.04586996138095856, -0.3182937502861023, 0.025568557903170586, 0.020934993401169777, -0.081796795129776, 0.12335750460624695, 0.10233762860298157, -0.054861292243003845, 0.0776563435792923, -0.014222286641597748, -0.138494074344635, -0.039647962898015976, 0.013597304001450539, -0.11846746504306793, 0.15340670943260193, 0.059525758028030396, 0.018356308341026306, -0.0035274142865091562, -0.06314051896333694, -0.1455739438533783, 0.03845865651965141, -0.029627809301018715, -0.008506120182573795, 0.04647129401564598, 0.02264532819390297, -0.0663866251707077, 0.17316320538520813, 0.15744663774967194, 0.08021627366542816, 0.027756713330745697, -0.137954443693161, -0.04761068522930145, 0.012220901437103748, 0.056576378643512726, -0.023120645433664322, 0.09395593404769897, -0.016449345275759697, 0.11505481600761414, -0.033861514180898666, 0.09992129355669022, 0.13579386472702026, -0.4009125232696533, -0.03829832002520561, 0.09562615305185318, 0.06709079444408417, 0.168238565325737, -0.10429108142852783, 0.03905867785215378, -0.01131750363856554, 0.025120841339230537, -0.015671078115701675, -0.08213542401790619, -0.06791539490222931, 0.02848048135638237, -0.12415656447410583, 0.014040512964129448, 0.24505198001861572, -0.11571863293647766, 0.03437020629644394, -0.10323590040206909, -0.07343616336584091, 0.049532562494277954, -0.052099067717790604, -0.07211261987686157, -0.05237491801381111, 0.06360426545143127, -0.014593405649065971, -0.07307592034339905, -0.08753487467765808, -0.004065509885549545, -0.1383076161146164, 0.18045169115066528, 0.054179996252059937, 0.051654864102602005, -0.20884719491004944, 0.10107976943254471, 0.06875880062580109, -0.05183155834674835, 0.016549408435821533, -0.12205825746059418, 0.04895119369029999, 0.011045144870877266, -0.02691776491701603, -0.053633254021406174, 0.03825540840625763, 0.11401127278804779, -0.03226279094815254, 0.03027181699872017, -0.0011105355806648731, 0.05618491768836975, 0.08005567640066147, 0.011345570906996727, 0.008766962215304375, -0.08686584234237671, 0.02112898789346218, -0.07366522401571274, 0.0030724676325917244, -0.047411851584911346, -0.17258229851722717, -0.06886457651853561, 0.035134490579366684, 0.013464611023664474, 0.03108517825603485, 0.07539452612400055, 0.005465383641421795, -0.06130552664399147, 0.036773450672626495, 0.01580825448036194, -0.020589105784893036, -0.01616959646344185, -0.017210500314831734, 0.2259691059589386, 0.016934575513005257, 0.008820499293506145, -0.11788606643676758, 0.0897785946726799, -0.06672801822423935, 0.012042876332998276, 0.019863959401845932, -0.0511244460940361, -0.0017133562359958887, 0.023604612797498703, 0.015724990516901016, -0.13142475485801697, -0.11238423734903336, -0.003473788732662797, -0.004814975894987583, -0.03568076342344284, -0.10142900049686432, -0.07786095142364502, -0.012278412468731403, 0.05865417420864105, -0.021861504763364792, 0.020783673971891403, -0.047658782452344894, 0.09757307171821594, -0.04224269092082977, 0.10408800840377808, -0.09560216963291168, 0.06116195395588875, -0.0675092563033104, -0.06229029595851898, -0.11983875930309296, 0.0578753836452961, 0.010199042968451977, 0.06513752043247223, 0.022731319069862366, -0.05043644830584526, -0.01264470536261797, 
0.052823081612586975, -0.0689421147108078, 0.18538585305213928, -0.06286175549030304, -0.12174028158187866, 0.2367950677871704, -0.09112734347581863, -0.16861560940742493, 0.13674864172935486, -0.018482647836208344, 0.07343670725822449, 0.13504354655742645, 0.1716022491455078, 0.01718003675341606, -0.020722532644867897, 0.07004079222679138, 0.07948054373264313, -0.09051232784986496, 0.05960690230131149, 0.03768547996878624, 0.006065391004085541, -0.08255023509263992, 0.058728910982608795, 0.06185218691825867, -0.005008653737604618, -0.04415011405944824, 0.0029417937621474266, -0.007629401981830597, -0.021773749962449074, 0.144802987575531, -0.027512140572071075, 0.1357192099094391, -0.07038508355617523, -0.035118453204631805, 0.015090901404619217, 0.02780558168888092, -0.03995562717318535, 0.08882667124271393, -0.04812933877110481, 0.0900799110531807, 0.0702313482761383, 0.05396898090839386, -0.12994739413261414, 0.02594396099448204, -0.02521059475839138, 0.19524464011192322, 0.09946878254413605, 0.07353609800338745, 0.064515620470047, -0.0077994586899876595, -0.07425323128700256, 0.04910963773727417, 0.13875305652618408, -0.04057198017835617, -0.12410508096218109, -0.16959643363952637, 0.06448838859796524, -0.0521848089993, 0.08626878261566162, -0.07679567486047745, 0.0245610810816288, -0.03876404091715813, 0.07328678667545319, -0.022762391716241837, 0.028232626616954803, 0.003124786773696542, -0.008777892217040062, -0.07958749681711197, 0.0011923541314899921, 0.08380249887704849, -0.0176983755081892, -0.08165358006954193, 0.17192605137825012, -0.16778916120529175, 0.18359126150608063, 0.21219803392887115, -0.28504371643066406, 0.013618825934827328, -0.15829983353614807, -0.011124917306005955, 0.02453530952334404, 0.07937260717153549, 0.03369840979576111, 0.21097467839717865, -0.01672356016933918, 0.1676463484764099, -0.058178775012493134, -0.07599509507417679, -0.06983578950166702, -0.04734876751899719, 0.013854938559234142, 0.0925099104642868, 0.012000511400401592, -0.11146374046802521, 0.1213819831609726, 0.10959048569202423, 0.050863854587078094, 0.14579439163208008, 0.10526817291975021, -0.0019164816476404667, 0.07533085346221924, -0.04390620067715645, -0.035989031195640564, -0.09775274991989136, -0.2931244373321533, -0.04851778969168663, 0.09109865128993988, 0.011483320966362953, 0.0929296612739563, -0.08295323699712753, -0.02552388422191143, 0.008217653259634972, 0.002128097228705883, 0.03790944814682007, 0.10154128819704056, 0.016044506803154945, 0.14713633060455322, -0.009506788104772568, -0.007852497510612011, 0.05928627401590347, 0.010187054984271526, -0.0867392048239708, 0.16581414639949799, -0.1876172423362732, -0.2825288474559784, -0.07891198247671127, -0.20964555442333221, -0.0172050129622221, 0.04688016697764397, 0.08655208349227905, -0.12711212038993835, -0.016486700624227524, -0.013959687203168869, 0.04140638932585716, -0.24045875668525696, 0.007275896146893501, -0.12124623358249664, 0.04413340240716934, -0.1841178685426712, -0.10060428082942963, -0.018355250358581543, -0.00747861061245203, -0.06185050308704376, 0.14772546291351318, -0.1379540115594864, -0.007818448357284069, 0.22642385959625244, 0.03484458476305008, 0.03661173954606056, -0.047986019402742386, 0.19657734036445618, -0.08912330120801926, -0.02063753455877304, 0.13375970721244812, -0.03231171518564224, 0.04990353435277939, 0.06867346167564392, -0.0051054926589131355, -0.0729794055223465, 0.020433280616998672, -0.07140588015317917, -0.0378623828291893, -0.27354028820991516, -0.0974477082490921, 
-0.0991104394197464, 0.08734440803527832, 0.02003346011042595, 0.05975013226270676, 0.13209526240825653, 0.040871474891901016, -0.02274523861706257, -0.014962222427129745, 0.1303679347038269, 0.12523391842842102, 0.2357497364282608, -0.0659012421965599, 0.10761690884828568, 0.0016945856623351574, -0.08584457635879517, 0.07311492413282394, 0.07197155058383942, 0.06632853299379349, 0.01773780770599842, 0.02374223992228508, -0.01578148826956749, 0.0720997229218483, 0.13727641105651855, 0.05471135675907135, 0.043265458196401596, 0.0018844562582671642, -0.05776076763868332, -0.013605489395558834, -0.043979961425065994, 0.05651538074016571, 0.05653969943523407, -0.12678585946559906, -0.07283435016870499, 0.00787642877548933, 0.0729125440120697, -0.016894953325390816, 0.050120025873184204, -0.13610422611236572, -0.0311382208019495, 0.08279184252023697, -0.07418949902057648, -0.1462244838476181, 0.10736793279647827, 0.003818033030256629, -0.18890056014060974, 0.07084894925355911, 0.00417760293930769, 0.0905657708644867, -0.059279680252075195, 0.04985617846250534, -0.11974311619997025, -0.12658186256885529, -0.022742878645658493, 0.0699898898601532, -0.3632541596889496, 0.13730348646640778, -0.011885346844792366, -0.027978917583823204, -0.05459921061992645, -0.029928386211395264, 0.013581736013293266, 0.1378537118434906, 0.05943959951400757, 0.001873193308711052, 0.11643184721469879, 0.04460277408361435, 0.036517176777124405, 0.007081347517669201, 0.11131004989147186, 0.03125942498445511, 0.003910559229552746, -0.03671886771917343, 0.003925291821360588, -0.05593422055244446, -0.07317325472831726, 0.0680680200457573, -0.2199474275112152, 0.08668415993452072, -0.010567491874098778, 0.06601543724536896, 0.035395506769418716, -0.021338604390621185, -0.09509134292602539, 0.1929571032524109, -0.025538131594657898, -0.11932512372732162, -0.07707009464502335, -0.031592827290296555, 0.09224192798137665, -0.010348214767873287, 0.011468124575912952, -0.04901091754436493, 0.03855464607477188, -0.12371570616960526, -0.16019988059997559, 0.07600343227386475, -0.04984232783317566, -0.10650286078453064, -0.02162783220410347, 0.19794243574142456, 0.001873658038675785, 0.0786115825176239, 0.009041151031851768, -0.005599118769168854, -0.1885293424129486, -0.04159443825483322, 0.015690453350543976, 0.032558999955654144, 0.0043706330470740795, 0.04612993821501732, 0.025508029386401176, 0.046577587723731995, -0.09174373000860214, -0.02002241089940071, 0.3025509715080261, 0.1359972506761551, -0.01033079344779253, 0.15151001513004303, 0.11884938925504684, -0.08729122579097748, -0.16642463207244873, -0.115115225315094, -0.11130718141794205, -0.07754094153642654, -0.11512837558984756, -0.1388731598854065, 0.0914175882935524, -0.04475122690200806, 0.03499004989862442, 0.1363106220960617, -0.27918344736099243, -0.11036349833011627, 0.11772024631500244, 0.006850861944258213, 0.34420016407966614, -0.12433212250471115, -0.04154377803206444, -0.07349374890327454, -0.16908255219459534, 0.07407160103321075, -0.059243664145469666, 0.1193019449710846, -0.0821237564086914, 0.20070292055606842, 0.011412781663239002, 0.010455090552568436, 0.08534015715122223, 0.040728725492954254, -0.05693338066339493, -0.07041158527135849, -0.08401323854923248, 0.009185504168272018, 0.031383685767650604, 0.00625891238451004, -0.04158390313386917, 0.021365230903029442, -0.1130356416106224, -0.05428561568260193, -0.058056849986314774, -0.00582648441195488, -0.0027491264045238495, -0.03752510994672775, -0.05272333696484566, -0.03140551596879959, 
-0.022214163094758987, 0.021472450345754623, 0.11110232025384903, -0.08592212945222855, 0.18646694719791412, 0.03694383054971695, 0.16860662400722504, -0.1546296775341034, -0.0572250559926033, -0.035480108112096786, -0.04839854687452316, 0.06830376386642456, -0.08431322872638702, -0.030055569484829903, 0.13756301999092102, -0.026310527697205544, 0.04782738536596298, 0.12959721684455872, 0.026243936270475388, 0.0253327377140522, 0.03715570643544197, -0.240749329328537, -0.05663067847490311, -0.03814368695020676, 0.028296560049057007, 0.04429348558187485, 0.057276736944913864, 0.20332548022270203, -0.029378250241279602, -0.07751712948083878, 0.015360837802290916, 0.026411941275000572, -0.030096575617790222, 0.11692231148481369, 0.015523595735430717, -0.0011545494198799133, -0.15817423164844513, 0.07309256494045258, -0.009325029328465462, -0.08117368072271347, 0.014117678627371788, 0.20170482993125916, -0.11907018721103668, -0.10232323408126831, -0.04818910360336304, 0.05588018149137497, -0.08513864874839783, 0.014407794922590256, 0.001476992852985859, -0.16064715385437012, 0.06922507286071777, 0.07567603141069412, 0.03904863819479942, 0.05632390081882477, -0.12411858886480331, -0.012963400222361088, -0.02802419848740101, 0.004171033389866352, 0.03807632252573967, 0.02182617597281933, -0.06267377734184265, 0.17592617869377136, -0.06651437282562256, 0.07913456857204437, -0.0752095878124237, -0.09438325464725494, -0.13258080184459686, 0.05010896921157837, -0.08656524121761322, -0.06736676394939423, -0.11762558668851852, -0.04797423630952835, -0.0054679689928889275, -0.027610834687948227, 0.0038431757129728794, -0.06184081360697746, -0.1001250147819519, 0.020711245015263557, -0.008470823988318443, -0.018300380557775497, -0.05430687963962555, 0.026878036558628082, 0.047021493315696716, -0.03514599800109863, 0.15711970627307892, 0.21208727359771729, -0.14378419518470764, 0.09969767928123474, -0.11970078945159912, -0.10789360105991364, 0.05391629412770271, 0.01746884547173977, 0.03304065018892288, 0.07224863767623901, -0.02791825495660305, 0.019769981503486633, 0.049446143209934235, 0.0822361409664154, 0.07499945908784866, -0.05456947535276413, 0.038484837859869, -0.07041694223880768, -0.1377437710762024, -0.0208374485373497, -0.03517032414674759, 0.08145865797996521, 0.007990995422005653, 0.10837215930223465, -0.06420278549194336, 0.07387887686491013, -0.07199127227067947, 0.049532435834407806, 0.023151524364948273, -0.14471428096294403, -0.01595647819340229, -0.07270932197570801, 0.03872676566243172, -0.010343276895582676, 0.2148599624633789, -0.022952035069465637, -0.015907641500234604, 0.03086635284125805, 0.05630023777484894, -0.0365104079246521, -0.0031087142415344715, 0.1881864070892334, 0.12265247106552124, -0.0754636898636818, -0.03126250207424164, 0.06034655123949051, 0.07215908169746399, 0.06751780211925507, 0.10148171335458755, -0.014541908167302608, 0.01841651275753975, 0.07914777845144272, -0.05464780330657959, 0.08615036308765411, -0.09704257547855377, -0.1327018290758133, 0.05145711451768875, 0.017533734440803528, -0.06544352322816849, 0.20898163318634033, 0.18844282627105713, -0.02525186352431774, 0.012165067717432976, 0.004593114834278822, -0.09717927128076553, -0.15279093384742737, -0.05423359572887421, -0.07353794574737549, -0.14062462747097015, -0.006946035195142031, -0.16188809275627136, 0.03400474041700363, -0.025893334299325943, 0.10246588289737701, -0.06422103941440582, -0.020717386156320572, 0.13826332986354828, -0.10013318806886673, 0.09237761795520782, 
-0.02317359298467636, 0.08649574220180511, -0.02811412326991558, -0.009385584853589535, -0.09472592920064926, 0.004500578623265028, 0.011476312763988972, 0.06750277429819107, -0.11224113404750824, 0.01752299629151821, -0.11629372835159302, -0.0801374688744545, -0.03084816411137581, 0.09004639089107513, -0.01644342765212059, 0.16458401083946228, 0.028610089793801308, -0.08300105482339859, -0.008347717113792896, 0.2523747682571411, -0.08408698439598083, -0.10667939484119415, -0.025212638080120087, 0.1940273642539978, 0.02710612490773201, 0.06208845600485802, 0.007083544973284006, 0.03422300145030022, -0.07818301767110825, 0.2822102904319763, 0.3579750955104828, -0.1452392339706421, -0.007135095540434122, 0.03624914586544037, 0.042349427938461304, 0.14638209342956543, 0.07983972132205963, 0.12121576815843582, 0.3073844015598297, -0.08173356205224991, 0.016942810267210007, -0.04410826414823532, -0.005236413329839706, -0.07077188044786453, 0.03188173845410347, 0.07025037705898285, -0.09573553502559662, -0.018225176259875298, 0.08517606556415558, -0.28894340991973877, 0.11709441989660263, -0.14684024453163147, -0.14782756567001343, -0.03969718888401985, -0.006554015912115574, 0.022338638082146645, 0.012370671145617962, 0.08142491430044174, 0.022414278239011765, -0.06986717134714127, 0.06519460678100586, 0.022829443216323853, -0.2188921421766281, -0.0056070974096655846, 0.1510927826166153, -0.059523824602365494, -0.039138540625572205, -0.023378700017929077, 0.0582437701523304, 0.06019756942987442, 0.08654139935970306, -0.033490825444459915, -0.06292715668678284, -0.02176622860133648, -0.04238615557551384, 0.026547910645604134, 0.052201371639966965, 0.06832458078861237, -0.08711428940296173, 0.11301964521408081, -0.05584822595119476, 0.06178727000951767, 0.0457632802426815, -0.016272155568003654, -0.0099460668861866, 0.008889105170965195, -0.07015834003686905, 0.05288847163319588, 0.12447007745504379, -0.026420829817652702, 0.0008194451220333576, -0.020400751382112503, -0.07391297072172165, -0.030531801283359528, -0.02668210305273533, -0.08168911188840866, -0.18579699099063873, -0.12910102307796478, 0.03631117194890976, -0.021401386708021164, -0.22240474820137024, 0.004323441535234451, -0.1240076869726181, 0.041043005883693695, -0.1412169337272644, 0.09691870212554932, 0.080451101064682, -0.0004420911427587271, 0.006481901742517948, 0.07179353386163712, 0.04028426855802536, 0.14118239283561707, -0.17587162554264069, -0.034531619399785995 ]
null
null
transformers
# Spong Bob DialoGPT medium model
{"tags": ["conversational"]}
text-generation
NoLawz/DialoGPT-medium-spongebob
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Spong Bob DialoGPT medium model
[ "# Spong Bob DialoGPT medium model" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Spong Bob DialoGPT medium model" ]
[ 51, 10 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Spong Bob DialoGPT medium model" ]
[ -0.015270689502358437, 0.017581520602107048, -0.005653976462781429, -0.0050736372359097, 0.1424090415239334, 0.0011507944436743855, 0.17437851428985596, 0.088934026658535, -0.007904027588665485, -0.03975997492671013, 0.11299329996109009, 0.11729754507541656, -0.002789085265249014, 0.15563495457172394, -0.06060442700982094, -0.33603399991989136, 0.04790915548801422, 0.049255114048719406, 0.012883475981652737, 0.13053622841835022, 0.10396837443113327, -0.05264059081673622, 0.05856100469827652, -0.01483706571161747, -0.16636686027050018, 0.00882889423519373, 0.007196825463324785, -0.10757580399513245, 0.0876457467675209, 0.071229487657547, 0.026967467740178108, 0.03253539279103279, -0.036384936422109604, -0.13972227275371552, 0.0438251830637455, -0.016967276111245155, -0.02696995809674263, 0.05637429654598236, 0.009815992787480354, -0.035142932087183, 0.15456652641296387, 0.12361985445022583, 0.0023106341250240803, 0.0485197938978672, -0.1418035626411438, 0.003507777117192745, 0.01297715213149786, 0.10126487165689468, 0.03484153747558594, 0.09505210816860199, -0.03336906060576439, 0.11287681013345718, -0.058505501598119736, 0.1084357351064682, 0.10599003732204437, -0.2894432842731476, -0.025388017296791077, 0.12357306480407715, 0.038470663130283356, 0.10365598648786545, -0.06711280345916748, 0.0686984658241272, 0.0015349567402154207, 0.001197229721583426, -0.057631731033325195, -0.07993508130311966, -0.08183574676513672, 0.007724597118794918, -0.10531259328126907, -0.008667324669659138, 0.258102685213089, -0.024200210347771645, 0.05666595697402954, -0.07497946918010712, -0.08854455500841141, -0.011659123934805393, -0.04360140487551689, -0.0099620521068573, -0.09731975942850113, 0.07223788648843765, -0.03542493283748627, -0.12668046355247498, -0.11481595039367676, -0.04490179196000099, -0.17090986669063568, 0.17991028726100922, 0.01689166948199272, 0.05686621740460396, -0.21233631670475006, 0.10468315333127975, 0.036703936755657196, -0.07952280342578888, 0.028594132512807846, -0.09117620438337326, 0.01888510398566723, 0.029629522934556007, -0.011879677884280682, -0.006330419797450304, 0.061787597835063934, 0.1536771059036255, 0.02942678891122341, 0.029444169253110886, -0.019104933366179466, 0.060758646577596664, 0.04294558987021446, 0.020798835903406143, -0.04133347049355507, -0.03577854856848717, 0.028712661936879158, -0.08313598483800888, -0.015525570139288902, -0.06113721802830696, -0.17819151282310486, -0.008131500333547592, 0.05050832778215408, 0.036385856568813324, 0.027101876214146614, 0.11884161829948425, 0.013338048942387104, -0.05138231813907623, 0.04022619500756264, -0.01669776625931263, -0.01922275312244892, 0.01219521090388298, 0.003825747175142169, 0.14598040282726288, 0.02132064662873745, 0.03015066124498844, -0.10144185274839401, 0.042959146201610565, -0.06148052588105202, 0.002099202247336507, -0.026893723756074905, -0.06395091116428375, -0.005735492799431086, -0.05071883276104927, 0.007059617433696985, -0.1461726427078247, -0.15975679457187653, -0.010637998580932617, -0.022922325879335403, -0.04269803315401077, -0.10572265088558197, -0.09528565406799316, -0.020274624228477478, 0.053727176040410995, -0.07612575590610504, 0.04550784453749657, -0.04673904925584793, 0.08728475868701935, -0.016971398144960403, 0.0810851901769638, -0.14043200016021729, 0.06492743641138077, -0.0631873682141304, -0.03924347460269928, -0.09155260026454926, 0.10827943682670593, -0.02464810200035572, 0.04482852667570114, -0.0198224950581789, -0.028022948652505875, -0.11577221751213074, 
0.05599498376250267, -0.029225140810012817, 0.2258112132549286, -0.07898694276809692, -0.11046791076660156, 0.28317520022392273, -0.06848801672458649, -0.10780974477529526, 0.11217290163040161, -0.01137788500636816, 0.1032843366265297, 0.1184777095913887, 0.1522189825773239, 0.029056107625365257, -0.022271405905485153, 0.08467156440019608, 0.1140931025147438, -0.06441185623407364, -0.004368111956864595, 0.025503363460302353, -0.008806805126369, -0.07105051726102829, 0.033498089760541916, 0.058298856019973755, 0.09738966822624207, -0.07261622697114944, -0.0018794190837070346, 0.017720935866236687, -0.0021848026663064957, 0.07287782430648804, -0.03991711139678955, 0.13750749826431274, -0.013549343682825565, -0.03969907388091087, 0.07620732486248016, 0.009479169733822346, -0.03626835346221924, 0.04560871049761772, -0.08009564876556396, 0.020603131502866745, -0.032057780772447586, 0.06593247503042221, -0.1531931757926941, -0.0498473159968853, -0.05775762349367142, 0.20521917939186096, 0.09475141018629074, 0.12078727781772614, 0.06382997334003448, -0.06074805557727814, -0.019910449162125587, 0.03489917516708374, 0.14868701994419098, -0.011412864550948143, -0.07615690678358078, -0.08326004445552826, 0.10483682155609131, -0.06306435912847519, 0.057334594428539276, -0.03507058694958687, 0.011783629655838013, 0.042651813477277756, 0.11774201691150665, -0.005806330591440201, 0.0053789964877069, 0.024060199037194252, -0.015074950642883778, -0.046116821467876434, 0.009326876141130924, 0.09444570541381836, 0.005897758528590202, -0.11563222110271454, 0.23003743588924408, -0.18929897248744965, 0.13657747209072113, 0.17840595543384552, -0.21744483709335327, 0.009333135560154915, -0.14395946264266968, -0.020707562565803528, 0.001189841073937714, 0.054108139127492905, -0.05684245377779007, 0.20061323046684265, -0.016088426113128662, 0.16294056177139282, -0.03052297607064247, -0.04678164795041084, -0.031127290800213814, -0.04033411666750908, 0.012971791438758373, 0.10012349486351013, 0.08116964995861053, -0.15188591182231903, 0.1682027280330658, 0.04596640169620514, 0.03885238990187645, 0.21015040576457977, 0.02458205074071884, 0.0009420278365723789, 0.058374758809804916, -0.0013897167518734932, -0.0430818572640419, -0.052340567111968994, -0.26845231652259827, -0.02764247916638851, 0.07458125054836273, 0.046826645731925964, 0.11132574081420898, -0.08460540324449539, -0.01715485379099846, -0.01889209821820259, -0.023558782413601875, 0.042435433715581894, 0.12867195904254913, 0.0059843771159648895, 0.11429574340581894, -0.022596606984734535, -0.034248899668455124, 0.050136663019657135, 0.0049832831136882305, -0.08945471793413162, 0.14653834700584412, -0.12466374784708023, -0.32966119050979614, -0.14360634982585907, -0.185808926820755, -0.05339084565639496, 0.0286690816283226, 0.08701784163713455, -0.10500464588403702, -0.02548905275762081, -0.005230152513831854, 0.07918829470872879, -0.09839268773794174, 0.02782571129500866, -0.013124962337315083, -0.008901846595108509, -0.09949137270450592, -0.09109511971473694, -0.037893328815698624, -0.048998359590768814, -0.05243099108338356, 0.12487750500440598, -0.10667125880718231, 0.03357141464948654, 0.202745258808136, 0.082427978515625, 0.07881683111190796, -0.034310176968574524, 0.2134808748960495, -0.08479935675859451, 0.0021099941805005074, 0.20314444601535797, -0.0538954958319664, 0.052259352058172226, 0.10684258490800858, 0.0033810061868280172, -0.06508342176675797, 0.05691155418753624, -0.014465983025729656, -0.07874438166618347, -0.16937260329723358, 
-0.12225772440433502, -0.1351049244403839, 0.036945316940546036, 0.040072835981845856, 0.0514347180724144, 0.08169613033533096, 0.056622978299856186, -0.048523128032684326, 0.02794009819626808, 0.09040533006191254, 0.08013006299734116, 0.2404784858226776, -0.07887835055589676, 0.12920483946800232, -0.018341176211833954, -0.1653987318277359, 0.08200140297412872, 0.03280516341328621, 0.09342951327562332, 0.06664937734603882, 0.0797160193324089, 0.02619222365319729, -0.03593120723962784, 0.14692111313343048, 0.09944316744804382, -0.0009715889464132488, -0.02795945107936859, -0.03579696640372276, -0.04289066791534424, 0.00026110903127118945, 0.02488735318183899, 0.0432022325694561, -0.16128160059452057, 0.009535743854939938, -0.04689856618642807, 0.043781593441963196, 0.024594400078058243, 0.06862373650074005, -0.14601850509643555, -0.016403324902057648, 0.06661888211965561, -0.032311856746673584, -0.12954770028591156, 0.059133708477020264, 0.06961513310670853, -0.11953499168157578, 0.04998091980814934, -0.015545430593192577, 0.1286349892616272, -0.06145418807864189, 0.056540824472904205, -0.1325070708990097, -0.041870325803756714, -0.013615883886814117, 0.09431838244199753, -0.30352917313575745, 0.17747768759727478, -0.026356715708971024, -0.0696871280670166, -0.08467314392328262, -0.02483139932155609, 0.0227314755320549, 0.06652934849262238, 0.0715257003903389, 0.005717817228287458, 0.05000513419508934, 0.009785749018192291, -0.0829954445362091, 0.0334254689514637, 0.11753910034894943, -0.04340801015496254, -0.013579177670180798, -0.040324617177248, 0.000272429664619267, 0.004100595135241747, -0.08256494253873825, -0.030787082388997078, -0.18460321426391602, 0.06740497797727585, 0.09522891789674759, 0.08089147508144379, 0.018819523975253105, -0.0323239304125309, -0.043641120195388794, 0.21858657896518707, -0.06723494827747345, -0.08352423459291458, -0.1071559339761734, 0.038082804530858994, 0.06268899887800217, -0.06426259875297546, 0.0235445536673069, -0.06068338826298714, 0.029874110594391823, -0.05436675250530243, -0.1818050742149353, 0.11054390668869019, -0.0822444036602974, -0.05767303332686424, -0.025413691997528076, 0.1820792853832245, -0.009836233220994473, 0.02388739585876465, 0.00828927755355835, -0.010275932028889656, -0.1080346405506134, -0.12439252436161041, -0.015311759896576405, 0.04059732332825661, -0.03713242709636688, 0.061942700296640396, -0.02504955790936947, -0.044940050691366196, -0.04011591523885727, -0.03150298818945885, 0.3181450664997101, 0.14616145193576813, -0.03300515562295914, 0.16865983605384827, 0.12514357268810272, -0.04108856990933418, -0.2877480685710907, -0.11772088706493378, -0.06888780742883682, -0.03219745680689812, -0.10498273372650146, -0.15378014743328094, 0.08722775429487228, -0.051921818405389786, -0.015029911883175373, 0.11146814376115799, -0.24666965007781982, -0.11715139448642731, 0.19890551269054413, -0.026999497786164284, 0.4139988124370575, -0.09358631819486618, -0.09756386280059814, -0.06234941631555557, -0.13276222348213196, 0.17697419226169586, 0.035672660917043686, 0.11984231323003769, -0.01827697642147541, 0.16544410586357117, 0.0405266210436821, -0.008489067666232586, 0.0727829709649086, -0.004481041803956032, -0.05925333499908447, -0.09690260887145996, -0.07109037041664124, -0.006027987692505121, 0.03729194402694702, 0.03794986754655838, -0.07758170366287231, 0.025329552590847015, -0.11723137646913528, -0.062276531010866165, -0.0854838490486145, 0.03223884105682373, 0.041971754282712936, -0.05448947101831436, 0.0047264257445931435, 
-0.055281445384025574, -0.03198203071951866, 0.017441172152757645, 0.13022653758525848, -0.12195383757352829, 0.1323750764131546, 0.09749474376440048, 0.1240919679403305, -0.13805870711803436, -0.03520100936293602, -0.06494158506393433, -0.05154461786150932, 0.07323281466960907, -0.039531588554382324, 0.01718214899301529, 0.10274708271026611, -0.01483878493309021, 0.10749872028827667, 0.11174482107162476, -0.02091849409043789, 0.021717678755521774, 0.09997644275426865, -0.23516029119491577, -0.060815054923295975, -0.0732835903763771, 0.05318421497941017, 0.022135742008686066, 0.0612051896750927, 0.20209236443042755, -0.0022655504290014505, -0.020752428099513054, 0.017602527514100075, 0.021439328789711, -0.03700021281838417, 0.09907957911491394, 0.024833939969539642, 0.03231367468833923, -0.1533748358488083, 0.028775785118341446, -0.004879075568169355, -0.06394796818494797, 0.0017041901592165232, 0.14828050136566162, -0.12139683961868286, -0.10141883045434952, -0.04850214719772339, 0.08061142265796661, -0.10135384649038315, 0.010916396044194698, -0.022641779854893684, -0.1358751803636551, 0.07478788495063782, 0.10952151566743851, 0.04438238590955734, 0.05223798751831055, -0.1046970933675766, -0.000823936949018389, 0.0029473246540874243, -0.0011313107097521424, 0.04573088139295578, 0.003405390540137887, -0.06416492909193039, 0.09357094764709473, -0.028617175295948982, 0.0919136255979538, -0.09916947036981583, -0.08521001040935516, -0.17943578958511353, 0.036655280739068985, -0.06497883051633835, -0.07284465432167053, -0.07645783573389053, -0.03894852474331856, 0.005960249807685614, -0.05281843990087509, -0.033537622541189194, -0.033893171697854996, -0.1022496148943901, 0.023385504260659218, -0.038333844393491745, -0.002954755909740925, -0.07392708212137222, 0.03677225112915039, 0.07718254625797272, -0.03537895902991295, 0.15823283791542053, 0.15232110023498535, -0.10739676654338837, 0.0845954418182373, -0.185064435005188, -0.07996001094579697, 0.12821871042251587, 0.024231821298599243, 0.03681102767586708, 0.10018256306648254, 0.013337166048586369, 0.056908704340457916, 0.10006225109100342, 0.055712781846523285, 0.06819064170122147, -0.072735495865345, 0.028707636520266533, -0.09398376941680908, -0.09957627207040787, -0.04957707226276398, -0.023977259173989296, 0.03944922611117363, 0.09195421636104584, 0.0856570452451706, -0.0657154768705368, 0.06983423978090286, -0.059209927916526794, 0.05143046751618385, 0.021271735429763794, -0.1649022102355957, 0.036660753190517426, -0.08572223782539368, 0.042217861860990524, 0.027896810322999954, 0.2063131481409073, -0.005551084876060486, -0.07952912151813507, 0.0278166476637125, 0.09293145686388016, 0.014511588029563427, -0.01943226531147957, 0.19137787818908691, 0.1221068799495697, -0.04803621768951416, -0.10581886023283005, 0.07400454580783844, 0.03591512143611908, 0.01927121728658676, 0.13008074462413788, -0.032261233776807785, -0.07690215110778809, 0.06714026629924774, 0.013895807787775993, 0.028943998739123344, -0.06985219568014145, -0.1299077719449997, -0.03924883157014847, 0.050062816590070724, -0.0589841865003109, 0.16782347857952118, 0.13618546724319458, -0.01501546986401081, 0.015107517130672932, 0.01328610721975565, -0.05649847909808159, -0.1844302862882614, -0.17912408709526062, -0.0729612410068512, -0.14100271463394165, 0.010924936272203922, -0.11115606874227524, 0.0349043607711792, 0.015936054289340973, 0.09852945804595947, -0.0295263584703207, 0.06790224462747574, 0.038086093962192535, -0.12416158616542816, 0.07697942852973938, 
-0.0351032093167305, 0.05506926402449608, -0.05006246641278267, -0.0013921535573899746, -0.02872065268456936, 0.0533212348818779, 0.018179800361394882, 0.0324714295566082, -0.05869639664888382, 0.006685937754809856, -0.10182269662618637, -0.07527586817741394, -0.06917966902256012, 0.0485556460916996, 0.018610285595059395, 0.1754707247018814, 0.020088907331228256, -0.02618546411395073, 0.005976016633212566, 0.23325133323669434, -0.0706324353814125, -0.13229164481163025, -0.060785241425037384, 0.14856795966625214, -0.026583321392536163, 0.09822534769773483, -0.061444200575351715, 0.0025142834056168795, -0.09484976530075073, 0.31971922516822815, 0.30222129821777344, -0.12471975386142731, 0.0213554035872221, 0.006067598704248667, 0.03698176145553589, 0.12858188152313232, 0.10447724163532257, 0.10611458122730255, 0.3230636715888977, -0.03784503415226936, 0.006495732348412275, -0.005307706072926521, -0.0337146632373333, -0.042268674820661545, 0.004049771465361118, 0.06633226573467255, -0.0630013719201088, -0.04463944956660271, 0.11069490015506744, -0.26745444536209106, 0.0602552592754364, -0.1592799723148346, -0.156789168715477, -0.05594781041145325, -0.006602688692510128, 0.07814458757638931, 0.026769736781716347, 0.05829213187098503, -0.013924776576459408, -0.03665933012962341, -0.00630079535767436, 0.028736256062984467, -0.19716039299964905, -0.022935448214411736, 0.08218245208263397, -0.053504619747400284, -0.057805728167295456, -0.03211619332432747, 0.07067545503377914, 0.0682472214102745, 0.03601526841521263, -0.012808039784431458, 0.04823407158255577, -0.03702613711357117, -0.07350292056798935, 0.023064590990543365, 0.05871596559882164, 0.012314289808273315, -0.07380156964063644, 0.06154005229473114, -0.15085521340370178, 0.026913026347756386, 0.011425433680415154, -0.029167547821998596, -0.02754899300634861, 0.02917500026524067, -0.07590126991271973, 0.08640049397945404, 0.08389919996261597, -0.032187722623348236, -0.017700528725981712, -0.024150744080543518, -0.012332528829574585, -0.03718274459242821, -0.049057044088840485, -0.08007554709911346, -0.16369396448135376, -0.12804551422595978, 0.06319708377122879, 0.007631123531609774, -0.22755733132362366, 0.02116730809211731, -0.17070746421813965, 0.0660037100315094, -0.11811500787734985, 0.09796389192342758, 0.08615485578775406, 0.0204718429595232, 0.0037807119078934193, -0.06776662170886993, 0.03899224102497101, 0.07220359891653061, -0.1530776470899582, -0.06641124933958054 ]
null
null
transformers
# NLGP docstring model

The NLGP docstring model was introduced in the paper [Natural Language-Guided Programming](https://arxiv.org/abs/2108.05198). The model was trained on a collection of Jupyter notebooks and can be used to synthesize Python code that addresses a natural language **intent** in a certain code **context** (see the example below). 
Also see the [NLGP natural](https://huggingface.co/Nokia/nlgp-natural) model.

This work was carried out by a research team in Nokia Bell Labs.

**Context**
```py
import matplotlib.pyplot as plt

values = [1, 2, 3, 4]
labels = ["a", "b", "c", "d"]
```
**Intent**
```py
# plot a bar chart
```
**Prediction**
```py
plt.bar(labels, values)
plt.show()
```

## Usage

```py
import re
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# load the model
tok = GPT2TokenizerFast.from_pretrained("Nokia/nlgp-docstring")
model = GPT2LMHeadModel.from_pretrained("Nokia/nlgp-docstring")

# preprocessing functions
num_spaces = [2, 4, 6, 8, 10, 12, 14, 16, 18]

def preprocess(context, query):
    """
    Encodes context + query as a single string and replaces whitespace with special tokens <|2space|>, <|4space|>, ...
    """
    input_str = f"{context}\n{query} <|endofcomment|>\n"
    indentation_symbols = {n: f"<|{n}space|>" for n in num_spaces}

    m = re.match("^[ ]+", input_str)
    if not m:
        return input_str
    leading_whitespace = m.group(0)
    N = len(leading_whitespace)

    for n in num_spaces:
        leading_whitespace = leading_whitespace.replace(n * " ", indentation_symbols[n])

    return leading_whitespace + input_str[N:]

detokenize_pattern = re.compile(r"<\|(\d+)space\|>")

def postprocess(output):
    output = output.split("<|cell|>")[0]
    def insert_space(m):
        num_spaces = int(m.group(1))
        return num_spaces * " "
    return detokenize_pattern.sub(insert_space, output)

# inference
code_context = """
import matplotlib.pyplot as plt

values = [1, 2, 3, 4]
labels = ["a", "b", "c", "d"]
"""
query = "# plot a bar chart"

input_str = preprocess(code_context, query)
input_ids = tok(input_str, return_tensors="pt").input_ids

max_length = 150  # don't generate output longer than this length
total_max_length = min(1024, input_ids.shape[-1] + max_length)  # total = input + output, capped at the model's 1024-token context window

input_and_output = model.generate(
    input_ids=input_ids,
    max_length=total_max_length,
    min_length=10,
    do_sample=False,
    num_beams=4,
    early_stopping=True,
    eos_token_id=tok.encode("<|cell|>")[0]
)

output = input_and_output[:, input_ids.shape[-1]:]  # remove the tokens that correspond to the input_str
output_str = tok.decode(output[0])

postprocess(output_str)
```

## License and copyright

Copyright 2021 Nokia

Licensed under the Apache License 2.0

SPDX-License-Identifier: Apache-2.0
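Addendum to the Usage section: the snippet above spells out preprocessing, generation and postprocessing as separate steps. For repeated queries it can be convenient to fold them into a single helper. The sketch below is only a thin wrapper over the calls already shown; the function name `synthesize` and the `max_new_tokens` parameter are our own naming, not part of the original release, and it assumes `tok`, `model`, `preprocess` and `postprocess` are defined exactly as in the Usage section.

```py
# Thin convenience wrapper over the Usage snippet above.
# Assumes tok, model, preprocess and postprocess are already defined as shown there;
# the name `synthesize` and the `max_new_tokens` argument are illustrative choices.
def synthesize(context, intent, max_new_tokens=150):
    input_str = preprocess(context, intent)
    input_ids = tok(input_str, return_tensors="pt").input_ids

    # keep input + output within the model's 1024-token context window
    total_max_length = min(1024, input_ids.shape[-1] + max_new_tokens)

    input_and_output = model.generate(
        input_ids=input_ids,
        max_length=total_max_length,
        min_length=10,
        do_sample=False,
        num_beams=4,
        early_stopping=True,
        eos_token_id=tok.encode("<|cell|>")[0],
    )

    output = input_and_output[:, input_ids.shape[-1]:]  # drop the prompt tokens
    return postprocess(tok.decode(output[0]))

print(synthesize(code_context, "# plot a bar chart"))
```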
{"language": ["en", "code"], "license": "apache-2.0", "tags": ["code completion", "code generation"]}
text-generation
Nokia/nlgp-docstring
[ "transformers", "pytorch", "gpt2", "text-generation", "code completion", "code generation", "en", "code", "arxiv:2108.05198", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2108.05198" ]
[ "en", "code" ]
TAGS #transformers #pytorch #gpt2 #text-generation #code completion #code generation #en #code #arxiv-2108.05198 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# NLGP docstring model The NLGP docstring model was introduced in the paper Natural Language-Guided Programming. The model was trained on a collection of Jupyter notebooks and can be used to synthesize Python code that addresses a natural language intent in a certain code context (see the example below). Also see the NLGP natural model. This work was carried out by a research team in Nokia Bell Labs. Context Intent Prediction ## Usage ## License and copyright Copyright 2021 Nokia Licensed under the Apache License 2.0 SPDX-License-Identifier: Apache-2.0
[ "# NLGP docstring model\n\nThe NLGP docstring model was introduced in the paper Natural Language-Guided Programming. The model was trained on a collection of Jupyter notebooks and can be used to synthesize Python code that addresses a natural language intent in a certain code context (see the example below). \nAlso see the NLGP natural model.\n\nThis work was carried out by a research team in Nokia Bell Labs.\n\nContext\n\n\nIntent\n\n\nPrediction", "## Usage", "## License and copyright\n\nCopyright 2021 Nokia\n\nLicensed under the Apache License 2.0\n\nSPDX-License-Identifier: Apache-2.0" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #code completion #code generation #en #code #arxiv-2108.05198 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# NLGP docstring model\n\nThe NLGP docstring model was introduced in the paper Natural Language-Guided Programming. The model was trained on a collection of Jupyter notebooks and can be used to synthesize Python code that addresses a natural language intent in a certain code context (see the example below). \nAlso see the NLGP natural model.\n\nThis work was carried out by a research team in Nokia Bell Labs.\n\nContext\n\n\nIntent\n\n\nPrediction", "## Usage", "## License and copyright\n\nCopyright 2021 Nokia\n\nLicensed under the Apache License 2.0\n\nSPDX-License-Identifier: Apache-2.0" ]
[ 75, 102, 3, 29 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #code completion #code generation #en #code #arxiv-2108.05198 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# NLGP docstring model\n\nThe NLGP docstring model was introduced in the paper Natural Language-Guided Programming. The model was trained on a collection of Jupyter notebooks and can be used to synthesize Python code that addresses a natural language intent in a certain code context (see the example below). \nAlso see the NLGP natural model.\n\nThis work was carried out by a research team in Nokia Bell Labs.\n\nContext\n\n\nIntent\n\n\nPrediction## Usage## License and copyright\n\nCopyright 2021 Nokia\n\nLicensed under the Apache License 2.0\n\nSPDX-License-Identifier: Apache-2.0" ]
[ -0.006953684613108635, 0.15751293301582336, -0.00213157432153821, 0.038426950573921204, 0.04653603583574295, -0.0643460676074028, 0.04797614738345146, 0.0411798432469368, -0.00791777390986681, -0.03501308709383011, 0.16468289494514465, 0.14731964468955994, 0.006748550105839968, 0.09411731362342834, 0.06155329570174217, -0.18411032855510712, 0.034526292234659195, 0.027762506157159805, -0.031522899866104126, 0.15123887360095978, 0.05344066023826599, 0.023530758917331696, 0.10223136842250824, 0.05739951506257057, -0.11160024255514145, -0.01655827835202217, -0.016271863132715225, -0.09104974567890167, 0.0785483717918396, 0.012632684782147408, 0.0014704720815643668, -0.0029933457262814045, 0.033227384090423584, -0.15197117626667023, -0.012182805687189102, -0.006801222451031208, 0.002648342866450548, 0.05520453304052353, 0.0025759164709597826, 0.018223751336336136, 0.25204524397850037, 0.05945315584540367, 0.019729482010006905, 0.025845099240541458, -0.11219228804111481, -0.14725008606910706, -0.0129545908421278, 0.02132503129541874, 0.022683989256620407, 0.13001172244548798, 0.007149724289774895, 0.08643707633018494, -0.03769660368561745, 0.08770602196455002, 0.09577250480651855, -0.2340225875377655, -0.01648871973156929, 0.22022928297519684, 0.0493307039141655, -0.012386983260512352, 0.0793432965874672, 0.0411340706050396, 0.08576510101556778, 0.041811078786849976, 0.08163853734731674, -0.08394929021596909, -0.05662061646580696, -0.013365592807531357, -0.16610048711299896, -0.09889750182628632, 0.23267953097820282, -0.09586625546216965, -0.02612798847258091, -0.058337219059467316, -0.06250151246786118, -0.07457713037729263, 0.04585348442196846, -0.03155991807579994, -0.010795158334076405, 0.08435799181461334, 0.13915158808231354, -0.07187686860561371, -0.056203145533800125, -0.08621019124984741, -0.05364334583282471, 0.1362771987915039, 0.025330018252134323, 0.061131801456213, -0.1413549780845642, 0.13847170770168304, -0.04681098088622093, -0.0543719120323658, -0.031023340299725533, -0.02180439978837967, 0.1330379694700241, 0.08178725838661194, -0.0187821201980114, 0.055045679211616516, 0.04480867087841034, 0.11522573977708817, 0.008716939017176628, 0.016570081934332848, 0.020431121811270714, 0.06407935172319412, 0.019363634288311005, 0.09057425707578659, 0.014624832198023796, -0.007437867112457752, 0.0787581354379654, -0.07779736071825027, 0.04221978411078453, -0.047079939395189285, -0.10178904235363007, -0.03387003391981125, -0.049480702728033066, 0.0868748277425766, 0.06069015711545944, 0.09400984644889832, -0.060564760118722916, -0.0264982171356678, 0.06768065690994263, -0.08567024767398834, -0.01776777021586895, -0.055368516594171524, 0.014465983025729656, 0.013867699541151524, 0.1386999487876892, 0.0011123919393867254, -0.13429738581180573, -0.10137834399938583, -0.0387306846678257, -0.03058193065226078, -0.1056932732462883, -0.016445230692625046, 0.046917326748371124, 0.009897609241306782, 0.0008134937379509211, -0.12459210306406021, -0.2935899794101715, -0.021304791793227196, 0.09008613973855972, 0.025794142857193947, -0.1343819499015808, -0.08788508176803589, -0.04077950119972229, -0.0018534555565565825, -0.08038665354251862, -0.13240133225917816, -0.053238775581121445, 0.060834214091300964, 0.002155862981453538, 0.050632450729608536, -0.11435481160879135, 0.032508160918951035, -0.17726700007915497, 0.043542273342609406, -0.0018187451642006636, 0.05696864798665047, -0.051438022404909134, 0.0033753118477761745, -0.05718974024057388, -0.10644497722387314, 0.015256334096193314, 
0.05479386821389198, 0.005701701622456312, 0.10955537110567093, -0.03252442926168442, -0.03760521113872528, 0.17822246253490448, -0.10768988728523254, -0.06220231577754021, 0.07883049547672272, 0.02438540570437908, 0.2381325662136078, 0.05161096900701523, 0.15064537525177002, 0.11571674048900604, -0.06333855539560318, 0.05442754551768303, 0.02875959686934948, -0.03806616738438606, -0.11906427890062332, 0.004655537661164999, -0.01678730733692646, -0.18873605132102966, 0.09571775048971176, -0.12314438819885254, 0.0713377594947815, 0.037149153649806976, -0.0877661481499672, -0.034795526415109634, -0.002775839064270258, 0.045754436403512955, -0.010693586431443691, 0.0047825477086007595, -0.014912256971001625, -0.04948122426867485, 0.023943297564983368, 0.08415314555168152, -0.07178635895252228, 0.03552944213151932, -0.10494635999202728, 0.056199561804533005, -0.016986818984150887, 0.03860253095626831, -0.10794014483690262, -0.03435125946998596, -0.006072815041989088, -0.1267639398574829, 0.09906662255525589, 0.0027988040819764137, 0.008754139766097069, 0.02471262775361538, 0.03073938749730587, -0.012911721132695675, -0.056583862751722336, -0.037691742181777954, 0.01502335537225008, -0.08922677487134933, 0.04978124424815178, -0.02295316942036152, 0.04138126224279404, -0.13611365854740143, 0.06518921256065369, -0.046807702630758286, -0.009757809340953827, -0.03835931420326233, -0.0014786467654630542, -0.00944623164832592, 0.07253411412239075, -0.06232323870062828, 0.0038920340593904257, 0.06968086957931519, 0.01432803738862276, -0.014599255286157131, 0.10616850852966309, -0.10507737845182419, 0.03863653168082237, 0.10457388311624527, -0.1496865451335907, -0.021947475150227547, 0.05366136506199837, -0.03763167932629585, -0.008737342432141304, 0.011243104934692383, 0.014434934593737125, 0.22206462919712067, -0.06687053292989731, 0.06633498519659042, -0.08539613336324692, -0.02816871367394924, 0.011739310808479786, -0.1307913362979889, 0.03597450256347656, 0.05949121341109276, 0.17740309238433838, 0.038342513144016266, 0.12764309346675873, 0.10010645538568497, -0.0626298040151596, 0.18741613626480103, -0.018686214461922646, 0.00629799859598279, -0.024102231487631798, 0.03802434727549553, 0.006586107891052961, 0.032419804483652115, -0.14297078549861908, -0.019052810966968536, 0.05040320008993149, 0.03914754465222359, 0.08588603138923645, -0.14479871094226837, -0.032624658197164536, -0.003966470714658499, -0.05086927488446236, -0.04475283622741699, 0.027681127190589905, -0.09360753744840622, 0.058223430067300797, -0.05234365165233612, -0.11201194673776627, 0.06741711497306824, 0.030515609309077263, -0.10474090278148651, 0.15800240635871887, -0.08788906037807465, -0.30197960138320923, -0.18967024981975555, -0.015977036207914352, -0.07700712233781815, -0.0012177737662568688, 0.04994003847241402, -0.06487305462360382, -0.04673926159739494, -0.0629248172044754, 0.05610857158899307, -0.033163752406835556, -0.07380250096321106, -0.009067042730748653, -0.019209884107112885, -0.02195047028362751, -0.13546988368034363, -0.036099523305892944, 0.018777409568428993, -0.11114773154258728, 0.09972795099020004, -0.11681848019361496, 0.09625950455665588, 0.2526906430721283, 0.005340977571904659, 0.012706373818218708, -0.0011050855973735452, 0.11459242552518845, -0.02544499933719635, -0.0036760231014341116, 0.21649672091007233, 0.0425359383225441, 0.023803360760211945, 0.057832106947898865, -0.014331286773085594, -0.1334451586008072, 0.04456692561507225, -0.04375031962990761, -0.12587329745292664, 
-0.18632300198078156, -0.15646015107631683, -0.07954563945531845, 0.175076425075531, 0.0583057664334774, 0.03800384700298309, 0.10147443413734436, 0.05316883325576782, 0.03271038085222244, 0.06030375137925148, 0.06347151100635529, 0.11611203849315643, 0.24077074229717255, -0.03978859633207321, 0.07388988882303238, -0.04109602048993111, -0.06515693664550781, 0.04355882108211517, 0.0456666499376297, 0.10754533857107162, 0.003203644882887602, 0.12133778631687164, 0.05705646425485611, 0.16952574253082275, 0.07574862241744995, 0.09336121380329132, -0.004497897811233997, 0.09416960179805756, -0.01873622089624405, -0.09192532300949097, -0.10088036954402924, 0.05396152287721634, -0.1325226128101349, -0.08444184809923172, 0.07211078703403473, -0.058343105018138885, -0.0022718473337590694, 0.038199372589588165, 0.021508939564228058, -0.25204557180404663, -0.04408868029713631, 0.01867602951824665, 0.08093125373125076, -0.11695162206888199, 0.15050211548805237, 0.004662699066102505, -0.09589806199073792, 0.06369992345571518, -0.03722206875681877, 0.0868988037109375, -0.008341670967638493, 0.027307715266942978, -0.03617982566356659, -0.022805748507380486, 0.07420389354228973, 0.14377367496490479, -0.19966468214988708, 0.07605041563510895, -0.026019684970378876, 0.027093179523944855, -0.07167962193489075, 0.025832567363977432, -0.033327680081129074, 0.11337537318468094, 0.13719262182712555, 0.0638028234243393, 0.026202712208032608, -0.02927100472152233, -0.13214465975761414, 0.06851224601268768, -0.01672549359500408, -0.03393896296620369, -0.0033980580046772957, 0.007385440170764923, 0.030211258679628372, -0.01091067586094141, -0.14838822185993195, -0.020042981952428818, -0.11065793037414551, 0.06666617095470428, 0.040945954620838165, 0.09574052691459656, -0.0645156055688858, -0.03727297857403755, 0.03520917892456055, 0.21863862872123718, -0.006240225397050381, -0.11251623183488846, -0.0595976747572422, -0.12267899513244629, 0.07275237888097763, -0.056418806314468384, 0.047148339450359344, -0.011070579290390015, -0.045501723885536194, -0.023290351033210754, -0.15234489738941193, 0.12562376260757446, -0.07518334686756134, -0.10242041945457458, 0.0032918311189860106, 0.08898095786571503, 0.06394178420305252, 0.017531683668494225, 0.015158984810113907, 0.012585296295583248, -0.07954540103673935, -0.17374223470687866, -0.028319668024778366, 0.16388912498950958, -0.059925805777311325, 0.022392211481928825, 0.012783545069396496, -0.02732846513390541, 0.0343366339802742, -0.05360338091850281, 0.10352523624897003, 0.11305182427167892, -0.04117593169212341, 0.15280136466026306, 0.15335145592689514, -0.15691114962100983, -0.3132408559322357, -0.06629153341054916, -0.02482522465288639, 0.021606523543596268, -0.0388290211558342, -0.21147143840789795, 0.12289531528949738, 0.0029356207232922316, -0.054111115634441376, 0.13873526453971863, -0.3321467339992523, -0.09404697269201279, 0.12312241643667221, 0.04114505276083946, 0.30031058192253113, -0.07011964172124863, -0.007561081554740667, 0.010751484893262386, -0.22718799114227295, 0.21447844803333282, -0.10836880654096603, 0.05483593791723251, -0.04504309222102165, 0.1450025886297226, 0.017254427075386047, -0.047440070658922195, 0.10286706686019897, -0.043814003467559814, 0.022398317232728004, -0.06829007714986801, 0.04280112683773041, 0.06600438058376312, 0.03519282117486, 0.1793227642774582, -0.013494751416146755, 0.045497260987758636, -0.06704229861497879, -0.11477918922901154, -0.09778576344251633, 0.14585426449775696, -0.02681240625679493, -0.13545051217079163, 
-0.09692708402872086, 0.0009441347792744637, 0.06515337526798248, 0.01563211902976036, 0.10780837386846542, -0.03351941332221031, -0.03731410577893257, 0.18594418466091156, 0.15283557772636414, -0.008636819198727608, -0.023720335215330124, 0.012866100296378136, -0.04108850657939911, 0.1064947322010994, -0.1593860238790512, -0.026872584596276283, 0.13767367601394653, 0.029921183362603188, 0.016536366194486618, 0.0407240092754364, -0.09389454126358032, -0.028465989977121353, 0.03749578446149826, -0.15200835466384888, -0.07817471772432327, -0.03454186022281647, 0.04654116928577423, 0.05669557303190231, 0.0918903723359108, 0.11598425358533859, -0.1500627100467682, -0.038382478058338165, 0.025976037606596947, 0.054730337113142014, -0.07789646089076996, 0.06397204846143723, -0.0061562154442071915, -0.023288333788514137, -0.08954610675573349, 0.033334262669086456, 0.021678181365132332, 0.015405126847326756, 0.03487558662891388, 0.04960303381085396, -0.12455898523330688, -0.06895028054714203, 0.018374061211943626, 0.0845114067196846, -0.19342416524887085, -0.1318853348493576, -0.0058298613876104355, -0.09787330031394958, 0.010390844196081161, 0.04110852628946304, 0.07219061255455017, 0.02963496372103691, -0.05949556082487106, 0.003466136986389756, -0.05824105069041252, 0.012118443846702576, -0.019340580329298973, 0.02175230346620083, -0.07290008664131165, 0.08325093984603882, 0.04905296862125397, 0.051358502358198166, -0.04927162453532219, -0.06489056348800659, -0.0693831667304039, 0.047956183552742004, -0.035127971321344376, -0.0026223331224173307, -0.12155034393072128, 0.03954606130719185, 0.009765753522515297, 0.024152323603630066, -0.008090341463685036, 0.05062538757920265, -0.08920232206583023, 0.006722564343363047, -0.050900280475616455, 0.008517453446984291, -0.050773024559020996, 0.005467766430228949, 0.06937612593173981, -0.014969375915825367, 0.05418591573834419, -0.004198110196739435, -0.0670214369893074, 0.12624484300613403, -0.12158332020044327, 0.08816876262426376, 0.11254231631755829, 0.07338206470012665, -0.0373452790081501, -0.04123971611261368, -0.026684051379561424, 0.07589680701494217, -0.018224017694592476, 0.007786811329424381, -0.028036829084157944, -0.12657837569713593, 0.004777972120791674, 0.0004439805925358087, -0.1021001860499382, -0.016620881855487823, 0.018488747999072075, -0.06790805608034134, 0.10686029493808746, 0.11609020084142685, -0.06153717637062073, 0.005976476240903139, -0.054736461490392685, -0.0058407229371368885, 0.024915849789977074, -0.035168640315532684, -0.06265922635793686, -0.050152987241744995, -0.0026697118300944567, 0.009762438014149666, 0.20025506615638733, 0.04560910537838936, -0.1846100389957428, -0.08711434155702591, -0.041011884808540344, 0.08000978827476501, -0.07001954317092896, 0.12314177304506302, 0.02965550497174263, 0.04764861613512039, 0.010931073687970638, 0.033539433032274246, 0.013405326753854752, 0.07366538047790527, 0.09231147170066833, -0.10728417336940765, 0.0822090357542038, 0.05180658772587776, -0.044609468430280685, -0.007189016789197922, -0.11359293758869171, -0.06953071057796478, -0.01116738747805357, 0.11165926605463028, -0.009503324516117573, 0.18131588399410248, 0.13369318842887878, -0.047906070947647095, 0.07532356679439545, 0.04065714403986931, -0.0634608045220375, -0.14764167368412018, -0.3147086203098297, -0.029577041044831276, -0.11020035296678543, -0.04216345399618149, -0.11195164173841476, 0.010540354065597057, -0.020732346922159195, 0.01277942769229412, -0.0702044665813446, 0.0712152048945427, 
-0.019987745210528374, -0.08953443169593811, -0.023776285350322723, -0.043661002069711685, 0.029336975887417793, -0.08090192824602127, 0.026699531823396683, -0.06025620549917221, 0.04231971129775047, 0.05701795592904091, 0.05655132979154587, -0.013078168034553528, 0.04597686603665352, -0.0549798347055912, 0.03136328607797623, -0.06818747520446777, 0.06313980370759964, 0.004799941089004278, 0.04340861737728119, 0.06573934108018875, -0.09242963045835495, 0.045215506106615067, 0.12854337692260742, -0.022523481398820877, -0.15312033891677856, -0.05302079766988754, 0.2520101070404053, -0.02285648137331009, 0.04447377845644951, 0.0034473026171326637, 0.021267598494887352, -0.03947234898805618, 0.25856509804725647, 0.2014002799987793, 0.05050371214747429, -0.02845553494989872, -0.0422864593565464, 0.014291273429989815, 0.0016744325403124094, 0.09631886333227158, 0.10486255586147308, 0.41928404569625854, -0.05263269692659378, -0.04936283081769943, -0.08284586668014526, 0.0020507187582552433, -0.13167212903499603, -0.09617004543542862, -0.013933840207755566, -0.11466065049171448, -0.03800779953598976, 0.02311977557837963, -0.14642277359962463, 0.07481630146503448, 0.003445156617090106, -0.005345192737877369, -0.019129768013954163, -0.0009083861950784922, 0.010545357130467892, -0.014546753838658333, 0.051477182656526566, -0.06480167806148529, -0.010668782517313957, 0.0761803612112999, -0.00043227881542406976, -0.175465390086174, 0.029294034466147423, 0.05519270896911621, 0.049321919679641724, 0.1671338528394699, 0.01990865170955658, 0.12005014717578888, 0.043727051466703415, 0.0008014148916117847, -0.07284093648195267, 0.11461859941482544, -0.00502229668200016, 0.05017212778329849, -0.042231235653162, -0.08391544222831726, -0.01620127446949482, 0.046426285058259964, 0.01936337724328041, 0.05274955555796623, 0.009522229433059692, 0.10714764147996902, 0.007592716719955206, -0.06480148434638977, -0.06550572067499161, -0.10730338096618652, 0.09739565849304199, 0.05630910024046898, -0.03478081896901131, -0.044248417019844055, -0.10809236019849777, 0.0769713744521141, -0.0756043791770935, -0.060528017580509186, 0.013486156240105629, -0.05872103199362755, -0.08147267997264862, 0.05423522740602493, 0.034864697605371475, -0.05496295168995857, 0.0070306421257555485, -0.05170201510190964, 0.012934138998389244, -0.10757952928543091, 0.06001543998718262, 0.02107567898929119, -0.0031267337035387754, -0.008480988442897797, -0.025273345410823822, -0.006019929423928261, 0.10650009661912918, -0.07400388270616531, -0.1248089075088501 ]
null
null
transformers
# NLGP natural model

The NLGP natural model was introduced in the paper [Natural Language-Guided Programming](https://arxiv.org/abs/2108.05198). The model was trained on a collection of Jupyter notebooks and can be used to synthesize Python code that addresses a natural language **intent** in a certain code **context** (see the example below). This work was carried out by a research team in Nokia Bell Labs.

**Context**
```py
import matplotlib.pyplot as plt

values = [1, 2, 3, 4]
labels = ["a", "b", "c", "d"]
```
**Intent**
```py
# plot a bar chart
```
**Prediction**
```py
plt.bar(labels, values)
plt.show()
```

## Usage

```py
import re
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# load the model
tok = GPT2TokenizerFast.from_pretrained("Nokia/nlgp-natural")
model = GPT2LMHeadModel.from_pretrained("Nokia/nlgp-natural")

# preprocessing functions
num_spaces = [2, 4, 6, 8, 10, 12, 14, 16, 18]

def preprocess(context, query):
    """
    Encodes context + query as a single string and replaces whitespace with special tokens <|2space|>, <|4space|>, ...
    """
    input_str = f"{context}\n{query} <|endofcomment|>\n"
    indentation_symbols = {n: f"<|{n}space|>" for n in num_spaces}

    m = re.match("^[ ]+", input_str)
    if not m:
        return input_str
    leading_whitespace = m.group(0)
    N = len(leading_whitespace)

    for n in num_spaces:
        leading_whitespace = leading_whitespace.replace(n * " ", indentation_symbols[n])

    return leading_whitespace + input_str[N:]

detokenize_pattern = re.compile(r"<\|(\d+)space\|>")

def postprocess(output):
    output = output.split("<|cell|>")[0]
    def insert_space(m):
        num_spaces = int(m.group(1))
        return num_spaces * " "
    return detokenize_pattern.sub(insert_space, output)

# inference
code_context = """
import matplotlib.pyplot as plt

values = [1, 2, 3, 4]
labels = ["a", "b", "c", "d"]
"""
query = "# plot a bar chart"

input_str = preprocess(code_context, query)
input_ids = tok(input_str, return_tensors="pt").input_ids

max_length = 150  # don't generate output longer than this length
total_max_length = min(1024, input_ids.shape[-1] + max_length)  # total = input + output, capped at the model's 1024-token context window

input_and_output = model.generate(
    input_ids=input_ids,
    max_length=total_max_length,
    min_length=10,
    do_sample=False,
    num_beams=4,
    early_stopping=True,
    eos_token_id=tok.encode("<|cell|>")[0]
)

output = input_and_output[:, input_ids.shape[-1]:]  # remove the tokens that correspond to the input_str
output_str = tok.decode(output[0])

postprocess(output_str)
```

## License and copyright

Copyright 2021 Nokia

Licensed under the Apache License 2.0

SPDX-License-Identifier: Apache-2.0
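Addendum to the Usage section: the `generate` call above uses deterministic beam search (`do_sample=False, num_beams=4`). If you want several alternative completions for the same intent, `generate` also supports nucleus sampling. The sketch below is a hedged variant of the call above; the `top_p`, `temperature` and `num_return_sequences` values are illustrative, not hyperparameters recommended by the authors, and it assumes `tok`, `model`, `input_ids`, `total_max_length` and `postprocess` are defined as in the Usage section.

```py
# Sampling variant of the generate() call from the Usage section above.
# Assumes tok, model, input_ids, total_max_length and postprocess are defined as shown there;
# the sampling hyperparameters below are illustrative, not tuned values.
sampled = model.generate(
    input_ids=input_ids,
    max_length=total_max_length,
    min_length=10,
    do_sample=True,           # sample instead of beam search
    top_p=0.95,               # nucleus sampling
    temperature=0.8,
    num_return_sequences=3,   # draw a few candidate completions
    eos_token_id=tok.encode("<|cell|>")[0],
)

for seq in sampled:
    print(postprocess(tok.decode(seq[input_ids.shape[-1]:])))
    print("---")
```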
{"language": ["en", "code"], "license": "apache-2.0", "tags": ["code completion", "code generation"]}
text-generation
Nokia/nlgp-natural
[ "transformers", "pytorch", "gpt2", "text-generation", "code completion", "code generation", "en", "code", "arxiv:2108.05198", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2108.05198" ]
[ "en", "code" ]
TAGS #transformers #pytorch #gpt2 #text-generation #code completion #code generation #en #code #arxiv-2108.05198 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# NLGP natural model The NLGP natural model was introduced in the paper Natural Language-Guided Programming. The model was trained on a collection of Jupyter notebooks and can be used to synthesize Python code that addresses a natural language intent in a certain code context (see the example below). This work was carried out by a research team in Nokia Bell Labs. Context Intent Prediction ## Usage ## License and copyright Copyright 2021 Nokia Licensed under the Apache License 2.0 SPDX-License-Identifier: Apache-2.0
[ "# NLGP natural model\n\nThe NLGP natural model was introduced in the paper Natural Language-Guided Programming. The model was trained on a collection of Jupyter notebooks and can be used to synthesize Python code that addresses a natural language intent in a certain code context (see the example below). This work was carried out by a research team in Nokia Bell Labs.\n\nContext\n\n\nIntent\n\n\nPrediction", "## Usage", "## License and copyright\n\nCopyright 2021 Nokia\n\nLicensed under the Apache License 2.0\n\nSPDX-License-Identifier: Apache-2.0" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #code completion #code generation #en #code #arxiv-2108.05198 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# NLGP natural model\n\nThe NLGP natural model was introduced in the paper Natural Language-Guided Programming. The model was trained on a collection of Jupyter notebooks and can be used to synthesize Python code that addresses a natural language intent in a certain code context (see the example below). This work was carried out by a research team in Nokia Bell Labs.\n\nContext\n\n\nIntent\n\n\nPrediction", "## Usage", "## License and copyright\n\nCopyright 2021 Nokia\n\nLicensed under the Apache License 2.0\n\nSPDX-License-Identifier: Apache-2.0" ]
[ 75, 91, 3, 29 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #code completion #code generation #en #code #arxiv-2108.05198 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# NLGP natural model\n\nThe NLGP natural model was introduced in the paper Natural Language-Guided Programming. The model was trained on a collection of Jupyter notebooks and can be used to synthesize Python code that addresses a natural language intent in a certain code context (see the example below). This work was carried out by a research team in Nokia Bell Labs.\n\nContext\n\n\nIntent\n\n\nPrediction## Usage## License and copyright\n\nCopyright 2021 Nokia\n\nLicensed under the Apache License 2.0\n\nSPDX-License-Identifier: Apache-2.0" ]
[ -0.01762612722814083, 0.10049989819526672, -0.0024399650283157825, 0.03351430222392082, 0.05169235169887543, -0.06786969304084778, 0.04154035076498985, 0.04213574901223183, -0.015508663840591908, -0.042404208332300186, 0.18121646344661713, 0.176433265209198, 0.00945267453789711, 0.10973813384771347, 0.025584515184164047, -0.1582007259130478, 0.03916163370013237, 0.03842092677950859, -0.0230401661247015, 0.16702578961849213, 0.06071015074849129, -0.00039999085129238665, 0.09660976380109787, 0.05138252675533295, -0.11769452691078186, -0.01905503310263157, -0.02143806405365467, -0.0920758917927742, 0.08077728003263474, -0.00244348868727684, 0.03841763734817505, 0.01336988527327776, 0.016187990084290504, -0.1395709216594696, -0.013762089423835278, -0.022902827709913254, 0.00570332957431674, 0.04252023622393608, -0.015734978020191193, -0.013832141645252705, 0.19637437164783478, 0.06961657106876373, 0.016587063670158386, 0.029921354725956917, -0.09911477565765381, -0.16406111419200897, 0.0012621873756870627, 0.0031515187583863735, 0.027556557208299637, 0.13573145866394043, 0.004134448245167732, 0.0896148681640625, -0.06402244418859482, 0.0939970463514328, 0.08974291384220123, -0.25702163577079773, -0.017851578071713448, 0.1612861305475235, 0.04451613873243332, -0.010406188666820526, 0.0782894715666771, 0.018209340050816536, 0.07754091918468475, 0.05287730321288109, 0.08202610909938812, -0.09466526657342911, -0.051319658756256104, -0.023063402622938156, -0.1713593304157257, -0.10181821882724762, 0.23906993865966797, -0.09219467639923096, -0.023752793669700623, -0.04555424675345421, -0.07435349375009537, -0.06487569957971573, 0.033818114548921585, -0.04149229824542999, -0.01751287467777729, 0.0862087830901146, 0.11297448724508286, -0.022159332409501076, -0.06682664155960083, -0.09232565015554428, -0.07803046703338623, 0.15623997151851654, 0.018366528674960136, 0.07687870413064957, -0.14718107879161835, 0.12075250595808029, 0.025287218391895294, -0.06470402330160141, -0.01747480221092701, -0.03689858317375183, 0.1233312264084816, 0.077877476811409, -0.0282464399933815, 0.011631502769887447, 0.05370522662997246, 0.11374566704034805, 0.006505433935672045, 0.03599732369184494, 0.03568388521671295, 0.06580168753862381, 0.005947957281023264, 0.10888312757015228, 0.03022599034011364, -0.026978330686688423, 0.06514395773410797, -0.05429374426603317, 0.03903069719672203, -0.04032319039106369, -0.10074128210544586, -0.04448330029845238, -0.0018275477923452854, 0.0788353681564331, 0.05470927432179451, 0.10442186146974564, -0.06507847458124161, -0.03440679982304573, 0.01571372151374817, -0.08972009271383286, -0.02777201682329178, -0.028575997799634933, 0.005190665367990732, 0.008641252294182777, 0.1320907175540924, -0.004312117118388414, -0.12857922911643982, -0.1031474694609642, -0.031288374215364456, -0.017105065286159515, -0.11484477669000626, -0.027325700968503952, 0.022268954664468765, 0.016902919858694077, 0.00039906552410684526, -0.14289335906505585, -0.2629786729812622, -0.02138814702630043, 0.1097453236579895, 0.029307009652256966, -0.11259455978870392, -0.10860313475131989, -0.04600454866886139, 0.00882800668478012, -0.07745468616485596, -0.12581273913383484, -0.05240968242287636, 0.06767923384904861, 0.015588260255753994, 0.0651911050081253, -0.11567472666501999, 0.04708845913410187, -0.18565016984939575, 0.042817696928977966, -0.025896253064274788, 0.08372204005718231, -0.03437143564224243, 0.04462704434990883, -0.06751634925603867, -0.10575021803379059, 0.010139894671738148, 
0.04519367590546608, 0.00448641600087285, 0.12337321043014526, -0.056708794087171555, -0.04571673274040222, 0.16226334869861603, -0.1166498064994812, -0.09503906220197678, 0.08843940496444702, 0.007196818012744188, 0.23947827517986298, 0.05914902314543724, 0.15210779011249542, 0.09009785950183868, -0.009912626817822456, 0.07278850674629211, 0.04123343154788017, -0.06700268387794495, -0.13663986325263977, 0.021709386259317398, -0.007714851293712854, -0.20645639300346375, 0.09174761176109314, -0.10631934553384781, 0.09222805500030518, 0.03322722762823105, -0.10565479844808578, -0.02770865149796009, -0.011776473373174667, 0.06284287571907043, -0.000951114809140563, 0.029870806261897087, -0.009826568886637688, -0.03681706264615059, 0.005091224797070026, 0.06199590489268303, -0.07962758094072342, 0.0381169393658638, -0.11566910147666931, 0.05694134533405304, 0.00497257336974144, 0.04945109784603119, -0.12416596710681915, 0.020896846428513527, -0.0072806840762495995, -0.11947833746671677, 0.057464856654405594, -0.012359573505818844, 0.02180681750178337, -0.00559983542189002, 0.050896693021059036, -0.033067114651203156, -0.016719967126846313, -0.026704303920269012, 0.0013968965504318476, -0.09511113911867142, 0.03940987586975098, -0.008407742716372013, 0.07560707628726959, -0.09154932200908661, 0.05055386945605278, -0.04879042133688927, -0.013606315478682518, -0.04465525224804878, 0.020346980541944504, -0.02019405923783779, 0.09174211323261261, -0.05532483384013176, 0.01873992197215557, 0.09451775252819061, 0.014526245184242725, -0.04435032606124878, 0.09475264698266983, -0.1066225990653038, 0.07152178883552551, 0.1189751848578453, -0.21705783903598785, -0.013568580150604248, 0.08166319131851196, -0.030401458963751793, -0.008912093937397003, 0.051570285111665726, 0.029132332652807236, 0.23691879212856293, -0.06218809261918068, 0.07696512341499329, -0.08098495006561279, -0.02247609570622444, 0.004556642379611731, -0.12288165837526321, 0.0464949794113636, 0.07155361026525497, 0.1284463256597519, -0.01300814189016819, 0.14099937677383423, 0.1242782399058342, -0.096114382147789, 0.15830451250076294, -0.032538868486881256, 0.016539929434657097, -0.026645934209227562, 0.052917491644620895, -0.0013188798911869526, 0.030355896800756454, -0.19848348200321198, -0.05016450956463814, 0.04441143572330475, 0.012589440681040287, 0.06832846254110336, -0.120940200984478, -0.017201030626893044, -0.0037175659090280533, -0.027649180963635445, -0.04411253333091736, 0.0021320548839867115, -0.07873298972845078, 0.045518532395362854, -0.025073090568184853, -0.1482008844614029, 0.09019015729427338, 0.01847974769771099, -0.11350209265947342, 0.17609573900699615, -0.08964123576879501, -0.3009747564792633, -0.16528776288032532, -0.024507831782102585, -0.06084151938557625, -0.003482050495222211, 0.06743526458740234, -0.06768199801445007, -0.0349915474653244, -0.04264835640788078, 0.05271704122424126, -0.03415800258517265, -0.05056634917855263, -0.001046620192937553, -0.02375349961221218, -0.002055996796116233, -0.12087275087833405, -0.03879404067993164, -0.00982761662453413, -0.09251727163791656, 0.10430172085762024, -0.12535642087459564, 0.1036989837884903, 0.25167107582092285, 0.008123951032757759, 0.026578621938824654, -0.0035454463213682175, 0.12789776921272278, -0.012508189305663109, -0.03488684445619583, 0.21276728808879852, 0.00992773286998272, 0.02940169721841812, 0.08919920772314072, -0.011259973980486393, -0.1177849993109703, 0.03888694569468498, -0.0690116286277771, -0.12034740298986435, -0.16326279938220978, 
-0.14569950103759766, -0.07905197143554688, 0.15964201092720032, 0.05369604006409645, 0.021712198853492737, 0.08855054527521133, 0.07310117036104202, 0.041038502007722855, 0.06323694437742233, 0.05522964149713516, 0.10236980766057968, 0.24097543954849243, -0.02501753717660904, 0.09065564721822739, -0.03682633116841316, -0.09078051149845123, 0.012639490887522697, 0.026185955852270126, 0.10217287391424179, 0.007431833539158106, 0.12812989950180054, 0.06427635252475739, 0.19932600855827332, 0.07384573668241501, 0.11965954303741455, -0.023397883400321007, 0.08346931636333466, -0.017510518431663513, -0.0837746411561966, -0.14765417575836182, 0.05660460889339447, -0.12840907275676727, -0.0402783565223217, 0.06002499535679817, -0.0683622658252716, 0.015821680426597595, 0.09639649093151093, 0.020624175667762756, -0.2788569927215576, -0.056161362677812576, 0.019882027059793472, 0.0772145539522171, -0.0894414559006691, 0.15239675343036652, -0.016795461997389793, -0.08480126410722733, 0.04984239861369133, -0.05185670405626297, 0.0822410136461258, -0.02146976999938488, 0.04705062136054039, -0.037097420543432236, -0.04525228962302208, 0.08653524518013, 0.1330748051404953, -0.23720522224903107, 0.09108997136354446, -0.03382713347673416, 0.031030988320708275, -0.0778975561261177, 0.021509462967514992, -0.02930414490401745, 0.10354842990636826, 0.12886933982372284, 0.047686412930488586, 0.04989560320973396, -0.06066789850592613, -0.1053442507982254, 0.06682246923446655, -0.01626530848443508, -0.03182027488946915, 0.013221788220107555, 0.0005525536253117025, 0.025834502652287483, -0.0018667866243049502, -0.10369565337896347, -0.0035986111033707857, -0.10368623584508896, 0.05437491461634636, 0.03997170552611351, 0.10350095480680466, -0.056063782423734665, -0.03517082333564758, 0.048320915549993515, 0.18923313915729523, -0.041105613112449646, -0.1290545016527176, -0.058539990335702896, -0.1316867619752884, 0.05671572685241699, -0.06777790933847427, 0.05505943298339844, -0.01653941161930561, -0.0717800110578537, -0.0057343789376318455, -0.1683017909526825, 0.13193072378635406, -0.07552708685398102, -0.07402687519788742, 0.0018993864068761468, 0.07743334025144577, 0.07449406385421753, 0.012665800750255585, 0.019697031006217003, 0.00694655068218708, -0.10748308151960373, -0.16659767925739288, -0.03029709681868553, 0.13698038458824158, -0.04763690009713173, 0.012683791108429432, 0.02436142973601818, -0.06622277945280075, 0.010900241322815418, -0.0035954464692622423, 0.12956933677196503, 0.058489907532930374, -0.056947894394397736, 0.1392831653356552, 0.17654450237751007, -0.1265466809272766, -0.32918044924736023, -0.10083102434873581, -0.027593005448579788, 0.022603070363402367, -0.06099977716803551, -0.18988527357578278, 0.13580557703971863, -0.0136730270460248, -0.050972163677215576, 0.10375238955020905, -0.32924729585647583, -0.07771962136030197, 0.16192832589149475, 0.05767614394426346, 0.35497531294822693, -0.0970977321267128, -0.014274987392127514, 0.005301075056195259, -0.2025037407875061, 0.18014828860759735, -0.09083139151334763, 0.049968499690294266, -0.0456099733710289, 0.12135877460241318, 0.012356488965451717, -0.05419578775763512, 0.08506683260202408, -0.03743564710021019, 0.02782127633690834, -0.07397855818271637, 0.01088709570467472, 0.05685947835445404, 0.02535013109445572, 0.16346748173236847, -0.021302971988916397, 0.043854497373104095, -0.057824838906526566, -0.09995631873607635, -0.10001135617494583, 0.14443707466125488, -0.01826598308980465, -0.11993741989135742, -0.07425209879875183, 
-0.01850941963493824, 0.08498972654342651, 0.0304131880402565, 0.11786819249391556, -0.022655140608549118, -0.01353282667696476, 0.21555598080158234, 0.17064335942268372, -0.042788948863744736, -0.021249141544103622, 0.016676776111125946, -0.041098155081272125, 0.11493703722953796, -0.13103362917900085, -0.013403918594121933, 0.13794854283332825, 0.03165009990334511, 0.022988762706518173, 0.07034175097942352, -0.07839855551719666, -0.011812195181846619, 0.03647201508283615, -0.15856873989105225, -0.06526892632246017, -0.02948976308107376, 0.02964901365339756, 0.04159531369805336, 0.10712006688117981, 0.09744244068861008, -0.1292901337146759, -0.03949339687824249, 0.012372691184282303, 0.044978365302085876, -0.0801331102848053, 0.06778272241353989, -0.011478854343295097, -0.0026252625975757837, -0.10113071650266647, 0.014372150413691998, 0.009198350831866264, 0.00039075518725439906, 0.02657543681561947, 0.0484471321105957, -0.15261445939540863, -0.08093953877687454, 0.008662316016852856, 0.11550415307283401, -0.1851784884929657, -0.12687227129936218, -0.027037138119339943, -0.13095688819885254, 0.015269835479557514, 0.0926935225725174, 0.11196500062942505, 0.052395615726709366, -0.051743216812610626, 0.00036866377922706306, -0.03500837832689285, 0.02724805474281311, -0.02890315279364586, 0.0017193944659084082, -0.0800209790468216, 0.07062921673059464, 0.05698138475418091, 0.059453755617141724, -0.06341055780649185, -0.07240283489227295, -0.09011341631412506, 0.07068488746881485, -0.0509222038090229, -0.0003945619973819703, -0.11489381641149521, 0.03654107451438904, 0.015090596862137318, 0.012696254067122936, -0.01814892143011093, 0.06328997761011124, -0.09826050698757172, 0.0005468259332701564, -0.03169892728328705, 0.007869691587984562, -0.06054815277457237, -0.004071937408298254, 0.05795757845044136, -0.02359660156071186, 0.045937977731227875, 0.05007009953260422, -0.0857180505990982, 0.11584950238466263, -0.10256525129079819, 0.04854156821966171, 0.11195404082536697, 0.06406328827142715, -0.024591604247689247, -0.04791714623570442, -0.03659065440297127, 0.09574691206216812, -0.03461272269487381, 0.014358007349073887, 0.010518384166061878, -0.11578592658042908, 0.018109401687979698, 0.020064085721969604, -0.09740407019853592, -0.023939138278365135, 0.00909485388547182, -0.024307293817400932, 0.11443855613470078, 0.13075314462184906, -0.06416594982147217, 0.012149333022534847, -0.04818785935640335, -0.009726247750222683, 0.03261785954236984, -0.040553078055381775, -0.09810201823711395, -0.06815824657678604, -0.005541087593883276, -0.0007810209644958377, 0.17456993460655212, 0.03987076133489609, -0.1476484090089798, -0.07887661457061768, -0.041424453258514404, 0.08361347764730453, -0.04844117537140846, 0.15345440804958344, 0.029993323609232903, 0.035084422677755356, 0.03737254440784454, 0.04553166404366493, 0.014514148235321045, 0.08429226279258728, 0.058963336050510406, -0.0674944594502449, 0.10146717727184296, 0.07711376249790192, -0.04808567091822624, 0.02299140952527523, -0.10443613678216934, -0.07861639559268951, -0.025182390585541725, 0.09910377115011215, 0.00311673479154706, 0.21758396923542023, 0.1300320029258728, -0.011243751272559166, 0.06575820595026016, 0.020294446498155594, -0.06675580888986588, -0.1456599235534668, -0.2682890295982361, -0.043555840849876404, -0.1327342838048935, -0.057716961950063705, -0.09160000085830688, -0.031455669552087784, 0.03150821849703789, 0.0032263135071843863, -0.059546567499637604, 0.07715804129838943, 0.011232418939471245, -0.07099012285470963, 
-0.0049545918591320515, -0.0504542700946331, -0.0009658324997872114, -0.06357424706220627, 0.022164851427078247, -0.0499335378408432, 0.028670605272054672, 0.05522378906607628, 0.04067884758114815, -0.0344536267220974, 0.04160981997847557, -0.05374846234917641, 0.008398300036787987, -0.05625908449292183, 0.06401590257883072, 0.003096742322668433, 0.0347001738846302, 0.057811181992292404, -0.08582162857055664, 0.0391213521361351, 0.11561097204685211, -0.02402113936841488, -0.14018695056438446, -0.061366960406303406, 0.2897486090660095, 0.002450735541060567, 0.05748787149786949, 0.0024370017927139997, 0.015470847487449646, -0.04127322509884834, 0.2744542062282562, 0.19601045548915863, 0.03784240409731865, -0.023248275741934776, -0.03467654064297676, 0.015259811654686928, 0.00135620788205415, 0.09096406400203705, 0.11546111851930618, 0.3963478207588196, -0.06924594938755035, -0.04488243907690048, -0.09194374084472656, -0.013159774243831635, -0.12571217119693756, -0.10412917286157608, -0.0007498961058445275, -0.08875986933708191, -0.05106938257813454, 0.02812633290886879, -0.1612708866596222, 0.0706159919500351, 0.015495019033551216, -0.029238566756248474, -0.01875317096710205, -0.006774684879928827, 0.05756739154458046, -0.014008750207722187, 0.06653670966625214, -0.058354999870061874, 0.0002508014440536499, 0.06180596724152565, -0.0019072899594902992, -0.16543419659137726, 0.023147374391555786, 0.060390230268239975, 0.006038858089596033, 0.14890708029270172, 0.015558559447526932, 0.10573387145996094, 0.06300753355026245, 0.026924744248390198, -0.05322182551026344, 0.10958347469568253, -0.004323725122958422, 0.04995707422494888, -0.0294738058000803, -0.08247750997543335, -0.014803885482251644, 0.06250297278165817, 0.010980622842907906, 0.01987507753074169, 0.020842451602220535, 0.08033671230077744, 0.0011412539752200246, -0.06361179798841476, -0.07340944558382034, -0.0848834365606308, 0.08405381441116333, 0.05263185873627663, -0.03678475692868233, -0.03437505662441254, -0.093882255256176, 0.05003831908106804, -0.08167045563459396, -0.07502171397209167, -0.02068762294948101, -0.07131124287843704, -0.07392413169145584, 0.023667365312576294, 0.01990259625017643, -0.04325990751385689, 0.007295276504009962, -0.0471685454249382, 0.024890149012207985, -0.09926678240299225, 0.025591421872377396, 0.01361432857811451, -0.00021767521684523672, -0.0128265880048275, -0.017382537946105003, -0.01726967841386795, 0.12072613090276718, -0.07210171222686768, -0.13708896934986115 ]
null
null
transformers
# Wav2vec2 German Model

This model has been fine-tuned on the wav2vec-large-xlsr-53 with the German CommonVoice dataset. It achieves an 11.26% WER on the full test dataset.
It was trained largely with the code provided by [Max Idahl](https://huggingface.co/maxidl/wav2vec2-large-xlsr-german), with small adjustments to the data preprocessing and training parameters.

You can use it to transcribe your own files with the following code. Please note that your input file must be a *.wav file, encoded at 16 kHz and single-channel. To convert an audio file using ffmpeg use: "ffmpeg -i input.wav -ar 16000 -ac 1 output.wav". The transcription process is very memory-consuming (around 10 GB per 10 seconds of audio). If the script ends with "Killed", the Python interpreter ran out of memory; in this case, try a shorter audio file (or see the chunked-transcription sketch at the end of this card).

```python
# !pip3 install transformers torch soundfile

import soundfile as sf
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Tokenizer

# load pretrained model
tokenizer = Wav2Vec2Tokenizer.from_pretrained("Noricum/wav2vec2-large-xlsr-53-german")
model = Wav2Vec2ForCTC.from_pretrained("Noricum/wav2vec2-large-xlsr-53-german")

# load audio
audio_input, _ = sf.read("/path/to/your/audio.wav")

# transcribe
input_values = tokenizer(audio_input, return_tensors="pt").input_values
logits = model(input_values).logits
predicted_ids = torch.argmax(logits, dim=-1)
transcription = tokenizer.batch_decode(predicted_ids)[0]

print(str(transcription))
```

To evaluate the model on the full CommonVoice test dataset, run this script:

```python
import re
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

test_dataset = load_dataset("common_voice", "de", split="test")  # use "test[:1%]" for a 1% sample
wer = load_metric("wer")

processor = Wav2Vec2Processor.from_pretrained("Noricum/wav2vec2-large-xlsr-53-german")
model = Wav2Vec2ForCTC.from_pretrained("Noricum/wav2vec2-large-xlsr-53-german")
model.to("cuda")

chars_to_ignore_regex = '[\\,\\?\\.\\!\\-\\;\\:\\"\\“]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)

# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch

test_dataset = test_dataset.map(speech_file_to_array_fn)

# Run inference on the preprocessed dataset and collect the predictions.
def evaluate(batch):
    inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)

    with torch.no_grad():
        logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits

    pred_ids = torch.argmax(logits, dim=-1)
    batch["pred_strings"] = processor.batch_decode(pred_ids)
    return batch

result = test_dataset.map(evaluate, batched=True, batch_size=4)  # batch_size=8 -> requires ~14.5GB GPU memory

# Chunked WER computation, see https://discuss.huggingface.co/t/spanish-asr-fine-tuning-wav2vec2/4586/5:
import jiwer

def chunked_wer(targets, predictions, chunk_size=None):
    if chunk_size is None:
        return jiwer.wer(targets, predictions)
    start = 0
    end = chunk_size
    H, S, D, I = 0, 0, 0, 0
    while start < len(targets):
        chunk_metrics = jiwer.compute_measures(targets[start:end], predictions[start:end])
        H = H + chunk_metrics["hits"]
        S = S + chunk_metrics["substitutions"]
        D = D + chunk_metrics["deletions"]
        I = I + chunk_metrics["insertions"]
        start += chunk_size
        end += chunk_size
    return float(S + D + I) / float(H + S + D)

print("Total (chunk_size=1000), WER: {:2f}".format(100 * chunked_wer(result["pred_strings"], result["sentence"], chunk_size=1000)))
```

Output: Total (chunk_size=1000), WER: 11.256522
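As noted above, transcription needs roughly 10 GB of memory per 10 seconds of audio, so long recordings can get the process killed. One workaround is to split the waveform into short chunks and transcribe them one at a time. The sketch below is a minimal illustration of that idea and not part of the original card: it assumes `tokenizer` and `model` are loaded as in the transcription snippet above, and the 10-second chunk length and the naive joining of chunk transcripts are our own choices (words that straddle a chunk boundary may be split).

```python
# Minimal chunked-transcription sketch (illustrative, not part of the original card).
# Assumes `tokenizer` and `model` are loaded as in the transcription snippet above.
import soundfile as sf
import torch

def transcribe_long_file(path, chunk_seconds=10, sampling_rate=16_000):
    audio, sr = sf.read(path)
    assert sr == sampling_rate, "input must already be 16 kHz, single-channel"

    chunk_len = chunk_seconds * sampling_rate
    pieces = []
    for start in range(0, len(audio), chunk_len):
        chunk = audio[start:start + chunk_len]
        input_values = tokenizer(chunk, return_tensors="pt").input_values
        with torch.no_grad():
            logits = model(input_values).logits
        predicted_ids = torch.argmax(logits, dim=-1)
        pieces.append(tokenizer.batch_decode(predicted_ids)[0])

    # naive join; words cut at chunk boundaries may be split
    return " ".join(pieces)

print(transcribe_long_file("/path/to/your/audio.wav"))
```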
{}
automatic-speech-recognition
Noricum/wav2vec2-large-xlsr-53-german
[ "transformers", "pytorch", "jax", "wav2vec2", "automatic-speech-recognition", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #endpoints_compatible #region-us
# Wav2vec2 German Model This model has been fine-tuned on the wav2vec-large-xlsr-53 with the German CommonVoice dataset. It achieves a 11.26 WER on the full test dataset. It was basically trained with the code provided by Max Idahl with small adjustments in data preprocessing and on training parameters. You can use it to transcribe your own files by the following code. Please note, that your input file must be *.wav, encoded in 16 kHz and be single channel. To convert an audio file using ffmpeg use: "ffmpeg -i URL -ar 16000 -ac 1 URL". The transcribe process is very memory consuming (around 10GB per 10 seconds). If the script ends with "Killed" it means the Python interpreter ran out of memory. In this case, try with a shorter audio file. To evaluate the model on the full CommonVoice test dataset, run this script: Output: Total (chunk_size=1000), WER: 11.256522
[ "# Wav2vec2 German Model\n \n This model has been fine-tuned on the wav2vec-large-xlsr-53 with the German CommonVoice dataset.\n \n It achieves a 11.26 WER on the full test dataset.\n It was basically trained with the code provided by Max Idahl with small adjustments in data preprocessing and on training parameters.\n \n You can use it to transcribe your own files by the following code. Please note, that your input file must be *.wav, encoded in 16 kHz and be single channel. To convert an audio file using ffmpeg use: \"ffmpeg -i URL -ar 16000 -ac 1 URL\". The transcribe process is very memory consuming (around 10GB per 10 seconds). If the script ends with \"Killed\" it means the Python interpreter ran out of memory. In this case, try with a shorter audio file.\n \n\n\nTo evaluate the model on the full CommonVoice test dataset, run this script:\n\n\n\nOutput: Total (chunk_size=1000), WER: 11.256522" ]
[ "TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #endpoints_compatible #region-us \n", "# Wav2vec2 German Model\n \n This model has been fine-tuned on the wav2vec-large-xlsr-53 with the German CommonVoice dataset.\n \n It achieves a 11.26 WER on the full test dataset.\n It was basically trained with the code provided by Max Idahl with small adjustments in data preprocessing and on training parameters.\n \n You can use it to transcribe your own files by the following code. Please note, that your input file must be *.wav, encoded in 16 kHz and be single channel. To convert an audio file using ffmpeg use: \"ffmpeg -i URL -ar 16000 -ac 1 URL\". The transcribe process is very memory consuming (around 10GB per 10 seconds). If the script ends with \"Killed\" it means the Python interpreter ran out of memory. In this case, try with a shorter audio file.\n \n\n\nTo evaluate the model on the full CommonVoice test dataset, run this script:\n\n\n\nOutput: Total (chunk_size=1000), WER: 11.256522" ]
[ 40, 241 ]
[ "passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #endpoints_compatible #region-us \n# Wav2vec2 German Model\n \n This model has been fine-tuned on the wav2vec-large-xlsr-53 with the German CommonVoice dataset.\n \n It achieves a 11.26 WER on the full test dataset.\n It was basically trained with the code provided by Max Idahl with small adjustments in data preprocessing and on training parameters.\n \n You can use it to transcribe your own files by the following code. Please note, that your input file must be *.wav, encoded in 16 kHz and be single channel. To convert an audio file using ffmpeg use: \"ffmpeg -i URL -ar 16000 -ac 1 URL\". The transcribe process is very memory consuming (around 10GB per 10 seconds). If the script ends with \"Killed\" it means the Python interpreter ran out of memory. In this case, try with a shorter audio file.\n \n\n\nTo evaluate the model on the full CommonVoice test dataset, run this script:\n\n\n\nOutput: Total (chunk_size=1000), WER: 11.256522" ]
[ -0.12651017308235168, -0.11265696585178375, 0.000657741678878665, -0.015735527500510216, 0.09910130500793457, -0.0024899959098547697, -0.11478850990533829, 0.0660027265548706, -0.05027439445257187, 0.1321467012166977, 0.13443642854690552, -0.01404131855815649, 0.01928594335913658, 0.11248385906219482, 0.022440578788518906, -0.15110190212726593, 0.06317898631095886, -0.008694710209965706, 0.11456218361854553, 0.06734730303287506, 0.036409370601177216, -0.012186918407678604, 0.05120857432484627, -0.005916897673159838, -0.10058353841304779, -0.04488963633775711, -0.017557373270392418, 0.042425524443387985, 0.04343922436237335, 0.04624049365520477, 0.00548438960686326, -0.06669450551271439, -0.011373844929039478, -0.13153325021266937, 0.008055437356233597, 0.07958804070949554, 0.0444364994764328, -0.010431596077978611, 0.0007979164947755635, 0.12363072484731674, 0.16410644352436066, -0.03064056858420372, -0.06891006976366043, 0.07883165031671524, 0.005170827731490135, -0.16220523416996002, -0.03240057826042175, -0.0005919859977439046, 0.06700815260410309, 0.07033996284008026, -0.052572235465049744, 0.07129328697919846, 0.05073893070220947, 0.15771229565143585, 0.08627225458621979, -0.15907712280750275, 0.014178826473653316, 0.16658131778240204, 0.11267701536417007, -0.002994769485667348, -0.028668778017163277, -0.005345663987100124, 0.09099233895540237, 0.04723508656024933, -0.06779812276363373, -0.05918724834918976, -0.16740864515304565, -0.09900684654712677, -0.1126105859875679, -0.06158541515469551, 0.12792056798934937, -0.054471392184495926, -0.10102153569459915, -0.04621865972876549, -0.05386823043227196, -0.28848183155059814, 0.025749502703547478, 0.07369205355644226, -0.04715398699045181, 0.059142645448446274, -0.0197448693215847, 0.0665769875049591, -0.07410762459039688, -0.09542957693338394, -0.13035078346729279, 0.15802784264087677, 0.02770799584686756, 0.09660615772008896, -0.02809164859354496, 0.033459022641181946, -0.23866483569145203, -0.03413822874426842, -0.07518000155687332, 0.00043730330071412027, -0.0938682034611702, -0.001060299458913505, -0.10350318253040314, -0.11802065372467041, 0.03983946517109871, 0.018158143386244774, -0.09584793448448181, 0.040595900267362595, 0.022562677040696144, 0.07772518694400787, 0.0955294668674469, 0.07260913401842117, -0.03764616325497627, -0.05444857105612755, 0.009795431047677994, -0.0415266752243042, 0.009810399264097214, -0.012975194491446018, -0.034318383783102036, -0.04362313076853752, 0.06612106412649155, 0.0023896375205367804, -0.002436688169836998, 0.05879192799329758, -0.014342938549816608, 0.010675097815692425, -0.01761062629520893, -0.03109968826174736, -0.004131488502025604, 0.03701787069439888, -0.0011434557382017374, 0.2885844111442566, 0.04459359869360924, -0.1143922358751297, -0.11473599821329117, 0.07532418519258499, -0.0005864072008989751, 0.033798668533563614, -0.08325706422328949, -0.16984574496746063, 0.06385345757007599, 0.011137619614601135, -0.01827581226825714, -0.11891116946935654, -0.1109318882226944, -0.030509907752275467, 0.057728420943021774, -0.008287524804472923, 0.022739090025424957, -0.011320654302835464, -0.07260055094957352, 0.0455748476088047, -0.015319815836846828, 0.009675554931163788, -0.0438801608979702, 0.08695392310619354, 0.03805915266275406, 0.10981934517621994, -0.12286577373743057, 0.03271719440817833, -0.022872569039463997, -0.00355069013312459, 0.0026313632261008024, 0.07969144731760025, -0.07103615254163742, -0.013366752304136753, -0.11593827605247498, -0.08485647290945053, 
-0.15391290187835693, 0.0614561066031456, 0.07040803879499435, 0.07909601181745529, -0.2765518128871918, 0.01265675202012062, 0.2416161447763443, -0.1299833357334137, 0.01529836654663086, 0.2221192717552185, 0.054998964071273804, -0.02490079030394554, 0.07118508219718933, 0.08172355592250824, 0.0749477818608284, -0.2377738505601883, -0.11241442710161209, 0.15264549851417542, -0.02034822478890419, -0.04425081983208656, 0.08601207286119461, -0.08760778605937958, -0.002035986864939332, 0.030266068875789642, 0.040948107838630676, 0.08041772246360779, -0.02307533100247383, 0.00995617639273405, -0.03686802461743355, -0.03476620838046074, 0.003638348774984479, -0.024112993851304054, 0.006875281222164631, -0.07783621549606323, -0.06805475056171417, 0.022684693336486816, 0.15654918551445007, -0.04906351864337921, 0.061290279030799866, -0.02278468757867813, 0.04596763849258423, -0.1671152412891388, 0.08592024445533752, -0.13068152964115143, 0.11138345301151276, -0.00474320026114583, 0.029569802805781364, -0.01880371943116188, 0.04591637849807739, 0.053683023899793625, 0.01844664290547371, 0.0409538708627224, 0.007024749182164669, 0.05213308706879616, -0.035765498876571655, -0.07618488371372223, 0.0019549974240362644, -0.06072172895073891, -0.058818407356739044, -0.0906907171010971, -0.0787026509642601, -0.015892615541815758, 0.11603128165006638, -0.09687013924121857, -0.027142655104398727, -0.08258149027824402, -0.011140071786940098, 0.0630425289273262, -0.037313152104616165, -0.0029704675544053316, 0.014570007100701332, 0.041176412254571915, -0.053316980600357056, 0.12846016883850098, -0.16403010487556458, -0.00868353620171547, 0.12797823548316956, 0.04559614881873131, -0.07110413163900375, 0.032111555337905884, 0.01051961537450552, -0.03293345868587494, -0.04153282567858696, -0.19970548152923584, 0.1372203826904297, 0.010845500975847244, 0.08774178475141525, -0.055850621312856674, 0.03513572737574577, 0.07768876850605011, -0.010584330186247826, 0.02562762051820755, 0.017008114606142044, -0.053354229778051376, -0.038934387266635895, 0.01827961392700672, -0.005117220804095268, -0.19254763424396515, 0.21029354631900787, -0.05887860059738159, -0.12114719301462173, -0.0032810664270073175, 0.11002571880817413, 0.008327639661729336, 0.07915300875902176, -0.028101257979869843, 0.0180459376424551, 0.03263260796666145, 0.06295445561408997, 0.031989071518182755, -0.05881224572658539, 0.10441480576992035, 0.04709717258810997, -0.018426811322569847, -0.1334175169467926, 0.0520046167075634, -0.07978707551956177, -0.012598547153174877, -0.04307793825864792, 0.033919304609298706, 0.002401382429525256, 0.010564996860921383, -0.07504120469093323, 0.1840800940990448, -0.06371409446001053, -0.06447071582078934, -0.1851739138364792, -0.05596275255084038, -0.013217332772910595, -0.030045248568058014, 0.06995410472154617, -0.0677422508597374, -0.0274113230407238, -0.04369800537824631, 0.11060179024934769, -0.1413688063621521, 0.051182571798563004, 0.042874112725257874, -0.06304605305194855, 0.03791527822613716, -0.11913260817527771, 0.024865059182047844, 0.013450860977172852, -0.01977263204753399, 0.0492025762796402, -0.03384314477443695, 0.048982955515384674, 0.10979816317558289, -0.012877852655947208, 0.017083516344428062, -0.008242250420153141, 0.12106239050626755, -0.031086090952157974, 0.04085499048233032, 0.14254556596279144, -0.05637045204639435, 0.036117009818553925, 0.01961028017103672, 0.011080136522650719, -0.021021753549575806, -0.004607588984072208, -0.005505844950675964, -0.13016274571418762, 
-0.09712854772806168, -0.09743627905845642, -0.12239130586385727, 0.1494496464729309, 0.07782547175884247, -0.012714920565485954, -0.13490848243236542, 0.04387693107128143, -0.033558301627635956, 0.08939120173454285, -0.04966408386826515, 0.05156660079956055, -0.010725559666752815, -0.023285891860723495, 0.07245461642742157, -0.02918437123298645, 0.05389098823070526, 0.11496845632791519, 0.020765740424394608, 0.1811128705739975, -0.02602197788655758, 0.1561657339334488, 0.019224701449275017, -0.05384567007422447, 0.05223511531949043, 0.12031101435422897, -0.12057948857545853, -0.020539896562695503, 0.015405002050101757, -0.0653858631849289, -0.05127951130270958, 0.00972296018153429, 0.027209635823965073, -0.008450882509350777, -0.06484641879796982, 0.06025021895766258, 0.03972144424915314, 0.06444101780653, 0.05660700052976608, -0.1844048649072647, -0.17479631304740906, -0.05024242773652077, -0.05696358159184456, -0.039423972368240356, 0.07670433819293976, 0.18701903522014618, -0.018017590045928955, 0.00735361548140645, -0.017827697098255157, 0.12357666343450546, -0.05081683397293091, 0.050279662013053894, -0.054373741149902344, 0.07882660627365112, 0.004989524371922016, 0.03846212103962898, -0.09383777529001236, 0.04347027465701103, 0.013417595997452736, 0.1018051728606224, -0.04013495519757271, 0.022799784317612648, 0.035860825330019, -0.02691926620900631, 0.058513227850198746, 0.025949379429221153, -0.12581995129585266, -0.037977706640958786, -0.09305418282747269, -0.01566973701119423, 0.06636064499616623, 0.15609349310398102, 0.06898140162229538, -0.03573336824774742, 0.05639682710170746, -0.0031065631192177534, 0.0396876186132431, 0.05873754248023033, -0.09619458764791489, -0.024792613461613655, 0.18171092867851257, 0.03792349249124527, 0.006543560419231653, -0.00819771084934473, -0.01896166428923607, 0.15705330669879913, -0.1795378178358078, -0.02906288392841816, -0.06644880026578903, 0.04094868525862694, 0.15060102939605713, -0.05922961235046387, 0.04493158683180809, 0.00547786196693778, 0.026304051280021667, -0.09902888536453247, -0.08406457304954529, 0.05938732624053955, -0.17103184759616852, -0.00714306253939867, -0.01555531844496727, 0.1202775090932846, -0.0709715336561203, 0.048718687146902084, -0.01776842772960663, -0.004517511464655399, -0.10018641501665115, -0.14220009744167328, -0.02055235393345356, 0.05042169988155365, -0.06423772126436234, 0.04052402824163437, -0.032687701284885406, -0.14912551641464233, 0.08819345384836197, -0.01700804941356182, 0.10338782519102097, 0.19148747622966766, -0.04834585264325142, 0.040863897651433945, 0.1438138335943222, -0.019592445343732834, -0.3048241138458252, -0.03979434072971344, 0.10799917578697205, 0.07193761318922043, -0.0237831212580204, -0.07737061381340027, 0.047979846596717834, 0.06327491253614426, -0.003363553900271654, 0.1808796226978302, -0.28272774815559387, -0.07287120819091797, 0.08630791306495667, -0.06339208036661148, 0.3585507571697235, -0.11753614991903305, -0.03838895261287689, -0.05881376564502716, -0.11509818583726883, -0.010435373522341251, -0.2117002010345459, 0.12859652936458588, 0.010891638696193695, 0.058814212679862976, 0.06112348288297653, -0.08388989418745041, 0.05928608402609825, 0.011660266667604446, -0.012132072821259499, -0.0018528657965362072, 0.053866758942604065, 0.040665097534656525, -0.02504395879805088, 0.20732760429382324, -0.22848211228847504, 0.04376531019806862, -0.12496057152748108, -0.00590292364358902, -0.04251985624432564, 0.03667125850915909, 0.007519825827330351, -0.013291987590491772, 
-0.031216159462928772, -0.025335153564810753, 0.06352430582046509, -0.005680182948708534, -0.04185771197080612, -0.008477511815726757, -0.08946139365434647, 0.2288050800561905, -0.010669388808310032, 0.05572957918047905, -0.17110233008861542, 0.05945984646677971, -0.050730910152196884, 0.16234929859638214, -0.08151866495609283, 0.007121951784938574, 0.029841545969247818, 0.05015303194522858, 0.0055648209527134895, 0.050538320094347, -0.07519138604402542, 0.045399028807878494, 0.04365801066160202, -0.047462888062000275, -0.01754646562039852, 0.0003603797231335193, 0.03118264116346836, -0.06369667500257492, 0.031445231288671494, 0.10769698768854141, -0.05332391336560249, 0.06898138672113419, -0.00606607785448432, 0.04679885506629944, -0.0993468314409256, 0.19143737852573395, -0.036054037511348724, 0.056170374155044556, -0.09101669490337372, -0.016230730339884758, -0.05293134227395058, -0.09038460999727249, 0.11024907231330872, 0.02908971719443798, -0.0909074991941452, -0.06695756316184998, 0.00357424165122211, -0.11395096033811569, -0.015108730643987656, -0.054380618035793304, 0.07385086268186569, -0.013385739177465439, -0.03611186891794205, -0.02361680008471012, 0.073338583111763, 0.049825821071863174, -0.060792192816734314, 0.008541779592633247, -0.14183543622493744, 0.0880669355392456, -0.007492697797715664, 0.04015585780143738, -0.04188470169901848, 0.16532379388809204, -0.039270319044589996, -0.0006497949943877757, -0.03156078979372978, 0.0026739714667201042, -0.13411419093608856, 0.005936580244451761, 0.0012084103655070066, -0.005670842714607716, -0.04210149869322777, -0.03316936269402504, 0.03927253559231758, 0.08395565301179886, -0.010268359445035458, 0.07611855119466782, -0.016494549810886383, -0.0006638074992224574, -0.046608440577983856, -0.023351483047008514, -0.00900290161371231, 0.024943217635154724, 0.034626420587301254, -0.034765202552080154, 0.02323903515934944, 0.10006601363420486, -0.03501459211111069, 0.0285897608846426, -0.041403260082006454, -0.03397753834724426, 0.11374416947364807, -0.005193017423152924, -0.04646603763103485, 0.004267989657819271, 0.033921193331480026, 0.07472483068704605, 0.07510941475629807, -0.007633093744516373, 0.13388176262378693, -0.06256810575723648, -0.016308384016156197, -0.061182670295238495, 0.06669232994318008, -0.0255030058324337, -0.005444787442684174, 0.041336074471473694, 0.13516227900981903, 0.14134323596954346, -0.0831591859459877, 0.025156697258353233, -0.04134419187903404, 0.023981137201189995, 0.024935463443398476, -0.02748033031821251, -0.04623521864414215, -0.01732967421412468, 0.04090540483593941, -0.05346505716443062, 0.12684407830238342, -0.1264839768409729, -0.02401398867368698, -0.02627338096499443, -0.05262253060936928, -0.059953175485134125, -0.06402913480997086, 0.23186953365802765, -0.016299212351441383, -0.024583937600255013, -0.17831715941429138, 0.0456184446811676, 0.10191439092159271, 0.09942472726106644, 0.05454035475850105, 0.0333121083676815, -0.052236706018447876, 0.15533383190631866, -0.06888743489980698, -0.04620761796832085, -0.12024619430303574, -0.10233250260353088, -0.1066870167851448, 0.09795421361923218, -0.0017665785271674395, 0.1614791303873062, 0.13716085255146027, -0.021590663120150566, 0.06398506462574005, 0.03295021876692772, -0.04015268012881279, -0.08370958268642426, -0.08755049109458923, -0.018025696277618408, -0.11692942678928375, 0.008237631060183048, -0.06641074270009995, 0.058134324848651886, 0.013125375844538212, 0.035299159586429596, -0.0014964203583076596, 0.2899662256240845, 
-0.1312946230173111, -0.10240165144205093, 0.05994659662246704, -0.07276928424835205, -0.03130735456943512, 0.13381905853748322, -0.008652080781757832, 0.09873896092176437, -0.12508566677570343, 0.14568904042243958, 0.05656595900654793, -0.012201869860291481, 0.07570411264896393, -0.019051983952522278, -0.06014684587717056, -0.040329255163669586, 0.03150280937552452, 0.04292324185371399, 0.2422596514225006, 0.0746651366353035, -0.07714337855577469, -0.03292986378073692, -0.08273351937532425, -0.05323110148310661, -0.12271469831466675, -0.12136104702949524, 0.06115111708641052, 0.07191376388072968, 0.0477035790681839, -0.10482258349657059, -0.07074624300003052, -0.038181304931640625, 0.16853171586990356, 0.08414274454116821, 0.009441971778869629, -0.015685128048062325, 0.021389927715063095, -0.004897767677903175, 0.0205241609364748, 0.11138425767421722, 0.025661563500761986, 0.32645103335380554, 0.004410575143992901, 0.06533269584178925, -0.004565402865409851, -0.0642840638756752, -0.09596364945173264, 0.03508676216006279, -0.057210519909858704, -0.06775595247745514, -0.03223985806107521, 0.03453228250145912, -0.1244138777256012, -0.01778140850365162, -0.04605896770954132, 0.009293188340961933, -0.015516403131186962, -0.019351208582520485, 0.043708667159080505, 0.06798609346151352, 0.06437907367944717, -0.1156105175614357, 0.037653855979442596, 0.030793270096182823, -0.06253752112388611, -0.09881509840488434, -0.08241605013608932, -0.05855398252606392, 0.05091842636466026, -0.05860189348459244, 0.045796941965818405, 0.16497963666915894, 0.05081276223063469, 0.006401864346116781, -0.021147508174180984, 0.13254185020923615, -0.05284441262483597, -0.17253923416137695, -0.005051164422184229, 0.14741699397563934, -0.07074086368083954, 0.05858892574906349, -0.00351371755823493, -0.014464070089161396, -0.03926888108253479, 0.06082228198647499, 0.03221815451979637, -0.11497437208890915, -0.03332347795367241, -0.04128485545516014, 0.11638199537992477, 0.15466025471687317, -0.01310713030397892, -0.042096398770809174, -0.04998426139354706, 0.07695388048887253, -0.0010504787787795067, 0.002726862672716379, -0.05357705429196358, -0.1600857526063919, -0.015773948282003403, 0.023063478991389275, 0.02411266602575779, -0.17973242700099945, -0.0008172426023520529, -0.07523725926876068, -0.009069103747606277, -0.03953935578465462, 0.02807558886706829, 0.05165766179561615, -0.009071013890206814, -0.002330187475308776, -0.00548374280333519, 0.04685836657881737, 0.013832533732056618, -0.11686420440673828, -0.15645766258239746 ]
null
null
transformers
# distilgpt2-base-pretrained-he

A tiny GPT2-based Hebrew text generation model initially trained on a TPUv3-8, which was made available to me via the [TPU Research Cloud](https://sites.research.google/trc/) Program. It was then further fine-tuned on GPU.

## Dataset

### oscar (unshuffled deduplicated he) - [Homepage](https://oscar-corpus.com) | [Dataset Permalink](https://huggingface.co/datasets/viewer/?dataset=oscar&config=unshuffled_deduplicated_he)

The Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.

### CC-100 (he) - [HomePage](https://data.statmt.org/cc-100/)

This corpus comprises monolingual data for 100+ languages and also includes data for romanized languages. It was constructed using the URLs and paragraph indices provided by the CC-Net repository by processing January-December 2018 Commoncrawl snapshots. Each file comprises documents separated by double newlines and paragraphs within the same document separated by a newline. The data is generated using the open-source CC-Net repository.

### Misc
* Hebrew Twitter
* Wikipedia
* Various other sources

## Training

* Done on a TPUv3-8 VM using [Huggingface's clm-flax example script](https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py)
* I have made a list of items which might make it easier for others to use this script. The list was posted to [this discussion forum](https://discuss.huggingface.co/t/ideas-for-beginner-friendlier-tpu-vm-clm-training/8351)
* Further training was performed on GPU

## Usage

#### Simple usage sample code

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

def main():
    model_name = "Norod78/distilgpt2-base-pretrained-he"
    prompt_text = "שלום, קוראים לי"
    generated_max_length = 192

    print("Loading model...")
    model = AutoModelForCausalLM.from_pretrained(model_name)
    print("Loading tokenizer...")
    tokenizer = AutoTokenizer.from_pretrained(model_name)

    text_generator = pipeline(task="text-generation", model=model, tokenizer=tokenizer)

    print("Generating text...")
    result = text_generator(
        prompt_text,
        num_return_sequences=1,
        batch_size=1,
        do_sample=True,
        top_k=40,
        top_p=0.92,
        temperature=1,
        repetition_penalty=5.0,
        max_length=generated_max_length,
    )

    print("result = " + str(result))

if __name__ == '__main__':
    main()
```
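As a complementary sketch, the OSCAR split referenced in the Dataset section can be inspected directly with the `datasets` library. This is only an illustration, assuming the Hub-hosted `oscar` dataset still exposes the `unshuffled_deduplicated_he` configuration named in the permalink above; streaming mode is used so the full corpus is not downloaded.

```python
# Illustrative inspection snippet (not from the model card itself):
# peek at a few documents of the Hebrew OSCAR split in streaming mode.
from datasets import load_dataset

dataset = load_dataset("oscar", "unshuffled_deduplicated_he", split="train", streaming=True)

for i, example in enumerate(dataset):
    print(example["text"][:200])  # first 200 characters of each document
    if i >= 2:                    # stop after a few samples
        break
```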
{"language": "he", "license": "mit", "thumbnail": "https://avatars1.githubusercontent.com/u/3617152?norod.jpg", "widget": [{"text": "\u05d4\u05d0\u05d9\u05e9 \u05d4\u05d0\u05d7\u05e8\u05d5\u05df \u05e2\u05dc\u05d9 \u05d0\u05d3\u05de\u05d5\u05ea \u05d9\u05e9\u05d1 \u05dc\u05d1\u05d3 \u05d1\u05d7\u05d3\u05e8\u05d5 \u05db\u05e9\u05dc\u05e4\u05ea\u05e2 \u05e0\u05e9\u05de\u05e2\u05d4 \u05e0\u05e7\u05d9\u05e9\u05d4"}, {"text": "\u05e9\u05dc\u05d5\u05dd, \u05e7\u05e8\u05d5\u05d0\u05d9\u05dd \u05dc\u05d9"}, {"text": "\u05d4\u05d0\u05e8\u05d9 \u05e4\u05d5\u05d8\u05e8 \u05d7\u05d9\u05d9\u05da \u05d7\u05d9\u05d5\u05da \u05e0\u05d1\u05d5\u05da"}, {"text": "\u05d4\u05d7\u05ea\u05d5\u05dc \u05e9\u05dc\u05da \u05de\u05d0\u05d5\u05d3 \u05d7\u05de\u05d5\u05d3 \u05d5"}]}
text-generation
Norod78/distilgpt2-base-pretrained-he
[ "transformers", "pytorch", "tf", "jax", "coreml", "onnx", "safetensors", "gpt2", "text-generation", "he", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "he" ]
TAGS #transformers #pytorch #tf #jax #coreml #onnx #safetensors #gpt2 #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
# distilgpt2-base-pretrained-he A tiny GPT2-based Hebrew text generation model initially trained on a TPUv3-8 which was made available to me via the TPU Research Cloud Program. It was then further fine-tuned on GPU. ## Dataset ### oscar (unshuffled deduplicated he) - Homepage | Dataset Permalink The Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture. ### CC-100 (he) - HomePage This corpus comprises monolingual data for 100+ languages and also includes data for romanized languages. It was constructed using the URLs and paragraph indices provided by the CC-Net repository by processing January-December 2018 Commoncrawl snapshots. Each file comprises documents separated by double newlines and paragraphs within the same document separated by a newline. The data is generated using the open-source CC-Net repository. ### Misc * Hebrew Twitter * Wikipedia * Various other sources ## Training * Done on a TPUv3-8 VM using Huggingface's clm-flax example script * I have made a list of items which might make it easier for others to use this script. The list was posted to this discussion forum * Further training was performed on GPU ## Usage #### Simple usage sample code
[ "# distilgpt2-base-pretrained-he\n\nA tiny GPT2 based Hebrew text generation model initially trained on a TPUv3-8 which was made avilable to me via the TPU Research Cloud Program. Then was further fine-tuned on GPU.", "## Dataset", "### oscar (unshuffled deduplicated he) - Homepage | Dataset Permalink\n\nThe Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.", "### CC-100 (he) - HomePage\n\nThis corpus comprises of monolingual data for 100+ languages and also includes data for romanized languages. This was constructed using the urls and paragraph indices provided by the CC-Net repository by processing January-December 2018 Commoncrawl snapshots. Each file comprises of documents separated by double-newlines and paragraphs within the same document separated by a newline. The data is generated using the open source CC-Net repository.", "### Misc\n* Hebrew Twitter\n* Wikipedia\n* Various other sources", "## Training\n\n* Done on a TPUv3-8 VM using Huggingface's clm-flax example script <BR>\n* I have made a list of items which might make it easier for other to use this script. The list was posted to This discussion forum\n* Further training was performed on GPU", "## Usage", "#### Simple usage sample code" ]
[ "TAGS\n#transformers #pytorch #tf #jax #coreml #onnx #safetensors #gpt2 #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n", "# distilgpt2-base-pretrained-he\n\nA tiny GPT2 based Hebrew text generation model initially trained on a TPUv3-8 which was made avilable to me via the TPU Research Cloud Program. Then was further fine-tuned on GPU.", "## Dataset", "### oscar (unshuffled deduplicated he) - Homepage | Dataset Permalink\n\nThe Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.", "### CC-100 (he) - HomePage\n\nThis corpus comprises of monolingual data for 100+ languages and also includes data for romanized languages. This was constructed using the urls and paragraph indices provided by the CC-Net repository by processing January-December 2018 Commoncrawl snapshots. Each file comprises of documents separated by double-newlines and paragraphs within the same document separated by a newline. The data is generated using the open source CC-Net repository.", "### Misc\n* Hebrew Twitter\n* Wikipedia\n* Various other sources", "## Training\n\n* Done on a TPUv3-8 VM using Huggingface's clm-flax example script <BR>\n* I have made a list of items which might make it easier for other to use this script. The list was posted to This discussion forum\n* Further training was performed on GPU", "## Usage", "#### Simple usage sample code" ]
[ 76, 61, 3, 68, 114, 15, 65, 3, 6 ]
[ "passage: TAGS\n#transformers #pytorch #tf #jax #coreml #onnx #safetensors #gpt2 #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# distilgpt2-base-pretrained-he\n\nA tiny GPT2 based Hebrew text generation model initially trained on a TPUv3-8 which was made avilable to me via the TPU Research Cloud Program. Then was further fine-tuned on GPU.## Dataset### oscar (unshuffled deduplicated he) - Homepage | Dataset Permalink\n\nThe Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.### CC-100 (he) - HomePage\n\nThis corpus comprises of monolingual data for 100+ languages and also includes data for romanized languages. This was constructed using the urls and paragraph indices provided by the CC-Net repository by processing January-December 2018 Commoncrawl snapshots. Each file comprises of documents separated by double-newlines and paragraphs within the same document separated by a newline. The data is generated using the open source CC-Net repository.### Misc\n* Hebrew Twitter\n* Wikipedia\n* Various other sources## Training\n\n* Done on a TPUv3-8 VM using Huggingface's clm-flax example script <BR>\n* I have made a list of items which might make it easier for other to use this script. The list was posted to This discussion forum\n* Further training was performed on GPU## Usage#### Simple usage sample code" ]
[ -0.057578545063734055, 0.17145727574825287, -0.0035788631066679955, 0.0409325435757637, 0.05102358013391495, 0.02169598452746868, 0.13303397595882416, 0.1486120969057083, -0.015072204172611237, 0.07898803055286407, -0.03144770860671997, 0.005696089472621679, 0.10233204811811447, 0.033646102994680405, 0.05238649249076843, -0.23733440041542053, 0.03559177741408348, -0.07252593338489532, 0.020803069695830345, 0.03733226656913757, 0.07374095916748047, -0.03978566825389862, 0.0594012513756752, -0.04164950177073479, -0.08136048167943954, -0.019574902951717377, -0.06011483818292618, -0.03912464901804924, 0.0816802829504013, 0.08700093626976013, 0.05782861262559891, 0.027391083538532257, 0.046100787818431854, -0.09723225235939026, 0.022002898156642914, 0.10609105229377747, -0.05455320328474045, 0.04395386949181557, 0.15391680598258972, -0.05825646594166756, 0.13003388047218323, -0.10704580694437027, 0.045931413769721985, 0.04819474369287491, -0.09438503533601761, -0.15920855104923248, -0.19674189388751984, 0.04563453793525696, 0.03860866650938988, 0.016876816749572754, -0.001852158922702074, 0.031907182186841965, -0.06049077585339546, 0.08199653774499893, 0.10285749286413193, -0.16904300451278687, -0.0392010435461998, 0.11177180707454681, 0.014202424325048923, 0.1191360279917717, -0.0635429322719574, -0.02818313054740429, -0.029320985078811646, 0.040911514312028885, -0.014476804062724113, 0.00993483979254961, -0.10510600358247757, -0.01290587242692709, -0.11003617197275162, -0.013073637150228024, 0.09853152185678482, -0.02320179156959057, -0.015440803952515125, -0.13676226139068604, -0.055862270295619965, -0.03416232764720917, -0.0219831895083189, 0.0674867033958435, -0.022937793284654617, -0.010974826291203499, 0.07196904718875885, -0.0927157998085022, -0.1097291111946106, 0.015207072719931602, -0.02461644448339939, 0.05869255214929581, 0.029597679153084755, 0.037178151309490204, 0.03578462824225426, 0.10226181894540787, 0.050710029900074005, -0.07997063547372818, -0.0014990741619840264, -0.0033419611863791943, -0.08327703922986984, 0.008296370506286621, -0.03756328299641609, -0.11218835413455963, 0.05040125921368599, 0.09689284861087799, 0.02693302184343338, 0.03549798205494881, -0.06749173253774643, 0.009750086814165115, 0.02947808988392353, 0.1388441026210785, -0.06797049939632416, -0.050307758152484894, 0.08639480173587799, 0.008928013034164906, 0.0532555915415287, -0.028094487264752388, -0.06782379001379013, 0.01640240103006363, 0.017952842637896538, 0.07169211655855179, 0.08002258837223053, 0.03588282689452171, -0.016489187255501747, -0.045083049684762955, 0.16876648366451263, -0.1897662729024887, 0.03697109967470169, 0.02702568657696247, 0.016639037057757378, 0.0644024908542633, 0.05769271031022072, -0.054097678512334824, -0.1143203005194664, 0.037673041224479675, -0.03791925683617592, 0.018939578905701637, -0.04936957731842995, -0.07346638292074203, 0.05877675488591194, -0.1431538164615631, -0.087755486369133, -0.03832366690039635, -0.09585859626531601, -0.07131960988044739, 0.008804894983768463, -0.07109741866588593, -0.0018225323874503374, -0.0421183742582798, 0.021802391856908798, -0.007700942922383547, 0.023645935580134392, -0.021385515108704567, -0.05454267933964729, 0.027695300057530403, -0.12258519232273102, 0.058081869035959244, -0.05133017525076866, 0.026902837678790092, -0.08275038003921509, 0.03463299572467804, -0.12741117179393768, 0.14747390151023865, -0.08999086916446686, 0.018555745482444763, -0.042329173535108566, -0.01967606134712696, -0.028347942978143692, 
-0.025823524221777916, -0.03727736324071884, 0.09667477756738663, -0.23961715400218964, -0.04993833228945732, 0.20194676518440247, -0.14570945501327515, -0.023030655458569527, 0.11224313080310822, -0.019778160378336906, 0.06741432100534439, 0.06878629326820374, 0.1458546221256256, 0.0989842340350151, -0.011841890402138233, -0.07268383353948593, 0.03152278810739517, 0.004712364636361599, 0.11504001915454865, 0.062199365347623825, -0.06903309375047684, 0.11619316041469574, 0.021669045090675354, 0.03225691616535187, -0.03299592807888985, 0.011336703784763813, -0.06846252083778381, 0.0026136054657399654, 0.016400301828980446, -0.06377032399177551, -0.029061345383524895, -0.004942997358739376, -0.033656928688287735, -0.09025800973176956, -0.030323579907417297, 0.09810665994882584, -0.08859622478485107, 0.07537569850683212, -0.03554995730519295, 0.04350195825099945, -0.006271020974963903, 0.033638112246990204, -0.15545029938220978, -0.1620466113090515, 0.04714372009038925, -0.09287463873624802, 0.08363202214241028, -0.04035850614309311, -0.010595384053885937, 0.08665335923433304, -0.03507286310195923, 0.007706128526479006, -0.010604700073599815, 0.0057701729238033295, -0.04630386456847191, -0.13060979545116425, -0.09316178411245346, -0.03873230889439583, 0.14868605136871338, -0.056763824075460434, 0.030096909031271935, 0.10763711482286453, 0.15212583541870117, 0.013246249407529831, -0.03427771478891373, 0.007240816950798035, 0.01779579557478428, -0.02593497559428215, -0.0845465138554573, 0.02093437872827053, 0.020887313410639763, -0.022733302786946297, 0.10386732965707779, -0.11280103772878647, -0.12306531518697739, 0.1055363118648529, 0.043609779328107834, -0.043254315853118896, -0.034135568886995316, -0.027288509532809258, -0.03225093334913254, -0.009685272350907326, -0.04158521443605423, 0.03603605180978775, 0.04270044341683388, 0.0959785208106041, -0.097100630402565, -0.06462202966213226, -0.014794494025409222, 0.005299211014062166, -0.023950351402163506, 0.08542712032794952, -0.019944801926612854, -0.168295219540596, 0.12768152356147766, 0.03515613451600075, 0.06604378670454025, 0.08421242982149124, 0.012730696238577366, -0.076154924929142, -0.0004955535987392068, 0.05124820023775101, 0.047222577035427094, 0.0365636870265007, -0.0058928076177835464, 0.016654986888170242, 0.041063856333494186, 0.037163540720939636, 0.07102034986019135, -0.1300210952758789, 0.04709561541676521, -0.010581465438008308, -0.03557736426591873, 0.05405212938785553, 0.01116747036576271, 0.0328962542116642, 0.09507886320352554, 0.012745813466608524, 0.09405992180109024, -0.00008769724809098989, -0.06597806513309479, -0.08213087171316147, 0.14800037443637848, -0.1613726168870926, -0.22333534061908722, -0.1637340933084488, -0.07515257596969604, -0.07544586807489395, 0.025027746334671974, 0.04652511700987816, -0.03340337052941322, -0.049521852284669876, -0.13369794189929962, 0.05386500805616379, 0.04509621113538742, -0.08747214823961258, -0.09466329962015152, 0.053072743117809296, -0.05386362597346306, -0.1376711130142212, 0.03587006777524948, 0.033645715564489365, -0.09729011356830597, 0.056161195039749146, -0.008785209618508816, 0.030440879985690117, 0.09693750739097595, 0.02935561165213585, -0.011522016488015652, -0.021335316821932793, 0.08675399422645569, -0.061000142246484756, 0.0755837932229042, 0.07924759387969971, -0.04956892132759094, 0.07768155634403229, 0.021689631044864655, 0.03138662874698639, -0.03750191256403923, -0.027941245585680008, 0.018003609031438828, -0.05265022814273834, -0.21721374988555908, 
-0.11965546756982803, -0.07349537312984467, 0.06506730616092682, 0.08066507428884506, 0.06858350336551666, -0.035311609506607056, 0.03609318658709526, -0.11923092603683472, 0.016619589179754257, 0.07384252548217773, 0.0924815833568573, 0.1109759584069252, -0.02165072411298752, 0.038972023874521255, -0.10294920951128006, 0.018937136977910995, 0.11351355165243149, 0.11787685006856918, 0.15602026879787445, -0.055853065103292465, 0.17600902915000916, 0.04049661383032799, 0.11184271425008774, 0.03589112311601639, 0.07672464847564697, 0.014170908369123936, 0.0680604800581932, -0.024149777367711067, -0.10956744104623795, 0.0026597839314490557, 0.0425446480512619, -0.03071986325085163, -0.07816281914710999, 0.024213340133428574, -0.08853184431791306, 0.06577511131763458, 0.28630784153938293, 0.04611079767346382, -0.14603127539157867, -0.05352123826742172, 0.03860169276595116, -0.005477576516568661, -0.07081766426563263, -0.022251544520258904, 0.09983278810977936, -0.11668673902750015, 0.05849246308207512, 0.00008113398507703096, 0.08250347524881363, -0.09567155689001083, -0.05027789995074272, 0.039458099752664566, 0.03721459209918976, -0.018256161361932755, 0.0987325981259346, -0.04412334784865379, 0.07609359920024872, 0.03036765567958355, -0.006406690925359726, -0.07569054514169693, 0.04808611422777176, 0.0028742761351168156, -0.002572012133896351, 0.09084354341030121, 0.028365381062030792, -0.02410411462187767, -0.044656090438365936, -0.1343754231929779, -0.0061096553690731525, 0.10742902755737305, -0.07145687937736511, 0.097586490213871, 0.00604964280501008, -0.02273622713983059, -0.031442876905202866, -0.049935370683670044, 0.022856824100017548, -0.27479031682014465, 0.07648059725761414, -0.07353480905294418, -0.03517414629459381, -0.026920892298221588, -0.04969683289527893, -0.061508454382419586, 0.19317777454853058, -0.1256418228149414, -0.10513336956501007, -0.10299282521009445, -0.018612943589687347, 0.15878057479858398, -0.03539266809821129, 0.04552865028381348, -0.0430418960750103, 0.06955350190401077, -0.046796765178442, -0.1430080384016037, 0.017014944925904274, -0.07720465958118439, -0.15906502306461334, -0.027359530329704285, 0.1904999315738678, 0.016129791736602783, 0.004076479002833366, -0.012141351588070393, 0.05333007499575615, -0.03007739596068859, -0.07320842146873474, -0.02749905362725258, 0.11514973640441895, 0.09658252447843552, 0.17406491935253143, -0.13382191956043243, -0.14662785828113556, -0.06761772185564041, -0.04603682458400726, 0.08119987696409225, 0.1992194801568985, -0.055767886340618134, 0.14050687849521637, 0.09454502165317535, -0.11454439163208008, -0.21778465807437897, -0.051794931292533875, -0.023424696177244186, 0.003191074589267373, 0.014194045215845108, -0.20451322197914124, 0.09746130555868149, 0.12308821827173233, 0.006519147660583258, 0.09744761884212494, -0.2322523295879364, -0.09508050978183746, 0.009573908522725105, -0.007793477736413479, -0.05870724469423294, -0.10415364801883698, -0.05074150487780571, -0.04257885739207268, -0.05673206225037575, 0.13490834832191467, -0.028333308175206184, 0.07487551122903824, -0.0009814349468797445, -0.03340388089418411, 0.028428461402654648, -0.01500904094427824, 0.12753115594387054, 0.007066021207720041, 0.045577019453048706, -0.07371439784765244, 0.044107746332883835, 0.0809803158044815, -0.006619746331125498, 0.0548010915517807, 0.011073192581534386, 0.03444806486368179, -0.01055092178285122, 0.0009347448358312249, -0.0628357082605362, 0.025514042004942894, -0.045232828706502914, 0.0045555997639894485, 
-0.1002030000090599, 0.053680770099163055, 0.08702237904071808, -0.01686156541109085, 0.12134260684251785, -0.012026281096041203, 0.10421524941921234, 0.109764963388443, 0.05283145233988762, 0.023968758061528206, -0.03314082324504852, -0.06946146488189697, -0.03491538017988205, 0.020530296489596367, 0.00044408993562683463, 0.017821088433265686, 0.06419935077428818, 0.003267026739194989, 0.07030729949474335, 0.03395173326134682, -0.14884395897388458, -0.012779475189745426, 0.09577278047800064, -0.139108806848526, -0.08536318689584732, 0.005678150337189436, 0.0014185744803398848, -0.04896142706274986, 0.02151252143085003, 0.11948659271001816, 0.006331372540444136, -0.06033062934875488, 0.0013744725147262216, 0.10201329737901688, 0.006664580665528774, 0.07861591130495071, -0.01710408739745617, -0.012605508789420128, -0.09303924441337585, 0.10984981805086136, 0.17407672107219696, -0.12541009485721588, -0.04687131568789482, 0.22093458473682404, -0.10900776088237762, -0.06297459453344345, -0.08281295746564865, 0.0027742094825953245, -0.06829872727394104, 0.00711515499278903, -0.005222560837864876, -0.0592975839972496, -0.007390871178358793, 0.1336730569601059, 0.012718429788947105, 0.11451636254787445, 0.026837240904569626, -0.008185584098100662, -0.043134741485118866, 0.07521243393421173, -0.021726645529270172, 0.03642256557941437, -0.01568833366036415, 0.10068970918655396, 0.003968433942645788, 0.04275554418563843, -0.011720613576471806, -0.03760762885212898, -0.06724750250577927, -0.06033714860677719, -0.06881047785282135, 0.048461973667144775, -0.08987242728471756, 0.0010147941065952182, -0.03433254733681679, 0.0452146977186203, 0.029042067006230354, -0.0034246796276420355, -0.037727076560258865, -0.0504162572324276, -0.11669737845659256, 0.06543571501970291, -0.10282081365585327, 0.01370780449360609, 0.034420598298311234, -0.09299799799919128, 0.13896051049232483, 0.03312999755144119, -0.004913258831948042, 0.01838102377951145, -0.03550790622830391, -0.051632024347782135, -0.010181700810790062, 0.05811280384659767, 0.009271439164876938, -0.06360560655593872, 0.008357754908502102, 0.008116361685097218, -0.04957662522792816, 0.024589860811829567, 0.11990080028772354, -0.09583620727062225, 0.09591669589281082, -0.08329109102487564, -0.024466311559081078, -0.06310315430164337, 0.0500357523560524, 0.051333360373973846, 0.04339462146162987, 0.06399229168891907, -0.06524690240621567, 0.07576920092105865, -0.16343699395656586, -0.04997730627655983, -0.00710503663867712, -0.057121094316244125, -0.02828482910990715, -0.007915045134723186, 0.05321146175265312, 0.047758400440216064, 0.1981232762336731, 0.023602887988090515, -0.020387161523103714, 0.01862151548266411, 0.0406884104013443, -0.0677335187792778, 0.07190552353858948, 0.05418536812067032, -0.0010416125878691673, -0.022129196673631668, -0.024748073890805244, -0.0294934194535017, -0.07784026116132736, -0.03983987122774124, 0.09837204962968826, 0.06702038645744324, 0.10541900247335434, 0.013736037537455559, 0.030460624024271965, -0.11637528240680695, -0.017380770295858383, 0.012326271273195744, -0.004727010149508715, 0.03681935742497444, -0.07686835527420044, 0.055091194808483124, 0.08382125198841095, -0.12768016755580902, 0.13308903574943542, 0.005311245564371347, -0.05131986737251282, -0.08130098879337311, -0.16589193046092987, -0.04555680230259895, -0.03063317947089672, -0.06506906449794769, -0.10678155720233917, 0.06000187247991562, 0.061906926333904266, 0.019994817674160004, -0.01974250003695488, 0.06459853798151016, -0.10518337041139603, 
-0.09023700654506683, 0.00005728997348342091, 0.04959198087453842, 0.07660616934299469, 0.05489684268832207, -0.014359138906002045, 0.0561976321041584, 0.08245761692523956, 0.04621601477265358, 0.058932434767484665, 0.12750686705112457, 0.05710286647081375, -0.09850208461284637, -0.058394934982061386, 0.023790506646037102, -0.02122076414525509, 0.011504915542900562, 0.20523026585578918, 0.05132600665092468, -0.028363538905978203, 0.02285970002412796, 0.14986127614974976, -0.01801043376326561, -0.06274772435426712, -0.07664986699819565, 0.13840864598751068, -0.012902176938951015, -0.013850869610905647, 0.023483814671635628, -0.09543570131063461, 0.011657126247882843, 0.09077144414186478, 0.21858744323253632, -0.019756684079766273, 0.013068949803709984, 0.002690849592909217, 0.003559199394658208, 0.004020281136035919, 0.08890576660633087, 0.05015893653035164, 0.24549788236618042, -0.020221805199980736, 0.06633075326681137, -0.015162118710577488, 0.005009075626730919, -0.15806715190410614, 0.02703150175511837, -0.0783555805683136, 0.0035886934492737055, -0.0020508861634880304, 0.08376528322696686, -0.009260497987270355, -0.2089611142873764, 0.06565052270889282, -0.15504370629787445, -0.1296423375606537, -0.011743721552193165, 0.006862685550004244, -0.030658036470413208, 0.041531533002853394, 0.018887808546423912, -0.025805888697504997, 0.18879029154777527, -0.016181915998458862, -0.06087817624211311, -0.07815546542406082, 0.05829624831676483, -0.14671605825424194, 0.1509789526462555, 0.005372295156121254, 0.08752699941396713, 0.05130428075790405, 0.0008598066051490605, -0.1183796375989914, 0.05172531679272652, -0.004612329415977001, -0.040462810546159744, 0.04571295902132988, 0.09637537598609924, -0.028751032426953316, 0.02548103779554367, 0.05428515374660492, -0.0031619954388588667, 0.05287514626979828, 0.0655730813741684, -0.00027996188146062195, -0.031554289162158966, 0.027123864740133286, -0.11649622768163681, 0.14252185821533203, 0.17089688777923584, -0.03845314681529999, 0.007129949517548084, -0.028751442208886147, -0.03614727780222893, 0.0014747533714398742, 0.07439179718494415, -0.013795834966003895, -0.11490000784397125, 0.019185779616236687, -0.1364585906267166, 0.07070158421993256, -0.1441023051738739, -0.025338269770145416, 0.010240618139505386, -0.033841900527477264, -0.0780402272939682, 0.10146039724349976, 0.008679973892867565, -0.0030525431502610445, -0.029898788779973984, -0.06302885711193085, 0.009955517947673798, 0.044008076190948486, -0.07165932655334473, -0.05622417479753494 ]
null
null
transformers
# hebrew-bad_wiki-gpt_neo-tiny

## Table of Contents
- [Model Details](#model-details)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation)
- [Environmental Impact](#environmental-impact)
- [How to Get Started With the Model](#how-to-get-started-with-the-model)

## Model Details

**Model Description:** The model developer notes that the model is

> Hebrew nonsense generation model which produces really bad wiki-abstract text.

- **Developed by:** [Doron Adler](https://github.com/Norod)
- **Model Type:** Text Generation
- **Language(s):** Hebrew
- **License:** MIT
- **Resources for more information:**
  - [GitHub Repo](https://github.com/Norod/hebrew-gpt_neo)
  - [HuggingFace Space](https://huggingface.co/spaces/Norod78/Hebrew-GPT-Neo-Small)

## Uses

#### Direct Use

This model can be used for text generation.

#### Misuse and Out-of-scope Use

## Risks, Limitations and Biases

**CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.**

Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).

## Training

#### Training Data

[Hebrew Wikipedia Dump](https://dumps.wikimedia.org/hewiki/latest/) (hewiki abstract) from May 2020

#### Training Procedure

This model was fine-tuned upon [hebrew-gpt_neo-tiny](https://huggingface.co/Norod78/hebrew-gpt_neo-tiny), which was previously trained using [EleutherAI's gpt-neo](https://github.com/EleutherAI/gpt-neo). Fine-tuning on the wiki-abstract text was done using [@minimaxir](https://twitter.com/minimaxir)'s [aitextgen](https://github.com/minimaxir/aitextgen).

## Evaluation

#### Configs

Model configs for hebrew-gpt_neo-tiny are available on the [hebrew-gpt_neo model github](https://github.com/Norod/hebrew-gpt_neo/tree/main/hebrew-gpt_neo-tiny/configs)

* **Activation Function:** gelu
* **Number_Head:** 12
* **Number_Vocab:** 50257
* **Train batch size:** 250
* **Eval batch size:** 64
* **Predict batch size:** 1

## Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). We present the hardware type based on the [associated paper](https://arxiv.org/pdf/2105.09680.pdf).

- **Hardware Type:** [More information needed]
- **Hours used:** Unknown
- **Cloud Provider:** GCP tpu-v8s
- **Compute Region:** europe-west4
- **Carbon Emitted:** [More information needed]

## How to Get Started With the Model

A Google Colab Notebook is also available [here](https://colab.research.google.com/github/Norod/hebrew-gpt_neo/blob/main/hebrew-gpt_neo-tiny/Norod78_hebrew_gpt_neo_tiny_Colab.ipynb)

```
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Norod78/hebrew-bad_wiki-gpt_neo-tiny")

model = AutoModelForCausalLM.from_pretrained("Norod78/hebrew-bad_wiki-gpt_neo-tiny")
```
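The getting-started snippet above only loads the tokenizer and model. A minimal generation sketch follows; the prompt is taken from this card's widget examples, while the sampling settings (`top_k`, `top_p`, `max_length`) are illustrative assumptions rather than values documented for this model.

```python
# Illustrative continuation (sampling settings are assumptions, not author-documented values):
# sample a short "bad wiki abstract" from the loaded model.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "Norod78/hebrew-bad_wiki-gpt_neo-tiny"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "מתמטיקה:"  # one of the widget prompts in this card's metadata
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

output_ids = model.generate(
    input_ids,
    do_sample=True,
    top_k=40,
    top_p=0.92,
    max_length=128,
    pad_token_id=tokenizer.eos_token_id,  # avoid a warning when no pad token is set
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```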
{"language": "he", "license": "mit", "thumbnail": "https://avatars1.githubusercontent.com/u/3617152?norod.jpg", "widget": [{"text": "\u05de\u05ea\u05de\u05d8\u05d9\u05e7\u05d4:"}, {"text": "\u05e2\u05dc\u05d9\u05d9\u05ea \u05d4\u05de\u05db\u05d5\u05e0\u05d5\u05ea"}, {"text": "\u05d5\u05d9\u05e7\u05d9\u05e4\u05d3\u05d9\u05d4 \u05d4\u05e2\u05d1\u05e8\u05d9\u05ea"}, {"text": "\u05d4\u05d0\u05d9\u05e8\u05d5\u05d5\u05d9\u05d6\u05d9\u05d5\u05df \u05d4\u05d5\u05d0"}, {"text": "\u05d3\u05d5\u05d3 \u05d1\u05df-\u05d2\u05d5\u05e8\u05d9\u05d5\u05df \u05d4\u05d9\u05d4"}]}
text-generation
Norod78/hebrew-bad_wiki-gpt_neo-tiny
[ "transformers", "pytorch", "coreml", "safetensors", "gpt_neo", "text-generation", "he", "arxiv:1910.09700", "arxiv:2105.09680", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "1910.09700", "2105.09680" ]
[ "he" ]
TAGS #transformers #pytorch #coreml #safetensors #gpt_neo #text-generation #he #arxiv-1910.09700 #arxiv-2105.09680 #license-mit #autotrain_compatible #endpoints_compatible #region-us
# hebrew-bad_wiki-gpt_neo-tiny ## Table of Contents - Model Details - Uses - Risks, Limitations and Biases - Training - Evaluation - Environmental Impact - How to Get Started With the Model ## Model Details Model Description: The model developer notes that the model is > Hebrew nonsense generation model which produces really bad wiki-abstract text. - Developed by: Doron Adler - Model Type: Text Generation - Language(s): Hebrew - License: MIT - Resources for more information: - GitHub Repo - HuggingFace Space ## Uses #### Direct Use This model can be used for text generation. #### Misuse and Out-of-scope Use ## Risks, Limitations and Biases CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes. Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). ## Training #### Training Data Hebrew Wikipedia Dump (hewiki abstract) from May 2020 #### Training Procedure This model was fine-tuned upon hebrew-gpt_neo-tiny which was previously trained using EleutherAI's gpt-neo. Fine-tuning on the wiki-abstract text was done using @minimaxir's aitextgen. ## Evaluation #### Configs Model configs for hebrew-gpt_neo-tiny are available on the hebrew-gpt_neo model github * Activation Function: gelu * Number_Head: 12 * Number_Vocab: 50257 * Train batch size: 250 * Eval batch size: 64 * Predict batch size: 1 ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). We present the hardware type based on the associated paper. - Hardware Type: [More information needed] - Hours used: Unknown - Cloud Provider: GCP tpu-v8s - Compute Region: europe-west4 - Carbon Emitted: [More information needed] ## How to Get Started With the Model A Google Colab Notebook is also available here
[ "# hebrew-bad_wiki-gpt_neo-tiny", "## Table of Contents\n- Model Details\n- Uses\n- Risks, Limitations and Biases\n- Training\n- Evaluation\n- Environmental Impact\n- How to Get Started With the Model", "## Model Details\nModel Description:\n\nThe model developer notes that the model is \n> Hebrew nonsense generation model which produces really bad wiki-abstract text. \n\n\n- Developed by: Doron Adler\n- Model Type: Text Generation\n- Language(s): Hebrew\n- License: MIT\n- Resources for more information:\n- GitHub Repo\n- HuggingFace Space", "## Uses", "#### Direct Use\n\nThis model can be used for text generation.", "#### Misuse and Out-of-scope Use", "## Risks, Limitations and Biases\nCONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)).", "## Training", "#### Training Data\n Hebrew Wikipedia Dump (hewiki abstract) from May 2020", "#### Training Procedure\n\n\nThis model was fined tuned upon hebrew-gpt_neo-tiny which was previously trained using EleutherAI's gpt-neo. \n\nFine-tuning on the wiki-absract text was done using @minimaxir's aitextgen.", "## Evaluation", "#### Configs\n\nModel configs for the hebrew-gpt_neo-tiny is available on the hebrew-gpt_neo model github \n\n* Activation Function: gelu\n* Number_Head: 12\n* Number_Vocab: 50257\n* Train batch size: 250\n* Eval batch size: 64\n* Predict batch size: 1", "## Environmental Impact\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). We present the hardware type based on the associated paper.\n\n\n- Hardware Type: [More information needed]\n\n- Hours used: Unknown\n\n- Cloud Provider: GCP tpu-v8s\n\n- Compute Region: europe-west4\n\n- Carbon Emitted: [More information needed]", "## How to Get Started With the Model\n\nA Google Colab Notebook is also available here\n\n\n​​" ]
[ "TAGS\n#transformers #pytorch #coreml #safetensors #gpt_neo #text-generation #he #arxiv-1910.09700 #arxiv-2105.09680 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n", "# hebrew-bad_wiki-gpt_neo-tiny", "## Table of Contents\n- Model Details\n- Uses\n- Risks, Limitations and Biases\n- Training\n- Evaluation\n- Environmental Impact\n- How to Get Started With the Model", "## Model Details\nModel Description:\n\nThe model developer notes that the model is \n> Hebrew nonsense generation model which produces really bad wiki-abstract text. \n\n\n- Developed by: Doron Adler\n- Model Type: Text Generation\n- Language(s): Hebrew\n- License: MIT\n- Resources for more information:\n- GitHub Repo\n- HuggingFace Space", "## Uses", "#### Direct Use\n\nThis model can be used for text generation.", "#### Misuse and Out-of-scope Use", "## Risks, Limitations and Biases\nCONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)).", "## Training", "#### Training Data\n Hebrew Wikipedia Dump (hewiki abstract) from May 2020", "#### Training Procedure\n\n\nThis model was fined tuned upon hebrew-gpt_neo-tiny which was previously trained using EleutherAI's gpt-neo. \n\nFine-tuning on the wiki-absract text was done using @minimaxir's aitextgen.", "## Evaluation", "#### Configs\n\nModel configs for the hebrew-gpt_neo-tiny is available on the hebrew-gpt_neo model github \n\n* Activation Function: gelu\n* Number_Head: 12\n* Number_Vocab: 50257\n* Train batch size: 250\n* Eval batch size: 64\n* Predict batch size: 1", "## Environmental Impact\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). We present the hardware type based on the associated paper.\n\n\n- Hardware Type: [More information needed]\n\n- Hours used: Unknown\n\n- Cloud Provider: GCP tpu-v8s\n\n- Compute Region: europe-west4\n\n- Carbon Emitted: [More information needed]", "## How to Get Started With the Model\n\nA Google Colab Notebook is also available here\n\n\n​​" ]
[ 72, 14, 38, 78, 3, 13, 11, 85, 2, 17, 65, 3, 81, 90, 18 ]
[ "passage: TAGS\n#transformers #pytorch #coreml #safetensors #gpt_neo #text-generation #he #arxiv-1910.09700 #arxiv-2105.09680 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# hebrew-bad_wiki-gpt_neo-tiny## Table of Contents\n- Model Details\n- Uses\n- Risks, Limitations and Biases\n- Training\n- Evaluation\n- Environmental Impact\n- How to Get Started With the Model## Model Details\nModel Description:\n\nThe model developer notes that the model is \n> Hebrew nonsense generation model which produces really bad wiki-abstract text. \n\n\n- Developed by: Doron Adler\n- Model Type: Text Generation\n- Language(s): Hebrew\n- License: MIT\n- Resources for more information:\n- GitHub Repo\n- HuggingFace Space## Uses#### Direct Use\n\nThis model can be used for text generation.#### Misuse and Out-of-scope Use## Risks, Limitations and Biases\nCONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)).## Training#### Training Data\n Hebrew Wikipedia Dump (hewiki abstract) from May 2020#### Training Procedure\n\n\nThis model was fined tuned upon hebrew-gpt_neo-tiny which was previously trained using EleutherAI's gpt-neo. \n\nFine-tuning on the wiki-absract text was done using @minimaxir's aitextgen.## Evaluation#### Configs\n\nModel configs for the hebrew-gpt_neo-tiny is available on the hebrew-gpt_neo model github \n\n* Activation Function: gelu\n* Number_Head: 12\n* Number_Vocab: 50257\n* Train batch size: 250\n* Eval batch size: 64\n* Predict batch size: 1" ]
[ -0.07007297873497009, 0.16147129237651825, -0.0026359432376921177, 0.02931816130876541, 0.055209483951330185, -0.004408437293022871, 0.09972809255123138, 0.10219758003950119, 0.02667323872447014, 0.11736134439706802, 0.02981867454946041, 0.0027573690749704838, 0.1160578653216362, 0.12230823189020157, -0.009479790925979614, -0.2003471851348877, 0.04547477513551712, -0.06355476379394531, 0.02764425426721573, 0.0883534774184227, 0.13049952685832977, -0.07421672344207764, 0.06571526825428009, 0.014584331773221493, -0.061261191964149475, -0.00014887956785969436, -0.04047345742583275, -0.03916551545262337, 0.038226380944252014, 0.06633448600769043, 0.04321960359811783, -0.01984710991382599, 0.02819746732711792, -0.22157657146453857, 0.028703438118100166, 0.05487370118498802, 0.01620038039982319, 0.08293002098798752, 0.13589869439601898, 0.018981851637363434, 0.17763188481330872, -0.073893241584301, 0.046901900321245193, 0.07525113970041275, -0.06948694586753845, -0.08612143248319626, -0.15182997286319733, 0.1129031553864479, 0.10538002103567123, 0.08939200639724731, -0.05308506265282631, 0.09890580177307129, -0.043397076427936554, 0.002525614108890295, 0.1664062738418579, -0.11497858911752701, -0.03346847742795944, 0.027026912197470665, -0.0015730239683762193, 0.019296593964099884, -0.13448990881443024, -0.013149688951671124, 0.03889356926083565, 0.042274605482816696, 0.021471930667757988, -0.0006951800896786153, 0.14004334807395935, -0.01003576721996069, -0.1296437829732895, -0.036039870232343674, 0.038087695837020874, 0.03541712090373039, -0.03199922665953636, -0.24287830293178558, -0.004278672393411398, 0.0657591000199318, 0.0123797208070755, -0.07186263054609299, -0.003246631007641554, -0.026174647733569145, 0.09746018797159195, -0.08636440336704254, -0.07187777012586594, 0.005372871644794941, -0.0060339723713696, 0.15053561329841614, 0.04031573608517647, -0.0008140058489516377, -0.01175524853169918, 0.05190066993236542, 0.0031513767316937447, -0.10964743793010712, -0.07569879293441772, -0.08336211740970612, -0.05300391465425491, -0.059548769146203995, 0.004466288723051548, -0.01165654044598341, 0.04584131017327309, 0.14344707131385803, -0.20398271083831787, 0.02874545380473137, 0.01947901025414467, -0.009244157001376152, 0.08224552124738693, 0.08749895542860031, -0.08360430598258972, -0.09169550985097885, 0.049109701067209244, 0.05170602723956108, 0.005062493961304426, 0.012294717133045197, -0.02666286565363407, -0.052764322608709335, 0.04354698583483696, 0.07471619546413422, 0.08333232253789902, 0.006159555166959763, -0.06098124384880066, -0.06639929860830307, 0.19139030575752258, -0.15603692829608917, 0.008884089067578316, 0.03768613189458847, -0.06289023905992508, 0.04255123808979988, -0.03976546600461006, -0.02572615258395672, -0.08326602727174759, 0.06465046852827072, -0.11193370074033737, -0.019248099997639656, -0.07276766747236252, -0.0740136057138443, 0.04219252988696098, 0.02706245891749859, -0.04664624109864235, -0.07780948281288147, -0.17073217034339905, -0.07952859252691269, -0.014349117875099182, -0.08325958251953125, -0.03193913772702217, -0.057891637086868286, -0.08498656749725342, 0.0165549423545599, -0.004057145677506924, 0.0065458654426038265, -0.022606035694479942, 0.02090933918952942, -0.05528976395726204, 0.02184690535068512, 0.10858246684074402, 0.02483268640935421, -0.08138977736234665, 0.04087377339601517, -0.16871321201324463, 0.13582244515419006, -0.0824122205376625, -0.05616656318306923, -0.11416588723659515, -0.04309764504432678, 0.008027194999158382, 
0.008679086342453957, 0.02082907222211361, 0.1794918179512024, -0.16406206786632538, -0.054723069071769714, 0.19816632568836212, -0.1282266527414322, -0.09204430133104324, 0.08013300597667694, -0.049077484756708145, 0.06735232472419739, 0.11225567013025284, 0.06783121824264526, 0.07863637059926987, -0.11966712772846222, -0.11754200607538223, -0.06645020842552185, -0.001743657747283578, 0.17899033427238464, 0.06337440013885498, -0.0895034521818161, 0.10410074889659882, -0.020722804591059685, -0.06972213834524155, 0.028142040595412254, -0.004665344022214413, -0.046183764934539795, 0.02518327534198761, -0.016018714755773544, 0.07883913815021515, -0.032315127551555634, -0.031089719384908676, -0.0032484056428074837, -0.16913847625255585, -0.04843314737081528, 0.08926092088222504, -0.030460869893431664, 0.023315127938985825, -0.08653916418552399, 0.04974814131855965, 0.025172140449285507, 0.019929591566324234, -0.17084212601184845, -0.08799121528863907, 0.018271571025252342, -0.1613275706768036, 0.044885486364364624, -0.008643030188977718, 0.03106742352247238, 0.050956841558218, -0.05472693219780922, -0.013879629783332348, -0.016973091289401054, 0.026172462850809097, -0.054529041051864624, -0.166801318526268, -0.0002952151407953352, -0.04697393998503685, 0.10971032083034515, -0.17966802418231964, -0.0020457652863115072, 0.09948854893445969, 0.13577967882156372, 0.014231652952730656, -0.05077926069498062, 0.02081916481256485, -0.019144609570503235, -0.010521960444748402, -0.10170169919729233, -0.01941503770649433, -0.03748330846428871, -0.050812721252441406, 0.06925146281719208, -0.17357338964939117, -0.13396938145160675, 0.09483836591243744, 0.08724302798509598, -0.15325535833835602, -0.08437570184469223, -0.02507626824080944, -0.04361451789736748, -0.06932743638753891, -0.10696060955524445, 0.1709088832139969, 0.06631514430046082, 0.05038703605532646, -0.07715856283903122, -0.08516699820756912, -0.046408504247665405, 0.009738363325595856, -0.020101264119148254, 0.06189500913023949, -0.08196327090263367, -0.19752585887908936, 0.1365833431482315, 0.10290464013814926, 0.003523790743201971, 0.13358686864376068, 0.025724640116095543, -0.09962598234415054, -0.03323807939887047, 0.005758706480264664, -0.006295311730355024, 0.06362476944923401, 0.02355869486927986, 0.05211995914578438, 0.018377354368567467, 0.009204063564538956, 0.0590183362364769, -0.05375710874795914, 0.04070815443992615, 0.011695821769535542, -0.013346514664590359, 0.039728619158267975, 0.02260608971118927, 0.0385177880525589, 0.10119815170764923, 0.0523509755730629, 0.04053126648068428, 0.007457255385816097, -0.05399007350206375, -0.12327273935079575, 0.17181046307086945, -0.10770080983638763, -0.20939134061336517, -0.09033121168613434, 0.05335862189531326, -0.016845114529132843, -0.0015516635030508041, 0.0006744608981534839, -0.06677200645208359, -0.09699317812919617, -0.09163810312747955, 0.10053340345621109, 0.0573546476662159, -0.09415113925933838, -0.024035129696130753, -0.006787670310586691, -0.000029944792913738638, -0.12468114495277405, 0.006087138317525387, 0.04906029254198074, -0.055401865392923355, -0.006325596943497658, 0.05783027410507202, 0.03114326484501362, 0.09704276919364929, 0.04613451287150383, -0.009532298892736435, -0.02009654976427555, 0.2419854998588562, -0.12377354502677917, 0.08931072056293488, 0.13280068337917328, -0.00980529747903347, 0.08900415897369385, 0.10102313756942749, 0.014755266718566418, -0.030859626829624176, 0.014048025012016296, 0.04924092814326286, -0.007247133646160364, 
-0.22318847477436066, -0.08550643920898438, -0.02256873808801174, -0.08034412562847137, 0.04947063699364662, 0.03999750316143036, 0.08013847470283508, 0.04688947647809982, -0.1479291468858719, -0.03386737033724785, 0.09659133106470108, 0.10193410515785217, 0.09108231961727142, 0.02043812908232212, 0.021044503897428513, -0.0032153059728443623, 0.0029230937361717224, 0.10922656953334808, -0.04380001127719879, 0.24496981501579285, -0.023500243201851845, 0.16430817544460297, 0.05879467353224754, 0.020932191982865334, -0.009092447347939014, 0.014201070182025433, -0.003477981314063072, 0.01813611015677452, -0.015408510342240334, -0.08306513726711273, 0.00230834330432117, 0.10747180879116058, 0.03126836568117142, -0.002287129405885935, 0.051358241587877274, -0.07511444389820099, 0.059346478432416916, 0.10506518185138702, 0.013428139500319958, -0.1433708220720291, -0.03941161558032036, 0.08157176524400711, -0.07904303818941116, -0.061650436371564865, 0.03240499645471573, 0.10601231455802917, -0.1732446253299713, 0.05599053204059601, -0.048427194356918335, 0.10156413167715073, -0.08431423455476761, -0.012567151337862015, -0.026784274727106094, 0.1087498739361763, -0.019722551107406616, 0.10002319514751434, -0.14316335320472717, 0.06298680603504181, 0.019447047263383865, 0.06248018890619278, -0.07789047062397003, 0.056532587856054306, 0.0783291608095169, -0.046059221029281616, 0.12229474633932114, 0.01964227855205536, -0.056018274277448654, -0.13274668157100677, -0.0516616627573967, -0.045013852417469025, 0.04710904508829117, -0.07181946188211441, 0.09739405661821365, -0.007950992323458195, 0.007097064051777124, -0.03677765280008316, -0.0046508051455020905, -0.1717301607131958, -0.18565762042999268, 0.059099167585372925, -0.0675138458609581, 0.07212807983160019, -0.059146683663129807, -0.06255339086055756, -0.020140374079346657, 0.11978080868721008, -0.19354405999183655, -0.141318216919899, -0.1140068918466568, 0.0342225581407547, 0.16619546711444855, -0.10630286484956741, 0.004458087030798197, 0.04412172734737396, 0.1726289540529251, -0.053496796637773514, -0.059642307460308075, -0.0020966713782399893, -0.06608172506093979, -0.16123300790786743, -0.007631653919816017, 0.08780400454998016, 0.16471487283706665, 0.0407199002802372, 0.03147939592599869, 0.05127112939953804, -0.029335692524909973, -0.15951882302761078, -0.0123832942917943, 0.18636144697666168, 0.11745227873325348, 0.05562487617135048, 0.03178376704454422, -0.09208463877439499, -0.12349362671375275, -0.019454780966043472, 0.046264246106147766, 0.21821807324886322, -0.03079395741224289, 0.08586111664772034, 0.08882562816143036, -0.0819912850856781, -0.17613762617111206, -0.022235296666622162, 0.07127546519041061, 0.014866683632135391, 0.07162344455718994, -0.1494625359773636, 0.045988764613866806, 0.06777436286211014, -0.002012520097196102, 0.11407458037137985, -0.1931360512971878, -0.11858560144901276, 0.1209460198879242, 0.008331856690347195, -0.1526685357093811, -0.10755869001150131, -0.08164749294519424, -0.026839904487133026, -0.03550668805837631, 0.1380240023136139, -0.05395960435271263, 0.010924414731562138, 0.01733696088194847, 0.06817062199115753, 0.05097910389304161, -0.03469907492399216, 0.16107723116874695, 0.03094550408422947, 0.03481210395693779, -0.06515854597091675, -0.031732287257909775, 0.06405188143253326, -0.04839014634490013, 0.07047630101442337, -0.0025756198447197676, 0.02520294114947319, -0.07579857856035233, -0.03100874274969101, -0.09834367036819458, 0.052031874656677246, -0.0798473060131073, 
-0.03899235278367996, -0.06369991600513458, 0.11977697908878326, 0.10007655620574951, -0.024299761280417442, 0.047681692987680435, -0.03469022363424301, 0.04503902420401573, 0.15024392306804657, 0.11480201035737991, 0.07740569114685059, -0.10510629415512085, -0.012023720890283585, 0.008793571963906288, 0.06973174214363098, -0.08252646028995514, 0.011181825771927834, 0.04584787040948868, 0.02683665044605732, 0.14061681926250458, -0.019114859402179718, -0.17467935383319855, 0.010069411247968674, 0.018125755712389946, -0.10247103869915009, -0.13663041591644287, -0.06258135288953781, 0.05667685717344284, -0.10357321798801422, -0.09400863200426102, 0.09926880151033401, -0.012475775554776192, -0.05451592430472374, 0.02523145079612732, 0.10039103031158447, 0.01584875024855137, 0.1325024515390396, 0.03747720643877983, 0.04774556681513786, -0.0813273936510086, 0.012197265401482582, 0.061072465032339096, -0.08999872952699661, 0.04209556430578232, 0.11637304723262787, -0.06031190603971481, -0.025540104135870934, -0.05128932744264603, 0.07227864861488342, -0.08250411599874496, -0.027667604386806488, 0.03653949871659279, -0.07259668409824371, 0.010184218175709248, 0.11755065619945526, 0.026215186342597008, 0.047190479934215546, 0.018640456721186638, 0.01826312765479088, -0.022542450577020645, 0.09380271285772324, 0.06099800765514374, 0.02071194164454937, -0.06372896581888199, 0.04331169277429581, -0.006185547914355993, -0.019801383838057518, -0.012621005065739155, -0.007467934861779213, -0.11514714360237122, -0.046948302537202835, -0.18677644431591034, 0.09455227851867676, -0.15631745755672455, 0.00012991092808078974, -0.03385211527347565, -0.01649627834558487, 0.02469182200729847, 0.020401010289788246, -0.036146484315395355, -0.07136538624763489, -0.017782188951969147, 0.10088731348514557, -0.17232218384742737, 0.01594303548336029, 0.07157070189714432, -0.10968933254480362, 0.07588047534227371, 0.021787123754620552, 0.005898179020732641, 0.03985578566789627, -0.09324205666780472, -0.003100777044892311, -0.02466682903468609, 0.017245285212993622, 0.04262308031320572, -0.22462819516658783, -0.01785827986896038, -0.020895231515169144, -0.027214208617806435, 0.00897311419248581, -0.03904164209961891, -0.07973679900169373, 0.02073155902326107, -0.016416363418102264, -0.04235890880227089, -0.06552469730377197, 0.04803403094410896, 0.17125950753688812, -0.01847606711089611, 0.13664239645004272, -0.054013434797525406, 0.09486868232488632, -0.1374230533838272, -0.00773450406268239, 0.03458140417933464, 0.038213904947042465, 0.05076492950320244, 0.021745692938566208, 0.06232747808098793, -0.0029768056701868773, 0.12884657084941864, -0.04639537259936333, -0.0017871352611109614, 0.0766352042555809, 0.06321480125188828, -0.015291846357285976, 0.04603999853134155, -0.010135588236153126, -0.014671145007014275, -0.03010133095085621, -0.011806287802755833, -0.019277626648545265, -0.04257349297404289, -0.15008726716041565, 0.16192744672298431, 0.1278405338525772, 0.02361789159476757, 0.001119369873777032, 0.018848884850740433, -0.08785686641931534, -0.009227975271642208, 0.08577480167150497, 0.006652751937508583, -0.07175332307815552, -0.056453634053468704, 0.1280241757631302, 0.16938626766204834, -0.20502227544784546, 0.0973501056432724, -0.03328859433531761, -0.06367403268814087, -0.08987624198198318, -0.197666198015213, -0.048589229583740234, -0.006694575771689415, 0.016718080267310143, -0.12658198177814484, 0.09237269312143326, 0.1631539762020111, 0.015565660782158375, -0.05626114830374718, 0.09248407930135727, 
-0.024185074493288994, -0.14360500872135162, 0.04419718310236931, 0.0607670359313488, 0.019898712635040283, -0.003951121587306261, 0.029657740145921707, 0.04267892614006996, 0.0464363656938076, 0.07302811741828918, 0.0410425178706646, 0.02092473953962326, 0.0008645746856927872, -0.09668322652578354, -0.06472066044807434, 0.026740703731775284, 0.005864591337740421, 0.02822950854897499, 0.17003318667411804, 0.02885107696056366, -0.005207676440477371, -0.03930676355957985, 0.17490072548389435, -0.03323506936430931, -0.06290391832590103, -0.12503984570503235, 0.04265063628554344, -0.0015699826180934906, 0.01283820066601038, 0.05542512983083725, -0.14189767837524414, 0.002398640150204301, 0.11166474968194962, 0.1280384212732315, -0.060996171087026596, 0.005679498426616192, -0.053561605513095856, 0.007965806871652603, -0.013422472402453423, 0.027035709470510483, 0.00041685369797050953, 0.20937125384807587, -0.050342295318841934, 0.11970699578523636, -0.018987419083714485, -0.04036491736769676, -0.07402178645133972, 0.11617646366357803, -0.016718484461307526, 0.029378753155469894, -0.08873762935400009, 0.11395753175020218, -0.025617163628339767, -0.24719960987567902, -0.017561446875333786, -0.054562076926231384, -0.09916138648986816, 0.002334464807063341, -0.018137937411665916, 0.00818660482764244, 0.072478286921978, 0.06905176490545273, -0.0012754667550325394, 0.14619779586791992, 0.00902653019875288, -0.04210394620895386, -0.054011378437280655, 0.0919468104839325, -0.12218349426984787, 0.2049395591020584, 0.021043792366981506, 0.11864466220140457, 0.10272546857595444, -0.01820053905248642, -0.13738195598125458, 0.013924039900302887, 0.04769871383905411, -0.06324600428342819, 0.04440847411751747, 0.20101268589496613, 0.005365265998989344, 0.07533954083919525, 0.09198129922151566, -0.06470688432455063, 0.029164519160985947, 0.0057310545817017555, -0.050072211772203445, -0.09021800011396408, 0.12850281596183777, -0.07870393246412277, 0.15494981408119202, 0.1479705423116684, -0.07192900031805038, 0.009874552488327026, -0.011984766460955143, -0.0020728164818137884, -0.02193719893693924, 0.1301809549331665, 0.005526165943592787, -0.1672818511724472, 0.004093667957931757, -0.002654721261933446, 0.07840869575738907, -0.21800139546394348, -0.029474202543497086, -0.002931770170107484, -0.01095560472458601, -0.03131154552102089, 0.11636260896921158, -0.0009313637856394053, -0.01806011237204075, -0.01672198995947838, -0.06059712544083595, -0.025180168449878693, 0.08865897357463837, -0.044785816222429276, -0.0119097251445055 ]
null
null
transformers
# hebrew-gpt_neo-small

Hebrew text generation model based on [EleutherAI's gpt-neo](https://github.com/EleutherAI/gpt-neo). Each model in this series was trained on a TPUv3-8, which was made available to me via the [TPU Research Cloud](https://sites.research.google/trc/) Program.

## Datasets

1. An assortment of various Hebrew corpora - I have made it available [here](https://mega.nz/folder/CodSSA4R#4INvMes-56m_WUi7jQMbJQ)
2. oscar / unshuffled_deduplicated_he - [Homepage](https://oscar-corpus.com) | [Dataset Permalink](https://huggingface.co/datasets/viewer/?dataset=oscar&config=unshuffled_deduplicated_he)

   The Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.
3. CC100-Hebrew Dataset - [Homepage](https://metatext.io/datasets/cc100-hebrew)

   Created by Conneau & Wenzek et al. in 2020, CC100-Hebrew is one of the 100 monolingual corpora processed from the January-December 2018 Common Crawl snapshots via the CC-Net repository. The corpus contains about 6.1 GB of Hebrew text.

## Training Config

Available [here](https://github.com/Norod/hebrew-gpt_neo/tree/main/hebrew-gpt_neo-small/configs) <BR>

## Usage

### Google Colab Notebook

Available [here](https://colab.research.google.com/github/Norod/hebrew-gpt_neo/blob/main/hebrew-gpt_neo-small/Norod78_hebrew_gpt_neo_small_Colab.ipynb) <BR>

#### Simple usage sample code

```python
!pip install tokenizers==0.10.2 transformers==4.6.0

from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Norod78/hebrew-gpt_neo-small")
model = AutoModelForCausalLM.from_pretrained("Norod78/hebrew-gpt_neo-small", pad_token_id=tokenizer.eos_token_id)

prompt_text = "אני אוהב שוקולד ועוגות"
max_len = 512
sample_output_num = 3
seed = 1000

import numpy as np
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
n_gpu = torch.cuda.device_count() if torch.cuda.is_available() else 0

print(f"device: {device}, n_gpu: {n_gpu}")

np.random.seed(seed)
torch.manual_seed(seed)
if n_gpu > 0:
    torch.cuda.manual_seed_all(seed)

model.to(device)

encoded_prompt = tokenizer.encode(
    prompt_text, add_special_tokens=False, return_tensors="pt")
encoded_prompt = encoded_prompt.to(device)

if encoded_prompt.size()[-1] == 0:
    input_ids = None
else:
    input_ids = encoded_prompt

print("input_ids = " + str(input_ids))

if input_ids is not None:
    max_len += len(encoded_prompt[0])
    if max_len > 2048:
        max_len = 2048

print("Updated max_len = " + str(max_len))

stop_token = "<|endoftext|>"
new_lines = "\n\n\n"

sample_outputs = model.generate(
    input_ids,
    do_sample=True,
    max_length=max_len,
    top_k=50,
    top_p=0.95,
    num_return_sequences=sample_output_num
)

print(100 * '-' + "\n\t\tOutput\n" + 100 * '-')
for i, sample_output in enumerate(sample_outputs):
    text = tokenizer.decode(sample_output, skip_special_tokens=True)

    # Remove all text after the stop token
    text = text[: text.find(stop_token) if stop_token else None]

    # Remove all text after 3 newlines
    text = text[: text.find(new_lines) if new_lines else None]

    print("\n{}: {}".format(i, text))
    print("\n" + 100 * '-')
```
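The sample code above is intentionally verbose (explicit seeding, device placement, and manual decoding). For a quicker check of the model, the snippet below is a minimal sketch that is not part of the original card: it assumes the standard `transformers` text-generation `pipeline`, which forwards the same sampling arguments (`top_k`, `top_p`, `num_return_sequences`) to `generate` under the hood.

```python
# Minimal sketch (not from the original card): generate Hebrew text with the
# high-level pipeline API instead of calling model.generate() directly.
from transformers import pipeline, set_seed

set_seed(1000)  # for reproducible sampling

generator = pipeline("text-generation", model="Norod78/hebrew-gpt_neo-small")

outputs = generator(
    "אני אוהב שוקולד ועוגות",  # same example prompt as in the card above
    max_length=128,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    num_return_sequences=3,
)

for i, out in enumerate(outputs):
    print(f"{i}: {out['generated_text']}\n" + "-" * 100)
```

Any Hebrew prompt works here; the one above is simply the example used in the card.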
{"language": "he", "license": "mit", "thumbnail": "https://avatars1.githubusercontent.com/u/3617152?norod.jpg", "widget": [{"text": "\u05e2\u05d5\u05d3 \u05d1\u05d9\u05de\u05d9 \u05e7\u05d3\u05dd"}, {"text": "\u05e7\u05d5\u05e8\u05d0\u05d9\u05dd \u05dc\u05d9 \u05d3\u05d5\u05e8\u05d5\u05df \u05d5\u05d0\u05e0\u05d9 \u05de\u05e2\u05d5\u05e0\u05d9\u05d9\u05df \u05dc"}, {"text": "\u05e7\u05d5\u05e8\u05d0\u05d9\u05dd \u05dc\u05d9 \u05d0\u05d9\u05e6\u05d9\u05e7 \u05d5\u05d0\u05e0\u05d9 \u05d7\u05d5\u05e9\u05d1 \u05e9"}, {"text": "\u05d4\u05d7\u05ea\u05d5\u05dc \u05e9\u05dc\u05da \u05de\u05d0\u05d5\u05d3 \u05d7\u05de\u05d5\u05d3 \u05d5"}]}
text-generation
Norod78/hebrew-gpt_neo-small
[ "transformers", "pytorch", "jax", "onnx", "safetensors", "gpt_neo", "text-generation", "he", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "he" ]
TAGS #transformers #pytorch #jax #onnx #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
# hebrew-gpt_neo-small Hebrew text generation model based on EleutherAI's gpt-neo. Each model in this series was trained on a TPUv3-8, which was made available to me via the TPU Research Cloud Program. ## Datasets 1. An assortment of various Hebrew corpora - I have made it available here 2. oscar / unshuffled_deduplicated_he - Homepage | Dataset Permalink The Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture. 3. CC100-Hebrew Dataset Homepage Created by Conneau & Wenzek et al. in 2020, CC100-Hebrew is one of the 100 monolingual corpora processed from the January-December 2018 Common Crawl snapshots via the CC-Net repository. The corpus contains about 6.1 GB of Hebrew text. ## Training Config Available here <BR> ## Usage ### Google Colab Notebook Available here <BR> #### Simple usage sample code
[ "# hebrew-gpt_neo-small\n\nHebrew text generation model based on EleutherAI's gpt-neo. Each was trained on a TPUv3-8 which was made avilable to me via the TPU Research Cloud Program.", "## Datasets\n\n1. An assortment of various Hebrew corpuses - I have made it available here\n\n\n2. oscar / unshuffled_deduplicated_he - Homepage | Dataset Permalink\n\nThe Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.\n\n3. CC100-Hebrew Dataset Homepage \n\nCreated by Conneau & Wenzek et al. at 2020, the CC100-Hebrew This dataset is one of the 100 corpora of monolingual data that was processed from the January-December 2018 Commoncrawl snapshots from the CC-Net repository. The size of this corpus is 6.1G., in Hebrew language.", "## Training Config\n\nAvailable here <BR>", "## Usage", "### Google Colab Notebook\n\nAvailable here <BR>", "#### Simple usage sample code" ]
[ "TAGS\n#transformers #pytorch #jax #onnx #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "# hebrew-gpt_neo-small\n\nHebrew text generation model based on EleutherAI's gpt-neo. Each was trained on a TPUv3-8 which was made avilable to me via the TPU Research Cloud Program.", "## Datasets\n\n1. An assortment of various Hebrew corpuses - I have made it available here\n\n\n2. oscar / unshuffled_deduplicated_he - Homepage | Dataset Permalink\n\nThe Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.\n\n3. CC100-Hebrew Dataset Homepage \n\nCreated by Conneau & Wenzek et al. at 2020, the CC100-Hebrew This dataset is one of the 100 corpora of monolingual data that was processed from the January-December 2018 Commoncrawl snapshots from the CC-Net repository. The size of this corpus is 6.1G., in Hebrew language.", "## Training Config\n\nAvailable here <BR>", "## Usage", "### Google Colab Notebook\n\nAvailable here <BR>", "#### Simple usage sample code" ]
[ 62, 55, 175, 9, 3, 11, 6 ]
[ "passage: TAGS\n#transformers #pytorch #jax #onnx #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n# hebrew-gpt_neo-small\n\nHebrew text generation model based on EleutherAI's gpt-neo. Each was trained on a TPUv3-8 which was made avilable to me via the TPU Research Cloud Program.## Datasets\n\n1. An assortment of various Hebrew corpuses - I have made it available here\n\n\n2. oscar / unshuffled_deduplicated_he - Homepage | Dataset Permalink\n\nThe Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.\n\n3. CC100-Hebrew Dataset Homepage \n\nCreated by Conneau & Wenzek et al. at 2020, the CC100-Hebrew This dataset is one of the 100 corpora of monolingual data that was processed from the January-December 2018 Commoncrawl snapshots from the CC-Net repository. The size of this corpus is 6.1G., in Hebrew language.## Training Config\n\nAvailable here <BR>## Usage### Google Colab Notebook\n\nAvailable here <BR>#### Simple usage sample code" ]
[ -0.08637609332799911, 0.24672073125839233, -0.0019427603110671043, 0.039984602481126785, 0.002198815578594804, 0.03209085017442703, 0.2239653319120407, 0.09370202571153641, -0.0362972766160965, 0.05948614329099655, 0.03322739899158478, -0.05272837355732918, 0.10092636942863464, 0.046922922134399414, 0.06779618561267853, -0.12098982185125351, -0.03498724848031998, -0.025912774726748466, -0.052584223449230194, 0.037066586315631866, 0.06277870386838913, -0.055713217705488205, 0.08185028284788132, -0.059664465487003326, -0.0778227299451828, 0.009933198802173138, -0.06167855113744736, -0.05080905556678772, 0.056462932378053665, 0.07028793543577194, 0.07778442651033401, -0.008448629640042782, 0.042580246925354004, -0.14517903327941895, 0.018349111080169678, 0.06800027936697006, -0.03169897943735123, 0.04562411457300186, 0.09175428748130798, -0.05462666228413582, 0.2132241427898407, -0.1776503324508667, -0.07903017103672028, 0.059399962425231934, -0.08469787985086441, -0.09420114755630493, -0.1431194245815277, 0.06725718080997467, -0.005866583902388811, 0.07885140925645828, 0.0019513755105435848, 0.10414045304059982, -0.007580675650388002, 0.050492603331804276, 0.05302158370614052, -0.16448678076267242, -0.06740260124206543, 0.15029887855052948, -0.05635906755924225, 0.1337827891111374, -0.08885021507740021, -0.028680013492703438, 0.045455340296030045, 0.018452484160661697, -0.02643139660358429, -0.046650391072034836, -0.10073444247245789, -0.031239349395036697, -0.07437928020954132, -0.061280906200408936, 0.16596034169197083, -0.021986426785588264, 0.007899186573922634, -0.08329340070486069, -0.07693180441856384, 0.004325079265981913, 0.06697657704353333, 0.08172028511762619, 0.0030627287924289703, -0.019079072400927544, -0.04348592832684517, -0.13794979453086853, -0.12141069769859314, 0.03613799065351486, -0.02352471649646759, 0.0490485243499279, 0.004815034568309784, 0.06372786313295364, 0.026790065690875053, 0.059147316962480545, 0.0685615986585617, -0.13896101713180542, -0.010208137333393097, -0.03393334895372391, -0.009768101386725903, -0.03604616969823837, -0.00618584081530571, -0.08219805359840393, 0.040211185812950134, 0.10382269322872162, -0.02456614375114441, 0.003743615699931979, -0.019426755607128143, -0.00621084775775671, 0.017374727874994278, 0.14068496227264404, -0.004494063556194305, -0.12034185975790024, 0.04876289144158363, -0.023789621889591217, 0.07384838908910751, 0.031724363565444946, -0.07085387408733368, -0.02162107825279236, -0.01685531623661518, 0.029862718656659126, -0.004684989806264639, 0.04844333231449127, 0.10002684593200684, -0.05742812529206276, 0.15804028511047363, -0.13058623671531677, -0.016620244830846786, 0.05061620473861694, -0.022056302055716515, 0.006832631770521402, 0.001099268440157175, -0.043569374829530716, -0.09527802467346191, -0.020709693431854248, -0.04345846548676491, 0.004719479940831661, -0.04145935922861099, -0.09133411198854446, 0.05388735234737396, -0.11497732996940613, -0.01301558967679739, -0.15652920305728912, -0.10063280165195465, -0.04247498884797096, 0.0194847472012043, 0.0038951076567173004, 0.04248769208788872, -0.07453003525733948, -0.0013838190352544188, -0.010383320972323418, -0.02181808277964592, -0.015455937013030052, -0.04738371819257736, 0.040234096348285675, -0.0908384844660759, 0.021274060010910034, -0.07095571607351303, 0.0046532670967280865, -0.13639625906944275, -0.011543860659003258, -0.0569303035736084, 0.07037877291440964, -0.07935637980699539, 0.04198053479194641, -0.12281283736228943, -0.028205834329128265, 
0.010988965630531311, -0.04590195417404175, 0.0043930853717029095, 0.18555916845798492, -0.2287726253271103, -0.06123129650950432, 0.1954602748155594, -0.10933943092823029, -0.06890978664159775, 0.08879509568214417, -0.02056770958006382, 0.06329230219125748, 0.09900180995464325, 0.17707878351211548, 0.11482306569814682, -0.05221929773688316, -0.18443553149700165, 0.014539234340190887, 0.06202295422554016, 0.025459514930844307, 0.033265385776758194, -0.044562872499227524, 0.11256396770477295, -0.016851790249347687, 0.0038390422705560923, 0.04437878355383873, 0.03152518719434738, -0.08274389803409576, 0.02194625698029995, -0.0540606752038002, -0.09662089496850967, 0.01972036063671112, 0.022238830104470253, -0.018822554498910904, -0.04267444834113121, -0.08581890910863876, 0.0909394919872284, -0.06863508373498917, 0.06090208515524864, -0.013764388859272003, 0.00014522462151944637, -0.07251102477312088, 0.010234414599835873, -0.06888297200202942, -0.07647529989480972, 0.04793013259768486, -0.06653033196926117, 0.09827938675880432, -0.03811819106340408, 0.08825800567865372, 0.04513222351670265, -0.04029576852917671, -0.018600767478346825, 0.01782955788075924, -0.022192729637026787, -0.062061332166194916, -0.08600480854511261, -0.034112464636564255, -0.005445314105600119, 0.12315333634614944, -0.10927870124578476, -0.013676411472260952, 0.042397819459438324, 0.124568872153759, 0.03407195955514908, -0.05133328586816788, 0.023797588422894478, -0.01301575917750597, -0.04059036448597908, -0.1331714540719986, -0.019293904304504395, 0.02230743318796158, -0.08781280368566513, 0.048578593879938126, -0.09896133095026016, -0.08030787110328674, 0.1338200718164444, 0.06177593022584915, -0.005081936717033386, -0.08544313162565231, -0.0557093545794487, -0.024210138246417046, 0.02434513159096241, -0.003111412515863776, 0.13882286846637726, -0.004139439202845097, 0.079193115234375, -0.0649566724896431, -0.0358467623591423, -0.009680595248937607, 0.027052121236920357, 0.0005989158526062965, 0.0740954577922821, 0.04138763248920441, -0.12903118133544922, 0.13909168541431427, 0.17000050842761993, -0.0020724362693727016, 0.16178327798843384, -0.017463847994804382, -0.07501435279846191, 0.02295367419719696, 0.011233216151595116, 0.02870965749025345, 0.11623956263065338, -0.01007402129471302, -0.01841847226023674, 0.001273512956686318, 0.00043486227514222264, 0.05165643244981766, -0.05957970395684242, 0.0478878878057003, -0.034372225403785706, -0.0414697490632534, 0.08178456872701645, -0.002196228364482522, -0.028583211824297905, 0.0650937631726265, -0.00489974906668067, 0.03866194933652878, -0.04229796305298805, -0.05743495747447014, -0.06309862434864044, 0.14392037689685822, -0.12665611505508423, -0.10692377388477325, -0.07805948704481125, -0.03763473033905029, -0.06315319240093231, 0.04203129932284355, 0.03581578657031059, -0.052937742322683334, -0.048687249422073364, -0.09285484999418259, 0.06523627787828445, 0.026120789349079132, -0.056001726537942886, -0.09237857162952423, 0.007628770545125008, -0.0498100183904171, -0.1543511003255844, 0.03918546438217163, 0.001256829360499978, -0.18222938477993011, 0.040185872465372086, -0.025329982861876488, 0.030321791768074036, 0.05649750307202339, 0.05996004119515419, -0.011871199123561382, -0.04206905886530876, 0.1545524150133133, -0.09488045424222946, 0.06532268226146698, 0.004099186975508928, 0.017372550442814827, 0.007377853617072105, 0.08780932426452637, 0.005818062461912632, -0.024442920461297035, -0.023280909284949303, -0.008556428365409374, -0.03804594650864601, 
-0.25912293791770935, -0.12492174655199051, -0.05337689071893692, 0.03927317261695862, 0.10745427012443542, 0.05557534471154213, 0.009626101702451706, 0.08567435294389725, -0.1462015062570572, 0.09976338595151901, 0.030285222455859184, 0.07034596800804138, -0.010136468335986137, 0.027890142053365707, -0.03984164446592331, -0.08172549307346344, -0.017798179760575294, 0.11758126318454742, 0.14175532758235931, 0.20016932487487793, -0.04617768153548241, 0.22938191890716553, 0.0041477009654045105, 0.08485029637813568, -0.039710041135549545, 0.051148220896720886, 0.03926343470811844, 0.08924869447946548, -0.020479604601860046, -0.13646985590457916, -0.033289194107055664, 0.11027215421199799, 0.02821289747953415, -0.02022608555853367, 0.07488615065813065, -0.08628533035516739, 0.08012082427740097, 0.19521568715572357, 0.09332733601331711, -0.14331868290901184, -0.0621035099029541, 0.07965069264173508, 0.0026148823089897633, -0.05970146507024765, 0.041615430265665054, 0.1302625834941864, -0.07088243216276169, 0.06817762553691864, 0.0026878861244767904, 0.07629615068435669, -0.10226667672395706, -0.013021856546401978, 0.014767508022487164, -0.019142305478453636, -0.010655217804014683, 0.08447813987731934, -0.20228488743305206, 0.14114710688591003, 0.08424222469329834, 0.05055610090494156, -0.0941205620765686, 0.006658162456005812, 0.02640000358223915, -0.0639132410287857, 0.13498073816299438, 0.05970268324017525, -0.04799741134047508, -0.04561774805188179, -0.15846017003059387, 0.025682542473077774, 0.07754049450159073, -0.054657407104969025, 0.08620937913656235, 0.06335385143756866, -0.007797311060130596, -0.06003906577825546, 0.0033474937081336975, -0.08020444959402084, -0.20966817438602448, 0.030606403946876526, 0.03259303420782089, -0.03103894367814064, -0.02719169296324253, -0.024466514587402344, -0.027125714346766472, 0.17398671805858612, -0.18685562908649445, -0.10588358342647552, -0.06816498190164566, -0.002484511351212859, 0.13528423011302948, -0.08515419065952301, 0.0009788101306185126, 0.012258767150342464, 0.010486815124750137, -0.08408308029174805, -0.14630746841430664, -0.00633610924705863, -0.05794529244303703, -0.08596435189247131, -0.03546182066202164, 0.16841229796409607, 0.0639786347746849, 0.038758568465709686, -0.0008477599476464093, 0.035018645226955414, -0.02241845801472664, -0.08463454246520996, -0.0053476993925869465, 0.1359589844942093, 0.07793240994215012, 0.08137956261634827, 0.0004061925283167511, -0.1491093933582306, -0.09051699936389923, -0.00860603153705597, 0.03304212540388107, 0.14844655990600586, -0.025619475170969963, 0.044730667024850845, 0.11848785728216171, -0.1118801012635231, -0.1830846220254898, -0.0375438891351223, -0.008472947403788567, 0.0042823622934520245, -0.07763566821813583, -0.20395387709140778, 0.07324625551700592, 0.04509018361568451, 0.005735605955123901, 0.17371097207069397, -0.21528518199920654, -0.07688897103071213, 0.058184877038002014, -0.007552553899586201, -0.021130653098225594, -0.14215867221355438, -0.07943930476903915, -0.015014390461146832, -0.03961772099137306, 0.11650799214839935, -0.12847504019737244, 0.05709664896130562, 0.010362991131842136, 0.022601738572120667, 0.032429032027721405, -0.060860369354486465, 0.06867033988237381, 0.13231061398983002, 0.04132584109902382, -0.05993293598294258, 0.0008407043642364442, 0.10082537680864334, -0.029011400416493416, 0.02710644342005253, -0.022351836785674095, 0.026004517450928688, -0.0972229465842247, 0.02277361787855625, -0.06418653577566147, 0.10208591818809509, -0.04785052314400673, 
-0.07856684923171997, -0.08122320473194122, 0.09349421411752701, 0.10253036022186279, 0.027617907151579857, 0.170961394906044, 0.045732393860816956, 0.04749986529350281, 0.09664379060268402, 0.07634000480175018, 0.06135138124227524, 0.03835170343518257, -0.023881996050477028, -0.019241107627749443, 0.08440811187028885, -0.06641139090061188, -0.019290506839752197, 0.08828066289424896, 0.012110107578337193, 0.019782666116952896, -0.03019033744931221, -0.17957597970962524, 0.004262045491486788, 0.047144751995801926, -0.15187329053878784, 0.018800564110279083, -0.05146913230419159, 0.03419605270028114, -0.026395592838525772, 0.04887675493955612, 0.13626106083393097, -0.06644223630428314, -0.035586509853601456, -0.02797763980925083, 0.02975725382566452, 0.014665721915662289, 0.09389936923980713, -0.0034478381276130676, -0.024598952382802963, -0.08438801765441895, 0.08548176288604736, 0.16659347712993622, -0.13569477200508118, -0.021631989628076553, 0.16208061575889587, -0.10507158190011978, -0.04741604998707771, -0.03554248809814453, -0.007488733623176813, -0.09091487526893616, -0.0021854634396731853, 0.04764934256672859, -0.0301520936191082, -0.06182941794395447, 0.14407262206077576, 0.02545919269323349, 0.08613789081573486, 0.03339824825525284, 0.02242460660636425, -0.04519110172986984, 0.05638531595468521, -0.03660672903060913, 0.026032721623778343, 0.009742999449372292, 0.08261504024267197, -0.0160027127712965, 0.022584402933716774, 0.0107216602191329, 0.008299624547362328, -0.09159942716360092, -0.04775157943367958, -0.08140971511602402, 0.06064778193831444, -0.08967325091362, 0.010329879820346832, -0.04432230815291405, 0.01825067214667797, -0.019694779068231583, -0.002549723256379366, -0.011752352118492126, -0.0032172633800655603, -0.05881175398826599, 0.07086464762687683, -0.11183948069810867, -0.03489325940608978, -0.011693744920194149, -0.08072992414236069, 0.09662594646215439, 0.04122718423604965, 0.03733140602707863, 0.042946673929691315, -0.0706864595413208, -0.041821062564849854, -0.01754535362124443, 0.07022040337324142, 0.02613377384841442, -0.014691904187202454, -0.012636122293770313, 0.017626825720071793, 0.024421999230980873, -0.010445347055792809, -0.009183724410831928, -0.034245219081640244, 0.04611792415380478, -0.023576389998197556, 0.023946791887283325, -0.06201476976275444, 0.054326992481946945, 0.05751824378967285, 0.006546959746629, 0.054418593645095825, -0.08165188133716583, 0.06731826812028885, -0.12770414352416992, -0.002523765666410327, 0.013723030686378479, -0.0703790932893753, -0.12027476727962494, 0.005600122269243002, 0.07453847676515579, 0.018924672156572342, 0.17669248580932617, -0.009106086567044258, 0.013103704899549484, 0.01612776517868042, -0.008282474242150784, -0.09950796514749527, 0.005097631830722094, 0.14262576401233673, 0.02578093484044075, -0.012550769373774529, -0.0040208701975643635, -0.03586553782224655, -0.027744753286242485, 0.012758535332977772, 0.12077628821134567, 0.1316956728696823, 0.11722072213888168, 0.07339361310005188, 0.007954847998917103, -0.13776275515556335, 0.02442598156630993, -0.0507805198431015, 0.03882431983947754, 0.037827134132385254, -0.07163199782371521, 0.02133859321475029, 0.08931440860033035, -0.14706948399543762, 0.013068961910903454, -0.007709776051342487, -0.03642274811863899, -0.06372445076704025, -0.14963337779045105, -0.04561074450612068, 0.029956094920635223, 0.005686272867023945, -0.0938350111246109, 0.017781035974621773, 0.035511791706085205, 0.050772249698638916, -0.03422568738460541, 0.10750404745340347, 
0.006203377153724432, -0.0954969972372055, 0.0447947233915329, 0.04755919426679611, 0.030167486518621445, 0.007876754738390446, -0.029024336487054825, -0.014371595345437527, 0.016781797632575035, 0.034264061599969864, 0.03749456629157066, 0.12782439589500427, 0.05874912068247795, -0.03414629027247429, -0.07658478617668152, -0.03226872906088829, 0.017915479838848114, 0.04191989079117775, 0.19926051795482635, 0.05121467262506485, 0.0026599587872624397, 0.01611676998436451, 0.10083337873220444, -0.029500070959329605, -0.07166173309087753, -0.042213182896375656, 0.13851293921470642, -0.020916881039738655, -0.02976449951529503, 0.010189823806285858, -0.09685345739126205, 0.016521863639354706, 0.1616673767566681, 0.28242796659469604, -0.017180679365992546, 0.002529689110815525, -0.013687115162611008, 0.0006161503260955215, -0.01409160066395998, 0.07850614190101624, -0.001159854233264923, 0.18620723485946655, -0.038509391248226166, -0.016171738505363464, 0.05092363804578781, -0.0019474352011457086, -0.10439960658550262, 0.07274460047483444, -0.08024750649929047, -0.012522772885859013, 0.013217196799814701, 0.05775061994791031, -0.06338334083557129, -0.2024810016155243, 0.026805786415934563, -0.11502312123775482, -0.07387053966522217, 0.02945488505065441, 0.024037418887019157, 0.07457251101732254, 0.0411016009747982, 0.002127366606146097, -0.009564668871462345, 0.15406468510627747, -0.023188013583421707, -0.12973450124263763, -0.05248292535543442, 0.0712757259607315, -0.20266646146774292, 0.22593940794467926, 0.004018248058855534, 0.07849753648042679, 0.03226413577795029, -0.0023091041948646307, -0.12783218920230865, 0.025466248393058777, 0.06132085621356964, -0.06453689932823181, -0.003232628107070923, 0.10201440751552582, -0.057204727083444595, 0.13751855492591858, 0.056427642703056335, 0.001213785377331078, 0.029267923906445503, 0.19913312792778015, 0.017349569126963615, -0.09179779142141342, 0.05136331170797348, -0.09314028173685074, 0.1492318958044052, 0.11986593157052994, -0.03597667068243027, 0.0054999664425849915, -0.021046346053481102, -0.04777473211288452, 0.02593092992901802, 0.07436054199934006, -0.01537355501204729, -0.06145206466317177, -0.03896114230155945, -0.22358040511608124, 0.06228295713663101, -0.1255263388156891, 0.040816206485033035, -0.03834836557507515, -0.02940245531499386, -0.07600858807563782, 0.1166187971830368, -0.016497528180480003, -0.01673438958823681, -0.02676696702837944, 0.06371405720710754, -0.01861339621245861, 0.006820634938776493, -0.10941442102193832, -0.10932396352291107 ]
null
null
transformers
# hebrew-gpt_neo-tiny

Hebrew text generation model based on [EleutherAI's gpt-neo](https://github.com/EleutherAI/gpt-neo). Each model in this series was trained on a TPUv3-8, which was made available to me via the [TPU Research Cloud](https://sites.research.google/trc/) Program.

## Datasets

1. An assortment of various Hebrew corpora - I have made it available [here](https://mega.nz/folder/CodSSA4R#4INvMes-56m_WUi7jQMbJQ)
2. oscar / unshuffled_deduplicated_he - [Homepage](https://oscar-corpus.com) | [Dataset Permalink](https://huggingface.co/datasets/viewer/?dataset=oscar&config=unshuffled_deduplicated_he)

   The Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.

## Training Config

Available [here](https://github.com/Norod/hebrew-gpt_neo/tree/main/hebrew-gpt_neo-tiny/configs) <BR>

## Usage

### Google Colab Notebook

Available [here](https://colab.research.google.com/github/Norod/hebrew-gpt_neo/blob/main/hebrew-gpt_neo-tiny/Norod78_hebrew_gpt_neo_tiny_Colab.ipynb) <BR>

#### Simple usage sample code

```python
!pip install tokenizers==0.10.2 transformers==4.6.0

from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Norod78/hebrew-gpt_neo-tiny")
model = AutoModelForCausalLM.from_pretrained("Norod78/hebrew-gpt_neo-tiny", pad_token_id=tokenizer.eos_token_id)

prompt_text = "אני אוהב שוקולד ועוגות"
max_len = 512
sample_output_num = 3
seed = 1000

import numpy as np
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
n_gpu = torch.cuda.device_count() if torch.cuda.is_available() else 0

print(f"device: {device}, n_gpu: {n_gpu}")

np.random.seed(seed)
torch.manual_seed(seed)
if n_gpu > 0:
    torch.cuda.manual_seed_all(seed)

model.to(device)

encoded_prompt = tokenizer.encode(
    prompt_text, add_special_tokens=False, return_tensors="pt")
encoded_prompt = encoded_prompt.to(device)

if encoded_prompt.size()[-1] == 0:
    input_ids = None
else:
    input_ids = encoded_prompt

print("input_ids = " + str(input_ids))

if input_ids is not None:
    max_len += len(encoded_prompt[0])
    if max_len > 1024:
        max_len = 1024

print("Updated max_len = " + str(max_len))

stop_token = "<|endoftext|>"
new_lines = "\n\n\n"

sample_outputs = model.generate(
    input_ids,
    do_sample=True,
    max_length=max_len,
    top_k=50,
    top_p=0.95,
    num_return_sequences=sample_output_num
)

print(100 * '-' + "\n\t\tOutput\n" + 100 * '-')
for i, sample_output in enumerate(sample_outputs):
    text = tokenizer.decode(sample_output, skip_special_tokens=True)

    # Remove all text after the stop token
    text = text[: text.find(stop_token) if stop_token else None]

    # Remove all text after 3 newlines
    text = text[: text.find(new_lines) if new_lines else None]

    print("\n{}: {}".format(i, text))
    print("\n" + 100 * '-')
```
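The Datasets section above points at the OSCAR Hebrew split through the dataset viewer permalink. As a hedged illustration that is not part of the original card, the sketch below streams a few documents from that split with the Hugging Face `datasets` library; the config name `unshuffled_deduplicated_he` is taken from the permalink, and exact loading behavior can vary between `datasets` releases (older, script-based releases such as 2.x are assumed here).

```python
# Minimal sketch (not from the original card): peek at the OSCAR Hebrew split
# referenced in the Datasets section, streamed so nothing large is downloaded.
# Assumes a `datasets` release that still loads the script-based "oscar" dataset.
from datasets import load_dataset

oscar_he = load_dataset("oscar", "unshuffled_deduplicated_he", split="train", streaming=True)

for i, example in enumerate(oscar_he):
    print(example["text"][:80])  # first 80 characters of each document
    if i == 2:
        break
```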
{"language": "he", "license": "mit", "thumbnail": "https://avatars1.githubusercontent.com/u/3617152?norod.jpg", "widget": [{"text": "\u05e2\u05d5\u05d3 \u05d1\u05d9\u05de\u05d9 \u05e7\u05d3\u05dd"}, {"text": "\u05e7\u05d5\u05e8\u05d0\u05d9\u05dd \u05dc\u05d9 \u05d3\u05d5\u05e8\u05d5\u05df \u05d5\u05d0\u05e0\u05d9 \u05de\u05e2\u05d5\u05e0\u05d9\u05d9\u05df \u05dc"}, {"text": "\u05e7\u05d5\u05e8\u05d0\u05d9\u05dd \u05dc\u05d9 \u05d0\u05d9\u05e6\u05d9\u05e7 \u05d5\u05d0\u05e0\u05d9 \u05d7\u05d5\u05e9\u05d1 \u05e9"}, {"text": "\u05d4\u05d7\u05ea\u05d5\u05dc \u05e9\u05dc\u05da \u05de\u05d0\u05d5\u05d3 \u05d7\u05de\u05d5\u05d3 \u05d5"}]}
text-generation
Norod78/hebrew-gpt_neo-tiny
[ "transformers", "pytorch", "jax", "onnx", "safetensors", "gpt_neo", "text-generation", "he", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "he" ]
TAGS #transformers #pytorch #jax #onnx #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
# hebrew-gpt_neo-tiny Hebrew text generation model based on EleutherAI's gpt-neo. Each model in this series was trained on a TPUv3-8, which was made available to me via the TPU Research Cloud Program. ## Datasets 1. An assortment of various Hebrew corpora - I have made it available here 2. oscar / unshuffled_deduplicated_he - Homepage | Dataset Permalink The Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture. ## Training Config Available here <BR> ## Usage ### Google Colab Notebook Available here <BR> #### Simple usage sample code
[ "# hebrew-gpt_neo-tiny\n\nHebrew text generation model based on EleutherAI's gpt-neo. Each was trained on a TPUv3-8 which was made avilable to me via the TPU Research Cloud Program.", "## Datasets\n\n1. An assortment of various Hebrew corpuses - I have made it available here\n\n\n2. oscar / unshuffled_deduplicated_he - Homepage | Dataset Permalink\n\nThe Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.", "## Training Config\n\nAvailable here <BR>", "## Usage", "### Google Colab Notebook\n\nAvailable here <BR>", "#### Simple usage sample code" ]
[ "TAGS\n#transformers #pytorch #jax #onnx #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "# hebrew-gpt_neo-tiny\n\nHebrew text generation model based on EleutherAI's gpt-neo. Each was trained on a TPUv3-8 which was made avilable to me via the TPU Research Cloud Program.", "## Datasets\n\n1. An assortment of various Hebrew corpuses - I have made it available here\n\n\n2. oscar / unshuffled_deduplicated_he - Homepage | Dataset Permalink\n\nThe Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.", "## Training Config\n\nAvailable here <BR>", "## Usage", "### Google Colab Notebook\n\nAvailable here <BR>", "#### Simple usage sample code" ]
[ 62, 54, 90, 9, 3, 11, 6 ]
[ "passage: TAGS\n#transformers #pytorch #jax #onnx #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n# hebrew-gpt_neo-tiny\n\nHebrew text generation model based on EleutherAI's gpt-neo. Each was trained on a TPUv3-8 which was made avilable to me via the TPU Research Cloud Program.## Datasets\n\n1. An assortment of various Hebrew corpuses - I have made it available here\n\n\n2. oscar / unshuffled_deduplicated_he - Homepage | Dataset Permalink\n\nThe Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.## Training Config\n\nAvailable here <BR>## Usage### Google Colab Notebook\n\nAvailable here <BR>#### Simple usage sample code" ]
[ -0.07775899767875671, 0.15472927689552307, -0.0011057924712076783, 0.07979419827461243, 0.09768558293581009, 0.0239894837141037, 0.21614937484264374, 0.08354481309652328, -0.020854312926530838, -0.018983513116836548, 0.04965224117040634, 0.03965429589152336, 0.021026168018579483, 0.11294323205947876, 0.03017287515103817, -0.15361826121807098, -0.051203832030296326, -0.04095705971121788, -0.0038517541252076626, 0.04900890216231346, 0.08807843178510666, -0.01872999221086502, 0.06071573868393898, -0.044079042971134186, -0.01187944132834673, -0.006759155075997114, -0.029291821643710136, -0.12752629816532135, 0.08554431051015854, 0.06793911010026932, 0.043533943593502045, 0.026930317282676697, 0.02222437597811222, -0.11786612868309021, 0.043949149549007416, 0.04927315562963486, -0.04864335432648659, 0.02577676810324192, 0.060723237693309784, -0.09011777490377426, 0.16407693922519684, -0.04542924836277962, -0.030012046918272972, 0.04394470527768135, -0.11650640517473221, -0.1502729058265686, -0.06498086452484131, 0.017312990501523018, 0.013587132096290588, 0.06884197145700455, 0.016061268746852875, 0.1939111351966858, -0.020518725737929344, 0.08580096065998077, 0.04002346098423004, -0.2839682102203369, -0.06601841747760773, 0.10219454765319824, 0.010144832544028759, 0.10327035188674927, -0.018437957391142845, 0.011142458766698837, 0.023821907117962837, 0.01656218059360981, 0.0018915570108219981, -0.06439165025949478, -0.2381783425807953, -0.03368109464645386, -0.08189733326435089, -0.013811320066452026, 0.22587326169013977, -0.05851122736930847, -0.035840217024087906, -0.026694990694522858, -0.08423055708408356, 0.008445005863904953, 0.027407024055719376, 0.06673964858055115, 0.00442094448953867, 0.016757244244217873, -0.03332766145467758, -0.12720724940299988, -0.11648113280534744, -0.0032059948425740004, -0.05543915182352066, 0.10649539530277252, 0.03227003663778305, 0.0344390906393528, 0.01242059376090765, 0.07322073727846146, 0.043933212757110596, -0.110542893409729, -0.014330398291349411, -0.04331795871257782, 0.005944165866822004, -0.0618165098130703, -0.03224823251366615, -0.06204680725932121, 0.09660907089710236, 0.1685720533132553, 0.04996061697602272, 0.02636394463479519, -0.010380906984210014, 0.01705656200647354, 0.006891284603625536, 0.10911811888217926, 0.013944861479103565, -0.09129472821950912, 0.1494869738817215, -0.025382554158568382, 0.13053198158740997, 0.017631379887461662, -0.10444875061511993, -0.010666406713426113, 0.037019263952970505, 0.0636075809597969, -0.004719519522041082, 0.10920717567205429, 0.05747910961508751, -0.05224119499325752, 0.09962038695812225, -0.13046032190322876, -0.015442858450114727, 0.03795973211526871, -0.001271347515285015, -0.12903906404972076, 0.06924359500408173, -0.0367170087993145, -0.08330753445625305, -0.09169938415288925, -0.04900660738348961, 0.03431644290685654, -0.02446397766470909, -0.0633353665471077, 0.04095970094203949, -0.10550877451896667, 0.014934996142983437, -0.2398621141910553, -0.16707412898540497, -0.0014300212496891618, 0.03652321174740791, -0.006865349132567644, 0.014949442818760872, -0.030552182346582413, 0.02944539487361908, -0.06211316958069801, -0.05978631228208542, -0.026831233873963356, -0.09860412776470184, 0.0418282151222229, -0.012266412377357483, 0.023901846259832382, -0.0831964984536171, -0.001955467974767089, -0.1501743495464325, -0.032484669238328934, -0.09450110793113708, 0.086924247443676, -0.09833309799432755, 0.07700930535793304, -0.08599339425563812, -0.028912251815199852, 0.021553773432970047, 
-0.044492803514003754, 0.0023761629126966, 0.17108489573001862, -0.20300373435020447, -0.026235612109303474, 0.15729326009750366, -0.12186328321695328, -0.12860028445720673, 0.16131864488124847, -0.03335890918970108, 0.03517916053533554, 0.11710558086633682, 0.1819065809249878, 0.1163342073559761, -0.032697565853595734, -0.10115578025579453, 0.04108338803052902, 0.047662246972322464, -0.02812652848660946, 0.06521330773830414, 0.028506239876151085, -0.006568389479070902, -0.0085177356377244, -0.02000066079199314, 0.10088326036930084, 0.019614754244685173, -0.09606985002756119, -0.008831363171339035, -0.07332252711057663, -0.0671612098813057, 0.014955281279981136, 0.07643424719572067, -0.06028325855731964, -0.03904690220952034, -0.09923706203699112, 0.05678289383649826, -0.0362023301422596, 0.054454587399959564, -0.09027954190969467, 0.12287767231464386, -0.06063587963581085, 0.01784498244524002, -0.09497816115617752, -0.08256465196609497, 0.05037600174546242, -0.023287585005164146, 0.11831492930650711, -0.08185131847858429, 0.04843873158097267, 0.06972712278366089, -0.041341669857501984, 0.018432246521115303, 0.041899461299180984, 0.012128224596381187, -0.05013745278120041, -0.08163248747587204, 0.028857318684458733, -0.029922327026724815, 0.08480732887983322, -0.09987896680831909, 0.008847696706652641, 0.021850045770406723, 0.10709279030561447, 0.025415295735001564, -0.014292282052338123, 0.036500103771686554, 0.014160645194351673, -0.06720343232154846, -0.14735768735408783, 0.039539534598588943, 0.06261129677295685, -0.12349766492843628, 0.00920137856155634, -0.10898550599813461, 0.10175851732492447, 0.13704891502857208, -0.04951462522149086, -0.06236967071890831, -0.012709512375295162, -0.03158347308635712, 0.012325488962233067, -0.004477339331060648, 0.05151728540658951, 0.2416139543056488, -0.023477686569094658, 0.12759020924568176, -0.07057169824838638, -0.010192581452429295, -0.023820294067263603, -0.059992335736751556, 0.05621393769979477, 0.066550113260746, 0.03810034692287445, -0.20855189859867096, 0.10454420000314713, 0.13395555317401886, -0.07878094166517258, 0.14695671200752258, -0.019033169373869896, 0.008271602913737297, 0.017915120348334312, 0.02547280676662922, 0.047116469591856, 0.07986608147621155, -0.10789113491773605, -0.013450139202177525, 0.015414542518556118, -0.004508883226662874, 0.0682094469666481, -0.0917103961110115, 0.02411419153213501, -0.0382266603410244, -0.04042736440896988, 0.05927440896630287, 0.024891339242458344, -0.05673234537243843, 0.06198464706540108, -0.04060479253530502, -0.001061547314748168, 0.02172612026333809, -0.01945406198501587, -0.058147281408309937, 0.17682930827140808, -0.12545759975910187, -0.14822840690612793, -0.05618656426668167, -0.0146943936124444, -0.052109066396951675, 0.08687327057123184, 0.07259409129619598, -0.021463710814714432, -0.037453293800354004, -0.0841851755976677, 0.0299882423132658, 0.09457233548164368, -0.041595447808504105, -0.09603013098239899, 0.012057386338710785, 0.0028360518626868725, -0.14274267852306366, 0.01010220404714346, 0.03869271278381348, -0.20841684937477112, 0.06320655345916748, -0.016804903745651245, -0.016079554334282875, 0.028262129053473473, 0.02116050198674202, -0.019864007830619812, -0.034709323197603226, 0.1577649563550949, -0.051758091896772385, -0.0028641337994486094, 0.11136306077241898, 0.01496708020567894, -0.015048610977828503, 0.07030775398015976, 0.003715591738000512, -0.06596209853887558, 0.01337337400764227, -0.056567754596471786, -0.07492723315954208, -0.2667434513568878, 
-0.14083997905254364, -0.05067554488778114, 0.06716133654117584, 0.03048660419881344, 0.0771610364317894, -0.01793537475168705, 0.1236465647816658, -0.09388642013072968, 0.13487353920936584, -0.009097709320485592, 0.07105183601379395, 0.14157024025917053, 0.015811001881957054, 0.004579765256494284, -0.09893229603767395, -0.06264808028936386, 0.11303512752056122, 0.14131887257099152, 0.16690844297409058, -0.030034614726901054, 0.14746439456939697, 0.027447925880551338, 0.06845667213201523, 0.013499478809535503, 0.02191903442144394, 0.057143911719322205, 0.07271423935890198, -0.043424058705568314, -0.0748705342411995, -0.07160608470439911, 0.06757870316505432, 0.005649151746183634, -0.052902139723300934, 0.08463040739297867, -0.03472777456045151, 0.12186362594366074, 0.10249760001897812, 0.015255707316100597, -0.22053377330303192, -0.07122626155614853, 0.09122410416603088, -0.023675059899687767, -0.05736096203327179, 0.07055716961622238, 0.06315934658050537, -0.09458489716053009, 0.10095946490764618, -0.00684367073699832, 0.06604108214378357, -0.1286676973104477, -0.020424513146281242, 0.022990984842181206, -0.059246402233839035, 0.020494818687438965, 0.10288608074188232, -0.21652795374393463, 0.13255639374256134, 0.05299968644976616, 0.06773664802312851, -0.06160460785031319, 0.03500929847359657, 0.08487522602081299, 0.06357581913471222, 0.113645538687706, 0.02034164033830166, -0.001270497334189713, -0.029087699949741364, -0.12948337197303772, 0.05741168558597565, 0.00011836517660412937, -0.02519371546804905, 0.03425334021449089, 0.018102403730154037, 0.013552096672356129, -0.06369272619485855, 0.015999382361769676, -0.12698572874069214, -0.1542898565530777, 0.007506270427256823, 0.03994213044643402, 0.002581164240837097, -0.0528707429766655, -0.060955747961997986, -0.09668700397014618, 0.17282120883464813, -0.0979887917637825, -0.11622445285320282, -0.06688129901885986, -0.0015787514857947826, 0.03404143452644348, -0.07809031009674072, -0.003427074523642659, 0.008178496733307838, -0.05310184508562088, -0.011218967847526073, -0.13474875688552856, 0.0502190962433815, -0.09034668654203415, -0.07756677269935608, -0.03851370885968208, 0.103073351085186, -0.02149689756333828, 0.019059505313634872, 0.011507879011332989, 0.013390216045081615, -0.029479485005140305, -0.09708765894174576, -0.04115058854222298, 0.05379648879170418, 0.08090519905090332, 0.08172234892845154, -0.047447964549064636, -0.17900170385837555, -0.02800174430012703, -0.02934345230460167, 0.09843601286411285, 0.13319271802902222, -0.015825944021344185, 0.059252623468637466, 0.12370850145816803, -0.08110832422971725, -0.21832691133022308, -0.07910589873790741, -0.010362539440393448, -0.009583181701600552, -0.02900727652013302, -0.20491327345371246, 0.08591078966856003, 0.030308429151773453, -0.033543843775987625, 0.15402214229106903, -0.1256924271583557, -0.10435810685157776, 0.15229852497577667, 0.04329778999090195, 0.0774543359875679, -0.17791438102722168, -0.06773730367422104, -0.04111012443900108, -0.006051359698176384, 0.13890105485916138, -0.1813400685787201, 0.12458840757608414, 0.02761537954211235, -0.052218832075595856, -0.006490633357316256, -0.03933653607964516, 0.07498817890882492, 0.06530440598726273, 0.03378940001130104, -0.0809454470872879, 0.00017481316172052175, 0.11196895688772202, 0.011811923235654831, 0.002686687046661973, -0.01843363046646118, 0.06568985432386398, -0.048330698162317276, -0.03166693076491356, 0.0021121827885508537, 0.14619413018226624, 0.007600162178277969, -0.08222919702529907, 
-0.0445467010140419, 0.06755061447620392, 0.026985665783286095, 0.005611858330667019, 0.16740646958351135, 0.01618381403386593, 0.1002519503235817, 0.1231490820646286, 0.15158051252365112, 0.017641304060816765, 0.07082713395357132, -0.023565277457237244, -0.02747855894267559, 0.0707281306385994, -0.05451926216483116, 0.0037645306438207626, 0.050718508660793304, -0.02427789568901062, 0.03815222159028053, 0.000633861927781254, -0.0742933452129364, 0.07864797860383987, 0.07380443066358566, -0.16409948468208313, -0.05998440086841583, -0.04811045154929161, -0.016107061877846718, 0.03317953273653984, 0.1166396215558052, 0.15549904108047485, -0.06878216564655304, -0.04576076939702034, -0.030833343043923378, -0.007828636094927788, 0.008681120350956917, 0.11444037407636642, 0.01179123017936945, -0.027098743245005608, -0.09421791881322861, 0.0922972783446312, 0.13133254647254944, -0.10524498671293259, -0.019840313121676445, 0.10487251728773117, -0.1509411782026291, -0.1033538430929184, -0.050418198108673096, 0.10625068843364716, -0.059942618012428284, -0.03112553246319294, -0.04439421370625496, -0.018897732719779015, -0.05291418731212616, 0.1393919140100479, 0.07905042916536331, 0.060503020882606506, -0.01976432092487812, 0.030224725604057312, 0.043520502746105194, 0.08118350058794022, 0.01840147376060486, 0.039765823632478714, -0.08726968616247177, 0.07941154390573502, -0.011212415993213654, 0.04776226356625557, -0.03901005536317825, 0.0005347182741388679, -0.10267170518636703, -0.04114171490073204, -0.10385829955339432, 0.10089685767889023, -0.07012572884559631, 0.061971161514520645, -0.017743827775120735, -0.017510442063212395, -0.06637846678495407, 0.022376274690032005, -0.023129690438508987, 0.008489940315485, -0.037277933210134506, 0.0997082069516182, -0.07450098544359207, -0.04169708490371704, 0.01732918620109558, -0.044892750680446625, 0.15245072543621063, 0.06735532730817795, -0.020586716011166573, 0.037822239100933075, -0.14807547628879547, -0.026392174884676933, 0.013288529589772224, 0.09343061596155167, -0.0022070794366300106, -0.05806490406394005, 0.049951620399951935, 0.03883728012442589, 0.06457097083330154, -0.0038335146382451057, 0.1061326339840889, -0.059719983488321304, -0.010026131756603718, -0.0358746312558651, 0.008193783462047577, -0.04425752907991409, -0.006818647030740976, 0.05363425239920616, 0.041919149458408356, 0.10415451228618622, -0.12201780825853348, 0.026408495381474495, -0.11627281457185745, 0.019279690459370613, 0.0037078699097037315, -0.11983772367238998, -0.18418575823307037, 0.009219186380505562, 0.06849881261587143, 0.008193534798920155, 0.24301815032958984, 0.028969410806894302, -0.002950685564428568, 0.011792916804552078, 0.05316995829343796, 0.03834761306643486, -0.022787071764469147, 0.1509493887424469, 0.026973973959684372, -0.006333749741315842, -0.06224585697054863, -0.019140401855111122, 0.02385294996201992, -0.06951476633548737, 0.04411691427230835, 0.11231683939695358, -0.027834545820951462, 0.0679449737071991, -0.02944805659353733, -0.11226852238178253, 0.008854319341480732, -0.03932878375053406, 0.027065914124250412, 0.06438100337982178, -0.09045250713825226, -0.0026718717999756336, 0.11966703832149506, -0.069267138838768, -0.005333357490599155, -0.02819991484284401, -0.02553975023329258, -0.10392298549413681, -0.1534298062324524, -0.07699641585350037, -0.0648859292268753, 0.01584862545132637, -0.07224857062101364, -0.03853593021631241, 0.06286828219890594, 0.04268019273877144, -0.03952985629439354, 0.15195488929748535, -0.06637405604124069, 
-0.06696584075689316, 0.01853153668344021, 0.0009128368110395968, 0.030426012352108955, -0.014219412580132484, -0.08488237112760544, -0.03338398411870003, -0.009548983536660671, 0.004311223514378071, 0.016385480761528015, 0.10244732350111008, 0.06975096464157104, -0.06348331272602081, -0.046579256653785706, -0.05448170751333237, 0.020316211506724358, 0.015785064548254013, 0.17113544046878815, 0.02396507002413273, -0.015425854362547398, 0.08074890077114105, 0.19378036260604858, -0.022142287343740463, -0.10348126292228699, -0.08824584633111954, 0.12159305065870285, -0.014745383523404598, -0.024976160377264023, 0.03615475818514824, -0.04177602007985115, 0.030547648668289185, 0.15119144320487976, 0.2674890458583832, -0.034228354692459106, 0.025584567338228226, 0.0073776752687990665, -0.00995378103107214, 0.01532132737338543, 0.10599812865257263, 0.009353714063763618, 0.1291366070508957, -0.026652323082089424, -0.08286064863204956, 0.037288956344127655, 0.022343484684824944, -0.13599003851413727, 0.05525726452469826, -0.06579013913869858, 0.017000358551740646, 0.016156334429979324, 0.06420544534921646, -0.10426713526248932, -0.1521473377943039, 0.007175527047365904, -0.09191009402275085, -0.10914070904254913, 0.007431648205965757, 0.07604055851697922, 0.057956527918577194, 0.03210429102182388, 0.013724989257752895, -0.00012400468403939158, 0.02407507412135601, -0.013558534905314445, -0.13945429027080536, 0.020499587059020996, 0.06316104531288147, -0.1201392337679863, 0.22866752743721008, -0.03105897456407547, 0.09655499458312988, 0.06052934378385544, 0.022388750687241554, -0.10791987925767899, 0.13256914913654327, 0.022554244846105576, -0.05898457393050194, 0.0942605659365654, -0.00267652771435678, -0.05668291822075844, 0.07548854500055313, 0.057726457715034485, 0.05154094472527504, 0.031983863562345505, 0.14048948884010315, -0.013268054462969303, -0.10472247004508972, 0.06062691658735275, -0.136432945728302, 0.13425591588020325, 0.061504513025283813, -0.04951757565140724, -0.029417719691991806, -0.05447913333773613, -0.012768332846462727, 0.08064011484384537, -0.02605941891670227, -0.015790628269314766, -0.12314842641353607, -0.052851080894470215, -0.1520140916109085, 0.06133516877889633, -0.10392912477254868, 0.028302820399403572, -0.07720997184515, -0.0023951923940330744, -0.11224570125341415, 0.08550859987735748, 0.05091574788093567, -0.004920111503452063, -0.03053535334765911, -0.01230962947010994, -0.036641549319028854, -0.0052808672189712524, -0.1315089613199234, -0.1552983671426773 ]
null
null
transformers
# hebrew-gpt_neo-xl-poetry Hebrew poetry text generation model which was fine-tuned upon [hebrew-gpt_neo-xl](https://huggingface.co/Norod78/hebrew-gpt_neo-xl). ## Datasets An assortment of various Hebrew books, magazines and poetry corpuses ## Training Config Similar to [this one](https://github.com/Norod/hebrew-gpt_neo/tree/main/hebrew-gpt_neo-xl/configs) <BR> ## Usage ### Google Colab Notebook Available [here](https://colab.research.google.com/github/Norod/hebrew-gpt_neo/blob/main/hebrew-gpt_neo-xl/Norod78_hebrew_gpt_neo_xl_Colab.ipynb) <BR> #### Simple usage sample code ```python !pip install tokenizers==0.10.3 transformers==4.8.0 from transformers import AutoTokenizer, AutoModelForCausalLM tokenizer = AutoTokenizer.from_pretrained("Norod78/hebrew-gpt_neo-xl-poetry") model = AutoModelForCausalLM.from_pretrained("Norod78/hebrew-gpt_neo-xl-poetry", pad_token_id=tokenizer.eos_token_id) prompt_text = "אני אוהב שוקולד ועוגות" max_len = 512 sample_output_num = 3 seed = 1000 import numpy as np import torch device = torch.device("cuda" if torch.cuda.is_available() else "cpu") n_gpu = 0 if torch.cuda.is_available()==False else torch.cuda.device_count() print(f"device: {device}, n_gpu: {n_gpu}") np.random.seed(seed) torch.manual_seed(seed) if n_gpu > 0: torch.cuda.manual_seed_all(seed) model.to(device) encoded_prompt = tokenizer.encode( prompt_text, add_special_tokens=False, return_tensors="pt") encoded_prompt = encoded_prompt.to(device) if encoded_prompt.size()[-1] == 0: input_ids = None else: input_ids = encoded_prompt print("input_ids = " + str(input_ids)) if input_ids != None: max_len += len(encoded_prompt[0]) if max_len > 2048: max_len = 2048 print("Updated max_len = " + str(max_len)) stop_token = "<|endoftext|>" new_lines = "\n\n\n" sample_outputs = model.generate( input_ids, do_sample=True, max_length=max_len, top_k=50, top_p=0.95, num_return_sequences=sample_output_num ) print(100 * '-' + "\n\t\tOutput\n" + 100 * '-') for i, sample_output in enumerate(sample_outputs): text = tokenizer.decode(sample_output, skip_special_tokens=True) # Remove all text after the stop token text = text[: text.find(stop_token) if stop_token else None] # Remove all text after 3 newlines text = text[: text.find(new_lines) if new_lines else None] print("\n{}: {}".format(i, text)) print("\n" + 100 * '-') ```
{"language": "he", "license": "mit", "thumbnail": "https://avatars1.githubusercontent.com/u/3617152?norod.jpg", "widget": [{"text": "\u05e2\u05d5\u05d3 \u05d1\u05d9\u05de\u05d9 \u05e7\u05d3\u05dd"}, {"text": "\u05ea\u05e8\u05d9\u05e1\u05e8 \u05de\u05db\u05e9\u05e4\u05d5\u05ea \u05e1\u05d2"}, {"text": "\n\n\u05d4\u05d0\u05d9\u05e9 \u05d4\u05d0\u05d7\u05e8\u05d5\u05df \u05d1\u05e2\u05d5\u05dc\u05dd /"}, {"text": "\u05e4\u05e2\u05dd \u05d0\u05d7\u05ea, \u05dc\u05e4\u05e0\u05d9 \u05e9\u05e0\u05d9\u05dd \u05e8\u05d1\u05d5\u05ea"}, {"text": "\u05d4\u05e8\u05de\u05d9\u05d5\u05e0\u05d9 \u05d4\u05e1\u05ea\u05d9\u05e8\u05d4 \u05d0\u05ea"}, {"text": "\u05dc\u05e4\u05ea\u05e2, \u05d0\u05d5\u05e8 \u05d9\u05e8\u05d5\u05e7"}]}
text-generation
Norod78/hebrew-gpt_neo-xl-poetry
[ "transformers", "pytorch", "jax", "safetensors", "gpt_neo", "text-generation", "he", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "he" ]
TAGS #transformers #pytorch #jax #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #region-us
# hebrew-gpt_neo-xl-poetry Hebrew poetry text generation model which was fine-tuned upon hebrew-gpt_neo-xl. ## Datasets An assortment of various Hebrew books, magazines and poetry corpuses ## Training Config Similar to this one <BR> ## Usage ### Google Colab Notebook Available here <BR> #### Simple usage sample code
[ "# hebrew-gpt_neo-xl-poetry\n\nHebrew poetry text generation model which was fine tuned upon on hebrew-gpt_neo-xl.", "## Datasets\n\nAn assortment of various Hebrew books, magazines and poetry corpuses", "## Training Config\n\nSimilar to this one <BR>", "## Usage", "### Google Colab Notebook\n\nAvailable here <BR>", "#### Simple usage sample code" ]
[ "TAGS\n#transformers #pytorch #jax #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #region-us \n", "# hebrew-gpt_neo-xl-poetry\n\nHebrew poetry text generation model which was fine tuned upon on hebrew-gpt_neo-xl.", "## Datasets\n\nAn assortment of various Hebrew books, magazines and poetry corpuses", "## Training Config\n\nSimilar to this one <BR>", "## Usage", "### Google Colab Notebook\n\nAvailable here <BR>", "#### Simple usage sample code" ]
[ 54, 38, 21, 11, 3, 11, 6 ]
[ "passage: TAGS\n#transformers #pytorch #jax #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# hebrew-gpt_neo-xl-poetry\n\nHebrew poetry text generation model which was fine tuned upon on hebrew-gpt_neo-xl.## Datasets\n\nAn assortment of various Hebrew books, magazines and poetry corpuses## Training Config\n\nSimilar to this one <BR>## Usage### Google Colab Notebook\n\nAvailable here <BR>#### Simple usage sample code" ]
[ -0.040305495262145996, 0.15011757612228394, -0.0026194315869361162, 0.08841387182474136, 0.09604416787624359, 0.04048159345984459, 0.1952190399169922, 0.07942869514226913, -0.034052252769470215, -0.04189750552177429, 0.15734867751598358, 0.023869171738624573, -0.025269119068980217, 0.034441132098436356, 0.02454509772360325, -0.20927464962005615, -0.0655292347073555, 0.009896788746118546, 0.043567150831222534, 0.11646594852209091, 0.08330173790454865, 0.01911439746618271, 0.07203742861747742, -0.018082505092024803, 0.0194837786257267, -0.04262348264455795, 0.009945545345544815, -0.07964149862527847, 0.10749366879463196, 0.05652407929301262, 0.006119356956332922, 0.04371756315231323, -0.007027692627161741, -0.19409990310668945, 0.03590185195207596, -0.01925640180706978, -0.04166090115904808, -0.0258188433945179, 0.023537226021289825, -0.2414456456899643, 0.19757190346717834, -0.035318486392498016, -0.07936147600412369, 0.023795640096068382, -0.16742753982543945, -0.15423370897769928, -0.0531216636300087, 0.017347952350974083, 0.025143060833215714, 0.15810184180736542, -0.00812614243477583, 0.10129854828119278, 0.010309731587767601, 0.07812090963125229, 0.13795533776283264, -0.3279377222061157, -0.06246255338191986, 0.09465912729501724, 0.027511456981301308, 0.09997428953647614, -0.11151261627674103, 0.07759594172239304, 0.043858397752046585, 0.052030228078365326, -0.015369744040071964, -0.08873254060745239, -0.1016150638461113, 0.08572543412446976, -0.11388091742992401, -0.03452356904745102, 0.22806546092033386, -0.10322196781635284, 0.0020607721526175737, -0.027031147852540016, -0.11651834100484848, -0.015220368281006813, -0.0483844168484211, 0.019084567204117775, -0.049587447196245193, 0.02427712269127369, 0.0007915603346191347, -0.13812150061130524, -0.12458900362253189, -0.022466806694865227, -0.034266721457242966, 0.0033614872954785824, 0.005604700185358524, -0.0069733778946101665, 0.006115064024925232, 0.09034804999828339, 0.03630729764699936, -0.09075531363487244, -0.018767761066555977, -0.057640768587589264, 0.16591578722000122, 0.0033571957610547543, -0.0585426464676857, -0.07908991724252701, 0.061070989817380905, 0.17615215480327606, 0.12530098855495453, 0.026004277169704437, 0.026121240109205246, 0.01118821557611227, -0.012980171479284763, 0.07755755633115768, 0.16527189314365387, -0.10039909929037094, 0.07312365621328354, 0.045789189636707306, 0.12612561881542206, 0.017645616084337234, -0.13097770512104034, -0.05747302249073982, 0.06300675123929977, 0.02941366285085678, -0.05670303851366043, 0.13605427742004395, 0.06615661829710007, 0.010783211328089237, 0.05442221462726593, -0.0907769426703453, -0.03298651799559593, 0.02162119559943676, 0.03760763630270958, -0.1284761130809784, 0.050136685371398926, -0.028851760551333427, -0.09374137222766876, -0.029422476887702942, -0.08451865613460541, 0.014660329557955265, 0.004489954095333815, -0.04242440313100815, -0.0038892945740371943, -0.02685379423201084, 0.049943216145038605, -0.216142937541008, -0.2334173619747162, -0.005590231157839298, 0.05356607958674431, -0.009698730893433094, -0.0022658295929431915, -0.11794775724411011, -0.022869082167744637, 0.001972462981939316, -0.06215021014213562, -0.03436420112848282, -0.10604556649923325, 0.10272805392742157, -0.0045925104059278965, 0.020232999697327614, -0.06027738004922867, 0.008000842295587063, -0.191579669713974, -0.022746311500668526, -0.020940929651260376, 0.08692523837089539, -0.057981375604867935, -0.03925640881061554, -0.059276655316352844, -0.016467370092868805, 
0.02098064497113228, -0.00898043718189001, -0.03825557231903076, 0.2605530023574829, -0.20026908814907074, -0.13203182816505432, 0.1682637482881546, -0.09242924302816391, -0.13782860338687897, 0.13464133441448212, -0.043348897248506546, 0.06160866469144821, 0.14268144965171814, 0.25441646575927734, 0.07843281328678131, 0.0213607307523489, -0.11224507540464401, 0.04854099825024605, 0.107872873544693, 0.03232772648334503, 0.011160234920680523, 0.04829232394695282, 0.0000690714514348656, 0.03152677044272423, -0.059417933225631714, 0.060270797461271286, 0.040196456015110016, -0.1225830614566803, -0.02698725461959839, 0.014455138705670834, 0.09290428459644318, 0.014886073768138885, 0.05075250566005707, -0.07688601315021515, -0.09146132320165634, -0.09902498126029968, 0.011504760943353176, -0.02404842898249626, 0.0950932428240776, -0.03825011104345322, 0.12030961364507675, 0.02335871197283268, 0.04383501410484314, -0.05437832698225975, -0.02169506810605526, -0.03564903885126114, 0.05993124470114708, 0.08245871216058731, -0.10879427939653397, 0.09082849323749542, 0.014583650976419449, -0.06940077990293503, 0.06976354122161865, 0.04155031964182854, -0.03878731280565262, -0.06201137602329254, -0.0785563737154007, 0.08444869518280029, 0.004547202493995428, 0.12816713750362396, -0.19511426985263824, 0.004741509910672903, 0.021170422434806824, 0.04559432342648506, -0.030901171267032623, 0.027638470754027367, 0.011889222078025341, 0.004571996629238129, -0.07375465333461761, -0.07016532868146896, 0.0566568486392498, 0.04521418735384941, -0.1286262571811676, 0.07677949965000153, -0.13869524002075195, 0.06852728128433228, 0.1946984827518463, -0.04708411172032356, -0.04682593420147896, -0.046441175043582916, -0.032867588102817535, 0.0028863977640867233, 0.08828698843717575, 0.07864246517419815, 0.23793962597846985, -0.043801240622997284, 0.12565301358699799, -0.0693599283695221, 0.013023319654166698, 0.003055380191653967, -0.07613096386194229, 0.033573586493730545, 0.14188659191131592, 0.04157190024852753, -0.10384700447320938, 0.16738998889923096, 0.23205257952213287, -0.08149126917123795, 0.2539274990558624, -0.0156475268304348, 0.024173058569431305, 0.026220770552754402, -0.02378944680094719, -0.013901419937610626, 0.08625161647796631, -0.15070170164108276, -0.02600623108446598, 0.013718494214117527, -0.01714785397052765, 0.0951848104596138, -0.11823031306266785, -0.04902241379022598, -0.04997743293642998, -0.05671196058392525, 0.076750747859478, 0.10024619102478027, -0.044863130897283554, 0.025396494194865227, -0.029003039002418518, 0.026237886399030685, 0.04434306547045708, 0.02699223905801773, -0.09681826084852219, 0.19250227510929108, -0.14092284440994263, -0.13677717745304108, -0.02880275808274746, -0.06963730603456497, -0.029672065749764442, 0.04850000515580177, 0.08325263857841492, -0.06975384056568146, -0.01587461121380329, -0.06004792079329491, 0.16672994196414948, -0.0038615295197814703, -0.05917317047715187, -0.1884552389383316, -0.01762368343770504, -0.10741967707872391, -0.11355765163898468, -0.02158406563103199, 0.024906039237976074, -0.17227408289909363, 0.04248025640845299, -0.0904199481010437, -0.0051842378452420235, 0.058203525841236115, 0.08087386935949326, -0.018515199422836304, -0.03517412394285202, 0.25719478726387024, -0.0709051713347435, -0.0009689488215371966, 0.07885558158159256, 0.03664132207632065, 0.0032391021959483624, 0.08752502501010895, -0.007932686246931553, -0.02662205509841442, 0.016318121924996376, -0.026975488290190697, -0.07800561189651489, -0.2166656255722046, 
-0.14449475705623627, -0.057810619473457336, 0.13954651355743408, 0.010023378767073154, 0.09115548431873322, -0.011414222419261932, 0.13067203760147095, -0.10681134462356567, 0.10871575772762299, -0.033292584121227264, 0.06534231454133987, 0.31847304105758667, 0.03270399942994118, 0.021234555169939995, -0.07001105695962906, -0.08203788101673126, 0.04707874357700348, 0.07486925274133682, 0.1397685408592224, -0.005557549651712179, 0.17869684100151062, 0.05429799482226372, 0.14977437257766724, 0.06338031589984894, -0.09255722165107727, 0.04366234317421913, 0.03386450558900833, -0.032618358731269836, -0.09114536643028259, -0.07797051966190338, 0.1105751171708107, -0.0830414742231369, -0.04601709544658661, 0.056442469358444214, -0.11326640099287033, 0.09569226950407028, -0.022719191387295723, 0.051942128688097, -0.22769473493099213, -0.07860571146011353, 0.09836526215076447, 0.0021036649122834206, -0.09114325046539307, 0.10802566260099411, -0.05473486706614494, -0.14015904068946838, 0.10032153874635696, -0.07725819945335388, 0.0654197409749031, -0.06743615120649338, 0.029656527563929558, -0.025147810578346252, -0.13332246243953705, 0.0037334775552153587, 0.09828277677297592, -0.2478790581226349, 0.22700001299381256, 0.0746745690703392, 0.07325423508882523, -0.0077695599757134914, -0.015512003563344479, 0.08475902676582336, 0.08534412086009979, 0.14215195178985596, 0.008634996600449085, 0.024602483958005905, 0.02202913723886013, -0.046309176832437515, 0.0697285458445549, 0.065302275121212, -0.08830725401639938, 0.061784494668245316, -0.020129594951868057, -0.0047084069810807705, -0.0698607936501503, 0.09255479276180267, -0.07886455208063126, -0.2425519824028015, -0.013012719340622425, 0.058120351284742355, 0.051814112812280655, -0.03338700532913208, -0.05821547657251358, -0.0006643052911385894, 0.10794761776924133, -0.08551096171140671, -0.15262705087661743, -0.018909601494669914, -0.052433401346206665, 0.007159920409321785, -0.06427258998155594, 0.02418314293026924, -0.01994396187365055, -0.06988302618265152, -0.09198296815156937, -0.15377464890480042, 0.022179512307047844, -0.02874016761779785, -0.05730628967285156, -0.034541741013526917, 0.24264076352119446, 0.011150805279612541, 0.027712399140000343, 0.0012137595331296325, 0.08187757432460785, -0.10387203097343445, -0.08729806542396545, -0.01763479970395565, -0.07843939960002899, 0.045696888118982315, -0.045315954834222794, -0.032616205513477325, -0.09768951684236526, -0.08291748911142349, -0.08800478279590607, 0.11198360472917557, 0.1124815121293068, -0.007019996643066406, 0.11154650151729584, 0.10927306115627289, -0.09557498246431351, -0.2546056807041168, -0.12634249031543732, -0.03153276816010475, 0.0037925918586552143, -0.03366325423121452, -0.22205238044261932, 0.03133109211921692, -0.08320485800504684, 0.017834670841693878, 0.18342924118041992, -0.22773587703704834, -0.09075859189033508, 0.1586206555366516, 0.003654118860140443, 0.1396813541650772, -0.17731298506259918, -0.10012988746166229, -0.014018159359693527, -0.12548984587192535, 0.1378210335969925, -0.057771842926740646, 0.14392870664596558, -0.020264359191060066, 0.09853653609752655, -0.03668490797281265, 0.0098938699811697, 0.092655710875988, 0.04302586242556572, 0.035184625536203384, -0.08426688611507416, -0.08361699432134628, 0.1046547070145607, 0.06897260993719101, -0.06837086379528046, -0.15348467230796814, 0.06195198744535446, -0.04728415608406067, -0.04751228168606758, -0.06395966559648514, 0.10097206383943558, -0.03758860006928444, -0.1416895091533661, 
-0.06024105101823807, 0.07096748054027557, -0.0616009384393692, -0.026128621771931648, 0.12118484824895859, -0.006917131133377552, 0.1759941726922989, 0.054884783923625946, 0.11368019133806229, 0.006052060052752495, -0.03969410061836243, -0.040051426738500595, -0.020617039874196053, 0.09482789039611816, -0.14661185443401337, -0.06622328609228134, 0.09601777046918869, 0.01536867581307888, 0.01518157683312893, 0.021013958379626274, -0.06240115314722061, 0.03549456596374512, 0.058850280940532684, -0.2696465849876404, -0.187956765294075, -0.1075996682047844, -0.13445107638835907, 0.01745240017771721, 0.05485907942056656, 0.18609429895877838, -0.10438640415668488, -0.05724915862083435, 0.0008866536663845181, -0.050740085542201996, 0.03547735884785652, 0.05411515384912491, -0.022020263597369194, -0.002856417093425989, -0.08380740880966187, -0.024416053667664528, 0.10464618355035782, -0.12640361487865448, 0.006352312862873077, 0.17747198045253754, -0.13664038479328156, -0.1039319559931755, -0.11097320914268494, 0.14964374899864197, -0.07335072010755539, 0.06082288920879364, -0.04383239150047302, -0.06605355441570282, -0.025106150656938553, 0.16766484081745148, 0.0917278304696083, 0.07924456149339676, -0.039993155747652054, 0.0001865553786046803, 0.06417796015739441, 0.07765720039606094, 0.05045190453529358, -0.020494641736149788, -0.0421481728553772, 0.029693445190787315, -0.05283466354012489, 0.0817050039768219, -0.03878101333975792, -0.01355821918696165, -0.07360846549272537, 0.002832416445016861, -0.11914722621440887, 0.05353275313973427, -0.09405334293842316, 0.0026752857957035303, -0.07600745558738708, 0.019782191142439842, -0.1406630575656891, -0.06202709302306175, -0.041029829531908035, -0.02157912217080593, -0.008497805334627628, 0.11077575385570526, -0.057094231247901917, 0.005795764736831188, 0.04259394481778145, -0.016907034441828728, 0.0625525638461113, 0.11770994961261749, 0.016843367367982864, 0.1352814882993698, -0.15659920871257782, 0.017374549061059952, -0.011548093520104885, 0.058798447251319885, -0.018745597451925278, 0.0434027723968029, -0.00793722365051508, 0.02953415922820568, 0.09900890290737152, 0.059005361050367355, 0.03981104493141174, -0.07223385572433472, -0.0013570510782301426, -0.0006105183274485171, 0.02856791764497757, -0.04896276071667671, 0.011381690390408039, 0.004115274176001549, 0.0375371091067791, 0.14947935938835144, -0.17242403328418732, 0.03390257805585861, -0.13529834151268005, 0.041766904294490814, -0.01340023335069418, -0.1514940857887268, -0.21113453805446625, -0.025061681866645813, 0.019175410270690918, -0.013082028366625309, 0.2191442847251892, 0.019587557762861252, -0.06794027984142303, 0.05285073071718216, 0.11959576606750488, 0.08182436227798462, -0.0640849620103836, 0.14753521978855133, 0.13564810156822205, -0.06484335660934448, -0.08236826211214066, 0.02286536805331707, -0.0012841317802667618, 0.02569449692964554, -0.003226167056709528, 0.10206606239080429, 0.10201778262853622, 0.11998383700847626, -0.0973626971244812, -0.05217873305082321, 0.03679965063929558, 0.023154448717832565, 0.022896697744727135, 0.10399667173624039, -0.07063297182321548, 0.15236172080039978, 0.1992778778076172, -0.036001432687044144, 0.009666326455771923, -0.060603294521570206, -0.05209265276789665, -0.12116659432649612, -0.19112363457679749, -0.06563039124011993, -0.10092772543430328, 0.028189700096845627, -0.024230897426605225, -0.0382956936955452, 0.07471189647912979, 0.057096317410469055, -0.06268389523029327, 0.13046373426914215, 0.12580662965774536, 
-0.15177373588085175, 0.08185867220163345, -0.03303677588701248, 0.04452667385339737, -0.030596580356359482, -0.03834870085120201, -0.04706871509552002, -0.11996344476938248, -0.05466217175126076, 0.030116351321339607, 0.04847690463066101, 0.05719174072146416, -0.10163738578557968, -0.05908606946468353, -0.05733002349734306, 0.07264983654022217, 0.050648145377635956, 0.12995417416095734, -0.011069552972912788, -0.035326384007930756, 0.09918844699859619, 0.2715729773044586, 0.04134461656212807, -0.11914288997650146, -0.014297127723693848, 0.11531314998865128, -0.005852298811078072, 0.011652407236397266, 0.0053597791120409966, 0.011255412362515926, 0.03414271026849747, 0.18945559859275818, 0.35342130064964294, -0.04178236797451973, 0.04606806859374046, 0.018242591992020607, 0.006514448206871748, 0.06345246732234955, 0.046865034848451614, 0.01730228029191494, 0.2282760888338089, -0.10688293725252151, -0.13362634181976318, -0.05178424343466759, -0.02937459945678711, -0.0934564545750618, 0.17525552213191986, -0.011592493392527103, 0.021220363676548004, 0.0066895452328026295, 0.07018594443798065, -0.16710874438285828, -0.016217771917581558, -0.12443184852600098, -0.20876134932041168, -0.10292302817106247, 0.0007856996962800622, 0.10031546652317047, 0.03148062154650688, 0.06125057116150856, 0.025651704519987106, -0.011044353246688843, -0.024296067655086517, -0.015844816341996193, -0.1894374042749405, 0.0354560986161232, 0.1062731072306633, -0.06534962356090546, 0.14300371706485748, -0.03354981914162636, -0.004688120912760496, 0.04550174996256828, 0.06395187973976135, 0.02505682036280632, 0.13045915961265564, 0.08219996094703674, 0.00028987685800530016, 0.07271472364664078, -0.009707126766443253, 0.005267556291073561, 0.03134264796972275, 0.07367294281721115, 0.07665224373340607, 0.132367342710495, 0.18500852584838867, -0.10185213387012482, -0.09258513152599335, 0.08653822541236877, -0.13160353899002075, 0.06549770385026932, 0.07649680227041245, -0.01790485717356205, -0.023154079914093018, -0.03079642914235592, -0.0225705374032259, 0.03393327072262764, 0.012172183953225613, 0.042957693338394165, -0.07208116352558136, -0.05957233905792236, -0.04703763127326965, -0.036166977137327194, -0.26451376080513, -0.02347153052687645, -0.06923742592334747, 0.05772557109594345, -0.12284554541110992, 0.11737729609012604, 0.030007928609848022, -0.027926305308938026, 0.017441440373659134, -0.019913537427783012, 0.022043688222765923, -0.02846355177462101, -0.09667077660560608, -0.15763424336910248 ]
null
null
transformers
# hebrew-gpt_neo-xl Hebrew text generation model based on [EleutherAI's gpt-neo](https://github.com/EleutherAI/gpt-neo). Each was trained on a TPUv3-8 which was made available to me via the [TPU Research Cloud](https://sites.research.google/trc/) Program. ## Datasets 1. An assortment of various Hebrew corpuses - I have made it available [here](https://mega.nz/folder/CodSSA4R#4INvMes-56m_WUi7jQMbJQ) 2. oscar / unshuffled_deduplicated_he - [Homepage](https://oscar-corpus.com) | [Dataset Permalink](https://huggingface.co/datasets/viewer/?dataset=oscar&config=unshuffled_deduplicated_he) The Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture. 3. CC100-Hebrew Dataset [Homepage](https://metatext.io/datasets/cc100-hebrew) Created by Conneau & Wenzek et al. in 2020, the CC100-Hebrew dataset is one of the 100 corpora of monolingual data that was processed from the January-December 2018 Commoncrawl snapshots from the CC-Net repository. The size of this corpus is 6.1G, in the Hebrew language. ## Training Config Available [here](https://github.com/Norod/hebrew-gpt_neo/tree/main/hebrew-gpt_neo-xl/configs) <BR> ## Usage ### Google Colab Notebook Available [here](https://colab.research.google.com/github/Norod/hebrew-gpt_neo/blob/main/hebrew-gpt_neo-xl/Norod78_hebrew_gpt_neo_xl_Colab.ipynb) <BR> #### Simple usage sample code ```python !pip install tokenizers==0.10.3 transformers==4.8.0 from transformers import AutoTokenizer, AutoModelForCausalLM tokenizer = AutoTokenizer.from_pretrained("Norod78/hebrew-gpt_neo-xl") model = AutoModelForCausalLM.from_pretrained("Norod78/hebrew-gpt_neo-xl", pad_token_id=tokenizer.eos_token_id) prompt_text = "אני אוהב שוקולד ועוגות" max_len = 512 sample_output_num = 3 seed = 1000 import numpy as np import torch device = torch.device("cuda" if torch.cuda.is_available() else "cpu") n_gpu = 0 if torch.cuda.is_available()==False else torch.cuda.device_count() print(f"device: {device}, n_gpu: {n_gpu}") np.random.seed(seed) torch.manual_seed(seed) if n_gpu > 0: torch.cuda.manual_seed_all(seed) model.to(device) encoded_prompt = tokenizer.encode( prompt_text, add_special_tokens=False, return_tensors="pt") encoded_prompt = encoded_prompt.to(device) if encoded_prompt.size()[-1] == 0: input_ids = None else: input_ids = encoded_prompt print("input_ids = " + str(input_ids)) if input_ids != None: max_len += len(encoded_prompt[0]) if max_len > 2048: max_len = 2048 print("Updated max_len = " + str(max_len)) stop_token = "<|endoftext|>" new_lines = "\n\n\n" sample_outputs = model.generate( input_ids, do_sample=True, max_length=max_len, top_k=50, top_p=0.95, num_return_sequences=sample_output_num ) print(100 * '-' + "\n\t\tOutput\n" + 100 * '-') for i, sample_output in enumerate(sample_outputs): text = tokenizer.decode(sample_output, skip_special_tokens=True) # Remove all text after the stop token text = text[: text.find(stop_token) if stop_token else None] # Remove all text after 3 newlines text = text[: text.find(new_lines) if new_lines else None] print("\n{}: {}".format(i, text)) print("\n" + 100 * '-') ```
{"language": "he", "license": "mit", "thumbnail": "https://avatars1.githubusercontent.com/u/3617152?norod.jpg", "widget": [{"text": "\u05e2\u05d5\u05d3 \u05d1\u05d9\u05de\u05d9 \u05e7\u05d3\u05dd"}, {"text": "\u05e7\u05d5\u05e8\u05d0\u05d9\u05dd \u05dc\u05d9 \u05d3\u05d5\u05e8\u05d5\u05df \u05d5\u05d0\u05e0\u05d9 \u05de\u05e2\u05d5\u05e0\u05d9\u05d9\u05df \u05dc"}, {"text": "\u05e7\u05d5\u05e8\u05d0\u05d9\u05dd \u05dc\u05d9 \u05d0\u05d9\u05e6\u05d9\u05e7 \u05d5\u05d0\u05e0\u05d9 \u05d7\u05d5\u05e9\u05d1 \u05e9"}, {"text": "\u05d4\u05d7\u05ea\u05d5\u05dc \u05e9\u05dc\u05da \u05de\u05d0\u05d5\u05d3 \u05d7\u05de\u05d5\u05d3 \u05d5"}, {"text": "\u05d5\u05d1\u05d3\u05e8\u05da \u05e8\u05d0\u05d9\u05e0\u05d5 \u05e9\u05d4\u05d2\u05df"}]}
text-generation
Norod78/hebrew-gpt_neo-xl
[ "transformers", "pytorch", "jax", "onnx", "safetensors", "gpt_neo", "text-generation", "he", "license:mit", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "he" ]
TAGS #transformers #pytorch #jax #onnx #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us
# hebrew-gpt_neo-xl Hebrew text generation model based on EleutherAI's gpt-neo. Each was trained on a TPUv3-8 which was made available to me via the TPU Research Cloud Program. ## Datasets 1. An assortment of various Hebrew corpuses - I have made it available here 2. oscar / unshuffled_deduplicated_he - Homepage | Dataset Permalink The Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture. 3. CC100-Hebrew Dataset Homepage Created by Conneau & Wenzek et al. in 2020, the CC100-Hebrew dataset is one of the 100 corpora of monolingual data that was processed from the January-December 2018 Commoncrawl snapshots from the CC-Net repository. The size of this corpus is 6.1G, in the Hebrew language. ## Training Config Available here <BR> ## Usage ### Google Colab Notebook Available here <BR> #### Simple usage sample code
[ "# hebrew-gpt_neo-xl\n\nHebrew text generation model based on EleutherAI's gpt-neo. Each was trained on a TPUv3-8 which was made avilable to me via the TPU Research Cloud Program.", "## Datasets\n\n1. An assortment of various Hebrew corpuses - I have made it available here\n\n\n2. oscar / unshuffled_deduplicated_he - Homepage | Dataset Permalink\n\nThe Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.\n\n3. CC100-Hebrew Dataset Homepage \n\nCreated by Conneau & Wenzek et al. at 2020, the CC100-Hebrew This dataset is one of the 100 corpora of monolingual data that was processed from the January-December 2018 Commoncrawl snapshots from the CC-Net repository. The size of this corpus is 6.1G., in Hebrew language.", "## Training Config\n\nAvailable here <BR>", "## Usage", "### Google Colab Notebook\n\nAvailable here <BR>", "#### Simple usage sample code" ]
[ "TAGS\n#transformers #pytorch #jax #onnx #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "# hebrew-gpt_neo-xl\n\nHebrew text generation model based on EleutherAI's gpt-neo. Each was trained on a TPUv3-8 which was made avilable to me via the TPU Research Cloud Program.", "## Datasets\n\n1. An assortment of various Hebrew corpuses - I have made it available here\n\n\n2. oscar / unshuffled_deduplicated_he - Homepage | Dataset Permalink\n\nThe Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.\n\n3. CC100-Hebrew Dataset Homepage \n\nCreated by Conneau & Wenzek et al. at 2020, the CC100-Hebrew This dataset is one of the 100 corpora of monolingual data that was processed from the January-December 2018 Commoncrawl snapshots from the CC-Net repository. The size of this corpus is 6.1G., in Hebrew language.", "## Training Config\n\nAvailable here <BR>", "## Usage", "### Google Colab Notebook\n\nAvailable here <BR>", "#### Simple usage sample code" ]
[ 62, 54, 175, 9, 3, 11, 6 ]
[ "passage: TAGS\n#transformers #pytorch #jax #onnx #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #has_space #region-us \n# hebrew-gpt_neo-xl\n\nHebrew text generation model based on EleutherAI's gpt-neo. Each was trained on a TPUv3-8 which was made avilable to me via the TPU Research Cloud Program.## Datasets\n\n1. An assortment of various Hebrew corpuses - I have made it available here\n\n\n2. oscar / unshuffled_deduplicated_he - Homepage | Dataset Permalink\n\nThe Open Super-large Crawled ALMAnaCH coRpus is a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.\n\n3. CC100-Hebrew Dataset Homepage \n\nCreated by Conneau & Wenzek et al. at 2020, the CC100-Hebrew This dataset is one of the 100 corpora of monolingual data that was processed from the January-December 2018 Commoncrawl snapshots from the CC-Net repository. The size of this corpus is 6.1G., in Hebrew language.## Training Config\n\nAvailable here <BR>## Usage### Google Colab Notebook\n\nAvailable here <BR>#### Simple usage sample code" ]
[ -0.08414071053266525, 0.2443516105413437, -0.0021477791015058756, 0.040175534784793854, 0.003136132378131151, 0.029096197336912155, 0.23362542688846588, 0.08832959830760956, -0.044416364282369614, 0.05854024365544319, 0.035963382571935654, -0.05113283917307854, 0.10012456774711609, 0.04492336884140968, 0.060796286910772324, -0.12083142250776291, -0.03483930230140686, -0.027214378118515015, -0.05627516284584999, 0.04040922969579697, 0.06442916393280029, -0.048869453370571136, 0.07836207747459412, -0.0573456846177578, -0.07288870215415955, 0.012678908184170723, -0.05809757485985756, -0.06033902242779732, 0.053667716681957245, 0.07067406922578812, 0.08394568413496017, -0.00501073757186532, 0.03940090164542198, -0.1541011482477188, 0.017303433269262314, 0.0690867081284523, -0.04058478772640228, 0.042044561356306076, 0.09198503196239471, -0.05217054858803749, 0.21824482083320618, -0.16768062114715576, -0.07084331661462784, 0.05552691966295242, -0.08942946046590805, -0.10466628521680832, -0.14109347760677338, 0.059815846383571625, -0.006980644538998604, 0.07762221246957779, -0.00034243022673763335, 0.10898177325725555, -0.00847272016108036, 0.0534665621817112, 0.040192797780036926, -0.1696823537349701, -0.0688730925321579, 0.1472306251525879, -0.05742860585451126, 0.13990916311740875, -0.08380862325429916, -0.03157961741089821, 0.040527988225221634, 0.0178342517465353, -0.016880758106708527, -0.04750450700521469, -0.09830187261104584, -0.038507092744112015, -0.07713095843791962, -0.06141948699951172, 0.16333913803100586, -0.025977853685617447, 0.011842898093163967, -0.07899046689271927, -0.07798801362514496, 0.0017270775279030204, 0.0695054903626442, 0.07842811942100525, 0.004948554094880819, -0.015335150063037872, -0.03909362107515335, -0.13309884071350098, -0.120609350502491, 0.038724932819604874, -0.01621881127357483, 0.062299687415361404, 0.011526454240083694, 0.060923732817173004, 0.02361566759645939, 0.061980050057172775, 0.0733133926987648, -0.13946208357810974, -0.009889403358101845, -0.03895779699087143, -0.016322901472449303, -0.038474295288324356, -0.0010521977674216032, -0.08723331242799759, 0.03377695754170418, 0.10735306888818741, -0.04096827283501625, 0.006103675812482834, -0.012808538042008877, -0.007805493660271168, 0.003655313979834318, 0.14210812747478485, -0.004571361932903528, -0.11722810566425323, 0.050000354647636414, -0.0224123764783144, 0.06980680674314499, 0.03007359802722931, -0.07375513017177582, -0.01969262771308422, -0.01217567641288042, 0.030889049172401428, -0.00675757322460413, 0.05096225440502167, 0.092974953353405, -0.05378781631588936, 0.1443934589624405, -0.13353599607944489, -0.017502201721072197, 0.05122143030166626, -0.01723848469555378, -0.0007919210474938154, 0.009351196698844433, -0.04535957798361778, -0.10281466692686081, -0.021599056199193, -0.03482385352253914, 0.001697263796813786, -0.04372107610106468, -0.08762739598751068, 0.053845275193452835, -0.10645880550146103, -0.017221977934241295, -0.15381461381912231, -0.10222988575696945, -0.041548676788806915, 0.030154749751091003, 0.0035718795843422413, 0.045784998685121536, -0.07870806008577347, -0.002496709581464529, -0.01566920056939125, -0.0211914274841547, -0.019164754077792168, -0.0431862473487854, 0.03928491845726967, -0.08168434351682663, 0.02018405683338642, -0.07043499499559402, 0.013851706869900227, -0.13176944851875305, -0.009674389846622944, -0.05147993564605713, 0.07588306814432144, -0.07836836576461792, 0.03750672936439514, -0.11385437101125717, -0.02896784618496895, 
0.009628672152757645, -0.046963710337877274, 0.006657127290964127, 0.18813163042068481, -0.225515216588974, -0.06291142851114273, 0.20565935969352722, -0.1125330775976181, -0.07143250107765198, 0.0866355448961258, -0.023127885535359383, 0.07761789113283157, 0.10108431428670883, 0.16860869526863098, 0.1287553906440735, -0.0471954420208931, -0.1677357405424118, 0.014111065305769444, 0.05976608023047447, 0.023016467690467834, 0.03579586744308472, -0.04537398740649223, 0.1063886508345604, -0.012750527821481228, -0.0032621771097183228, 0.043097540736198425, 0.033220406621694565, -0.08792974054813385, 0.020851869136095047, -0.04885726049542427, -0.08680222183465958, 0.018952570855617523, 0.023226214572787285, -0.016425779089331627, -0.04670727625489235, -0.08663991838693619, 0.08451665192842484, -0.0740283653140068, 0.06226151064038277, -0.017670907080173492, 0.0059399958699941635, -0.06076068431138992, 0.012802238576114178, -0.07203381508588791, -0.07733403891324997, 0.04837068170309067, -0.07296236604452133, 0.10482989251613617, -0.032532356679439545, 0.08940152823925018, 0.052661169320344925, -0.044224243611097336, -0.0236225426197052, 0.01764453947544098, -0.021587861701846123, -0.058326974511146545, -0.08673621714115143, -0.02918124385178089, -0.010107835754752159, 0.1223076805472374, -0.11731500178575516, -0.008951512165367603, 0.041283175349235535, 0.11675748974084854, 0.03412442281842232, -0.04684362933039665, 0.017466308549046516, -0.008123367093503475, -0.04093329235911369, -0.13380609452724457, -0.01802959106862545, 0.02202831581234932, -0.09666892886161804, 0.043476443737745285, -0.10152594745159149, -0.06611411273479462, 0.12895341217517853, 0.05899183452129364, -0.007676251698285341, -0.08716998994350433, -0.0503624826669693, -0.026551591232419014, 0.026706937700510025, 0.002877657301723957, 0.14742636680603027, -0.0009386540623381734, 0.08129248768091202, -0.07119432836771011, -0.0326869897544384, -0.014757576398551464, 0.02048742026090622, -0.0014713386772200465, 0.07701334357261658, 0.030315039679408073, -0.123326376080513, 0.14050978422164917, 0.18341007828712463, -0.011007457040250301, 0.15907487273216248, -0.017657464370131493, -0.07142040878534317, 0.023290002718567848, 0.023794231936335564, 0.03128909319639206, 0.10738104581832886, -0.013750875368714333, -0.01562225166708231, -0.0051301149651408195, -0.005151465069502592, 0.05098606273531914, -0.06805779039859772, 0.043386105448007584, -0.030464982613921165, -0.04671362414956093, 0.07873266935348511, -0.005185266025364399, -0.031684935092926025, 0.06345967203378677, -0.010860173963010311, 0.029284248128533363, -0.03402290865778923, -0.058553267270326614, -0.06381300836801529, 0.14887920022010803, -0.12951986491680145, -0.1147008091211319, -0.08509353548288345, -0.037556231021881104, -0.06689620763063431, 0.04081464186310768, 0.03632345050573349, -0.05958276242017746, -0.04988475516438484, -0.09592834115028381, 0.055415503680706024, 0.030314190313220024, -0.062260620296001434, -0.0949050784111023, 0.016550272703170776, -0.04676758497953415, -0.15424002707004547, 0.039830636233091354, 0.0007813768461346626, -0.18668222427368164, 0.048697371035814285, -0.027418818324804306, 0.01955459825694561, 0.06087042763829231, 0.057740796357393265, -0.014584320597350597, -0.039891619235277176, 0.1440933495759964, -0.09286872297525406, 0.061010587960481644, 0.00586841581389308, 0.013014841824769974, 0.015083624981343746, 0.09435775130987167, 0.007420203648507595, -0.030540021136403084, -0.02271156944334507, -0.018045959994196892, 
-0.04258674755692482, -0.2734783887863159, -0.12722541391849518, -0.05559706687927246, 0.033357057720422745, 0.09909208863973618, 0.05600593611598015, 0.012810679152607918, 0.08036039024591446, -0.14237934350967407, 0.09442663937807083, 0.045954927802085876, 0.06990376859903336, -0.006202222313731909, 0.02453441545367241, -0.03766391798853874, -0.07656694203615189, -0.015342782251536846, 0.11384940892457962, 0.1507849097251892, 0.2119559496641159, -0.04158365726470947, 0.2366839498281479, 0.004016585182398558, 0.09641023725271225, -0.03940950706601143, 0.05684905871748924, 0.04321756958961487, 0.09444303810596466, -0.01766849495470524, -0.13667328655719757, -0.03747931122779846, 0.10998255759477615, 0.027804898098111153, -0.029442986473441124, 0.0812227874994278, -0.08120094239711761, 0.07793274521827698, 0.19164563715457916, 0.08524839580059052, -0.1472911387681961, -0.05003649368882179, 0.08127903193235397, 0.0016266601160168648, -0.0584738552570343, 0.0393759123980999, 0.13097451627254486, -0.07175390422344208, 0.07100468128919601, -0.003231913084164262, 0.07505656778812408, -0.1001763865351677, -0.01561770960688591, 0.023206079378724098, -0.014065011404454708, -0.00867130234837532, 0.08323908597230911, -0.21302524209022522, 0.1470976024866104, 0.07943658530712128, 0.052319642156362534, -0.09383436292409897, 0.009718982502818108, 0.022348565980792046, -0.05411110445857048, 0.13358990848064423, 0.057931262999773026, -0.04885929077863693, -0.03656921535730362, -0.14972183108329773, 0.023362236097455025, 0.07671179622411728, -0.04344640672206879, 0.07869520783424377, 0.06328757107257843, -0.007036656606942415, -0.056642379611730576, 0.0007692244253121316, -0.08235461264848709, -0.21166786551475525, 0.030116098001599312, 0.02931120991706848, -0.02664969488978386, -0.034437134861946106, -0.025101179257035255, -0.028382379561662674, 0.17835454642772675, -0.18176613748073578, -0.10895106196403503, -0.06192240118980408, -0.016238991171121597, 0.1344582438468933, -0.08801810443401337, -0.0005767233669757843, 0.010440084151923656, 0.011672988533973694, -0.08496101200580597, -0.14107631146907806, -0.008666972629725933, -0.0560566708445549, -0.0823364332318306, -0.03639496490359306, 0.16707055270671844, 0.06351068615913391, 0.0332973338663578, -0.0005230362294241786, 0.027300525456666946, -0.02432788722217083, -0.08887643367052078, -0.01679234765470028, 0.138465017080307, 0.0744432657957077, 0.07681253552436829, -0.011645552702248096, -0.15753640234470367, -0.09172695875167847, -0.006190381944179535, 0.03125263378024101, 0.15759743750095367, -0.023084597662091255, 0.04687867313623428, 0.134768545627594, -0.11262786388397217, -0.18441180884838104, -0.03507920354604721, -0.00676099443808198, -0.0003712422330863774, -0.07213915884494781, -0.20775824785232544, 0.05898718908429146, 0.043281204998493195, 0.007975560612976551, 0.17441441118717194, -0.21350418031215668, -0.0815160796046257, 0.06303545832633972, 0.0009229593561030924, -0.022974863648414612, -0.14311188459396362, -0.08034998923540115, -0.017860647290945053, -0.031543903052806854, 0.11811478435993195, -0.1394716203212738, 0.06059088930487633, 0.012682227417826653, 0.011026142165064812, 0.029519526287913322, -0.060628920793533325, 0.06648305058479309, 0.13508062064647675, 0.04231707379221916, -0.05599820613861084, 0.010481299832463264, 0.11317815631628036, -0.030635306611657143, 0.027232015505433083, -0.02691969834268093, 0.023410478606820107, -0.0942026898264885, 0.019023794680833817, -0.060405753552913666, 0.10234477370977402, 
-0.054473187774419785, -0.07480942457914352, -0.0730506107211113, 0.09331101924180984, 0.10507045686244965, 0.027278952300548553, 0.16360054910182953, 0.04794967547059059, 0.04423503205180168, 0.11870285123586655, 0.08047015219926834, 0.062207214534282684, 0.04862276464700699, -0.020201317965984344, -0.014040237292647362, 0.08100668340921402, -0.0628083124756813, -0.018380863592028618, 0.08136026561260223, 0.007122153881937265, 0.02346218377351761, -0.029438627883791924, -0.18299733102321625, -0.0004574551712721586, 0.04691388085484505, -0.1468242108821869, 0.008945571258664131, -0.05212291330099106, 0.02971077896654606, -0.02538638934493065, 0.053478874266147614, 0.1291482150554657, -0.06807848066091537, -0.031722284853458405, -0.028546186164021492, 0.030430858954787254, 0.005517172161489725, 0.0955299437046051, -0.0019602130632847548, -0.027137255296111107, -0.08253622055053711, 0.08850617706775665, 0.16496394574642181, -0.1311640739440918, -0.028443019837141037, 0.1627311408519745, -0.10677964985370636, -0.050590481609106064, -0.03398260846734047, -0.008083558641374111, -0.0865737721323967, -0.007937646470963955, 0.05778224393725395, -0.031907204538583755, -0.06727880984544754, 0.14436250925064087, 0.026001600548624992, 0.08760417997837067, 0.033093541860580444, 0.022779555991292, -0.040803831070661545, 0.05494898185133934, -0.034367308020591736, 0.026573605835437775, 0.015232134610414505, 0.07837510108947754, -0.009095768444240093, 0.01669776812195778, 0.01071322150528431, 0.005293338559567928, -0.0929599329829216, -0.040845852345228195, -0.08275923877954483, 0.05633251741528511, -0.0904315784573555, 0.015674227848649025, -0.04251402989029884, 0.018954159691929817, -0.011661804281175137, 0.0003125857620034367, -0.01201506145298481, 0.001393132726661861, -0.06004830822348595, 0.06573520600795746, -0.11674489080905914, -0.0344897136092186, -0.011522671207785606, -0.08490435779094696, 0.09716645628213882, 0.03448695316910744, 0.02838989533483982, 0.05182987451553345, -0.08112805336713791, -0.03307254984974861, -0.019772376865148544, 0.07149030268192291, 0.024275973439216614, -0.02871098928153515, -0.008783440105617046, 0.02084299363195896, 0.02650868147611618, -0.011488838121294975, -0.012897839769721031, -0.03477022424340248, 0.053915925323963165, -0.022218793630599976, 0.010968777351081371, -0.0558101087808609, 0.05485257878899574, 0.0575450137257576, 0.002636091783642769, 0.06453025341033936, -0.0781165137887001, 0.06738576292991638, -0.12067550420761108, -0.003595157992094755, 0.022736918181180954, -0.07018494606018066, -0.11947400122880936, 0.0016332672676071525, 0.07201481610536575, 0.02285289391875267, 0.17598731815814972, -0.008626582100987434, 0.005386320408433676, 0.012619469314813614, -0.018918730318546295, -0.1045379638671875, 0.011520801112055779, 0.1356874406337738, 0.026089197024703026, -0.011356659233570099, -0.013393083587288857, -0.040691375732421875, -0.03235167637467384, -0.005716252140700817, 0.12250027060508728, 0.13820651173591614, 0.10892435163259506, 0.06973294913768768, 0.012699308805167675, -0.12695397436618805, 0.017094602808356285, -0.054028015583753586, 0.03835245594382286, 0.03321658819913864, -0.07487647980451584, 0.021218329668045044, 0.08910571038722992, -0.1511288434267044, 0.016545584425330162, -0.0023897977080196142, -0.045221634209156036, -0.06973970681428909, -0.15263327956199646, -0.04777318984270096, 0.028364887461066246, 0.007320929318666458, -0.09028588980436325, 0.01888399012386799, 0.03241394832730293, 0.05194777622818947, -0.04055352136492729, 
0.11112239956855774, 0.012388687580823898, -0.09615519642829895, 0.047005075961351395, 0.04884660243988037, 0.03355303779244423, 0.013174735940992832, -0.02964959666132927, -0.023009201511740685, 0.01831822656095028, 0.0323403999209404, 0.043414805084466934, 0.12011784315109253, 0.058342594653367996, -0.03553216531872749, -0.07002890855073929, -0.0324922576546669, 0.02136809192597866, 0.039625898003578186, 0.20622184872627258, 0.04949896037578583, -0.005455391947180033, 0.02111220546066761, 0.09263500571250916, -0.030279455706477165, -0.07460232824087143, -0.05026067793369293, 0.14576837420463562, -0.019706545397639275, -0.024608079344034195, 0.014577227644622326, -0.08891043812036514, 0.017507649958133698, 0.16011904180049896, 0.2821750342845917, -0.014267854392528534, 0.004866534378379583, -0.013090208172798157, 0.0010002633789554238, -0.017491530627012253, 0.08693181723356247, -0.003445147071033716, 0.18372994661331177, -0.03775688633322716, -0.012635813094675541, 0.05278528109192848, -0.0005030907923355699, -0.10915442556142807, 0.07795495539903641, -0.0835888534784317, -0.016175679862499237, 0.009473403915762901, 0.059612590819597244, -0.06890653818845749, -0.20563596487045288, 0.03993727266788483, -0.10688146203756332, -0.07443029433488846, 0.026499323546886444, 0.02804049476981163, 0.068538136780262, 0.040953103452920914, 0.00664184708148241, -0.014095529913902283, 0.1516658067703247, -0.020208626985549927, -0.11930791288614273, -0.04954714700579643, 0.07445889711380005, -0.19968828558921814, 0.22488537430763245, 0.009721805341541767, 0.08457841724157333, 0.0318308062851429, 0.0013064071536064148, -0.12188608199357986, 0.0386592261493206, 0.06318972259759903, -0.06334012001752853, -0.003547872183844447, 0.09290598332881927, -0.05908837914466858, 0.13861942291259766, 0.052432768046855927, 0.009688140824437141, 0.03194184601306915, 0.1967620849609375, 0.01839141920208931, -0.09037904441356659, 0.05127078294754028, -0.09666477143764496, 0.14695480465888977, 0.12555573880672455, -0.0337434820830822, 0.008651363663375378, -0.022334663197398186, -0.04695365950465202, 0.02142868936061859, 0.08174039423465729, -0.010355123318731785, -0.06995174288749695, -0.03975085914134979, -0.209289088845253, 0.06994033604860306, -0.13161683082580566, 0.037916842848062515, -0.039520230144262314, -0.02936408855021, -0.08074557036161423, 0.12079179286956787, -0.01127778273075819, -0.015304245054721832, -0.021622581407427788, 0.06299802660942078, -0.02445082925260067, 0.009919168427586555, -0.1099969819188118, -0.10091588646173477 ]
null
null
transformers
# hebrew_poetry-gpt_neo-small Hebrew poetry text generation model, fine-tuned upon [hebrew-gpt_neo-small](https://huggingface.co/Norod78/hebrew-gpt_neo-small) which was trained using [EleutherAI's gpt-neo](https://github.com/EleutherAI/gpt-neo). Fine-tuning was done using [@minimaxir](https://twitter.com/minimaxir)'s [aitextgen](https://github.com/minimaxir/aitextgen). ## Datasets 1. Text from [New stage](http://stage.co.il/) 2. A dataset containing Hebrew lyrics
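Unlike the two hebrew-gpt_neo-xl cards above, this card stops at the dataset list and carries no usage sample. The sketch below mirrors the "Simple usage sample code" sections of those sibling cards, loading the model id given in this record (Norod78/hebrew_poetry-gpt_neo-small) through the standard transformers API; the prompt is taken from this card's widget metadata, while the generation parameters (max_length, top_k, top_p, num_return_sequences) are illustrative assumptions rather than values published by the author.

```python
# Minimal usage sketch for Norod78/hebrew_poetry-gpt_neo-small.
# Mirrors the sample code of the sibling hebrew-gpt_neo-xl cards; the
# generation parameters below are illustrative assumptions, not values from the card.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "Norod78/hebrew_poetry-gpt_neo-small"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, pad_token_id=tokenizer.eos_token_id)

# Prompt taken from this card's widget metadata.
prompt_text = "פעם אחת לפני שנ"
input_ids = tokenizer.encode(prompt_text, return_tensors="pt")

# Sample a few short poetry continuations.
sample_outputs = model.generate(
    input_ids,
    do_sample=True,
    max_length=128,
    top_k=50,
    top_p=0.95,
    num_return_sequences=3,
)

for i, sample_output in enumerate(sample_outputs):
    print(f"{i}: {tokenizer.decode(sample_output, skip_special_tokens=True)}")
```

As in the XL samples, the decoded outputs can additionally be trimmed at the `<|endoftext|>` stop token or at the first run of blank lines if shorter poems are wanted.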
{"language": "he", "license": "mit", "thumbnail": "https://avatars1.githubusercontent.com/u/3617152?norod.jpg", "widget": [{"text": "\u05e4\u05e2\u05dd \u05d0\u05d7\u05ea \u05dc\u05e4\u05e0\u05d9 \u05e9\u05e0"}, {"text": "\u05d4\u05d9\u05dd \u05db\u05d7\u05d5\u05dc \u05d5\u05d0\u05e0\u05d9 \u05d7"}, {"text": "\u05e9\u05dd \u05d4\u05d9\u05e6\u05d9\u05e8\u05d4:"}, {"text": "\u05db\u05e9\u05d4\u05de\u05db\u05d5\u05e0\u05d5\u05ea"}]}
text-generation
Norod78/hebrew_poetry-gpt_neo-small
[ "transformers", "pytorch", "jax", "safetensors", "gpt_neo", "text-generation", "he", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "he" ]
TAGS #transformers #pytorch #jax #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #region-us
# hebrew_poetry-gpt_neo-small Hebrew poetry text generation model, fine-tuned upon hebrew-gpt_neo-small which was trained using EleutherAI's gpt-neo. Fine-tuning was done using @minimaxir's aitextgen. ## Datasets 1. Text from New stage 2. A dataset containing Hebrew lyrics
[ "# hebrew_poetry-gpt_neo-small\n\nHebrew poetry text generation model, fined tuned upon hebrew-gpt_neo-small which was trained using EleutherAI's gpt-neo. \nFine-tuning was done using @minimaxir's aitextgen.", "## Datasets\n\n1. Text from New stage\n2. A dataset containing Hebrew lyrics" ]
[ "TAGS\n#transformers #pytorch #jax #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #region-us \n", "# hebrew_poetry-gpt_neo-small\n\nHebrew poetry text generation model, fined tuned upon hebrew-gpt_neo-small which was trained using EleutherAI's gpt-neo. \nFine-tuning was done using @minimaxir's aitextgen.", "## Datasets\n\n1. Text from New stage\n2. A dataset containing Hebrew lyrics" ]
[ 54, 71, 18 ]
[ "passage: TAGS\n#transformers #pytorch #jax #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# hebrew_poetry-gpt_neo-small\n\nHebrew poetry text generation model, fined tuned upon hebrew-gpt_neo-small which was trained using EleutherAI's gpt-neo. \nFine-tuning was done using @minimaxir's aitextgen.## Datasets\n\n1. Text from New stage\n2. A dataset containing Hebrew lyrics" ]
[ -0.026854436844587326, 0.07951466739177704, -0.004652809351682663, 0.09018399566411972, 0.0801687017083168, 0.0025408603250980377, 0.19545963406562805, 0.05285925790667534, -0.0410558320581913, -0.053997963666915894, 0.16818754374980927, 0.04330594092607498, 0.0005676037399098277, 0.04853442311286926, -0.0008420488447882235, -0.21653355658054352, -0.05011507496237755, 0.01964886672794819, 0.11888273060321808, 0.0987808108329773, 0.11448732763528824, -0.004909608978778124, 0.09318146109580994, -0.029682310298085213, -0.01224722433835268, -0.0029043774120509624, 0.009048671461641788, -0.0685369223356247, 0.10815320163965225, 0.05986057221889496, -0.032154716551303864, 0.03958171606063843, -0.04182074964046478, -0.1473984718322754, 0.0416160486638546, -0.007716025225818157, -0.015411853790283203, -0.05634326487779617, 0.06256826967000961, -0.13727101683616638, 0.27138835191726685, -0.08536725491285324, -0.08851397782564163, -0.013805241324007511, -0.1441957950592041, -0.08895052224397659, -0.03810100257396698, 0.03585760295391083, 0.11802574247121811, 0.1445203274488449, -0.0197602566331625, 0.1676015704870224, -0.02127210795879364, 0.09528714418411255, 0.06811650842428207, -0.35251766443252563, -0.03741970285773277, 0.057400427758693695, -0.041788674890995026, 0.11137928813695908, -0.09880822151899338, 0.06759102642536163, 0.0567656010389328, 0.061060234904289246, -0.028906486928462982, -0.08117914944887161, -0.10984932631254196, 0.04871084541082382, -0.12048781663179398, -0.028589680790901184, 0.2557906210422516, -0.08019447326660156, -0.009513177908957005, -0.05302780121564865, -0.10198196768760681, 0.02063673362135887, 0.02609770931303501, -0.029169421643018723, -0.060123734176158905, 0.029058340936899185, 0.021664736792445183, -0.1357031911611557, -0.1370527744293213, 0.015980953350663185, -0.1260763704776764, 0.032087575644254684, 0.015040644444525242, -0.0007265549502335489, -0.0706707239151001, 0.0407436266541481, -0.054788943380117416, -0.09731433540582657, -0.03076542727649212, -0.0721755102276802, 0.18686266243457794, -0.05739768594503403, -0.07533000409603119, -0.09384087473154068, 0.017431920394301414, 0.1618209332227707, 0.08830905705690384, 0.021730227395892143, 0.012946439906954765, 0.03747538477182388, 0.027128828689455986, 0.06821198016405106, 0.16088242828845978, -0.06522643566131592, 0.07244046032428741, 0.011685093864798546, 0.16812445223331451, -0.002326139947399497, -0.13049322366714478, -0.03619026765227318, 0.042682092636823654, 0.024638889357447624, -0.07175812870264053, 0.09658598154783249, 0.05778196081519127, -0.003734975354745984, 0.05963557958602905, -0.1037292405962944, -0.038490261882543564, -0.026270467787981033, 0.04367992654442787, -0.04520965740084648, -0.043845873326063156, 0.006569872610270977, -0.09339118748903275, 0.03263239562511444, -0.08514910936355591, 0.03551880642771721, 0.058059532195329666, -0.01609690859913826, 0.025763604789972305, -0.04143116623163223, 0.10187076777219772, -0.2383507341146469, -0.19261585175991058, 0.017088571563363075, 0.008931045420467854, 0.014383712783455849, -0.025110434740781784, -0.08459749817848206, -0.0008683608612045646, 0.01627560704946518, -0.09119372069835663, -0.08989182859659195, -0.08236685395240784, 0.08331143856048584, 0.012178454548120499, 0.02806130237877369, -0.0636332631111145, 0.014486941508948803, -0.23052194714546204, -0.05455411970615387, -0.02925264835357666, 0.050892747938632965, 0.03326927870512009, -0.07785158604383469, -0.06746753305196762, -0.01664767414331436, 0.05269967392086983, 
-0.0035798996686935425, -0.01817881315946579, 0.28998249769210815, -0.20476685464382172, -0.12595510482788086, 0.13840371370315552, -0.12032219022512436, -0.1287645846605301, 0.16748514771461487, 0.005667032208293676, 0.010228062979876995, 0.18840070068836212, 0.2916065454483032, -0.0055442131124436855, 0.0015260301297530532, -0.14758028090000153, 0.09298357367515564, 0.08985096961259842, -0.00836676824837923, 0.08345211297273636, 0.04749641194939613, -0.011425974778831005, -0.0013804465997964144, -0.05088998004794121, 0.055327240377664566, 0.026194032281637192, -0.10953988134860992, -0.024945467710494995, -0.0012692348100245, 0.07115329802036285, 0.004876691848039627, 0.0418335385620594, -0.0873197540640831, -0.05166499689221382, -0.12087543308734894, 0.06121910735964775, 0.012247717007994652, 0.11608630418777466, -0.06999955326318741, 0.08185373246669769, -0.04300495609641075, 0.02503274381160736, -0.06295675784349442, -0.02627142332494259, -0.07607634365558624, 0.10691346228122711, 0.10256516933441162, -0.0780717134475708, 0.07401104271411896, 0.010506436228752136, -0.06536858528852463, 0.05103537440299988, 0.03891042619943619, -0.039252907037734985, -0.08570549637079239, -0.09501280635595322, 0.08383290469646454, -0.006448812782764435, 0.06666912138462067, -0.13715501129627228, -0.048663269728422165, 0.020687373355031013, 0.05482591688632965, 0.008267107419669628, 0.03148575499653816, -0.010658984072506428, 0.007181831169873476, -0.07115896046161652, -0.08485634624958038, 0.028855936601758003, 0.07152716815471649, -0.14360278844833374, 0.13324351608753204, -0.15735404193401337, 0.10067389160394669, 0.19617511332035065, -0.05092901363968849, -0.00036805515992455184, -0.00852520763874054, -0.03245912492275238, 0.009192840196192265, 0.06861048936843872, 0.07877156883478165, 0.1482209861278534, -0.05032363161444664, 0.1153985857963562, -0.062131986021995544, 0.004321919288486242, -0.015928152948617935, -0.09223330020904541, -0.016310226172208786, 0.12261076271533966, 0.05468936264514923, -0.16681550443172455, 0.17747828364372253, 0.29687052965164185, -0.06807834655046463, 0.2557408809661865, 0.0031801897566765547, 0.005829020868986845, 0.029945889487862587, -0.03295036032795906, -0.036424458026885986, 0.03699337691068649, -0.10507908463478088, -0.02222919650375843, 0.024204585701227188, 0.017711423337459564, 0.09729745239019394, -0.09922773391008377, -0.05096028745174408, -0.03336145728826523, -0.005158065352588892, 0.041632555425167084, 0.12304560095071793, -0.069913849234581, 0.06459546089172363, -0.049376167356967926, -0.0154722286388278, 0.059519845992326736, 0.0759715586900711, -0.11407549679279327, 0.15239477157592773, -0.14346729218959808, -0.1916504055261612, -0.028983693569898605, -0.1573752462863922, -0.11511936783790588, 0.05421830341219902, 0.12504726648330688, -0.12370295077562332, -0.0015398795949295163, -0.046997662633657455, 0.14955773949623108, -0.07812276482582092, -0.024866754189133644, -0.09651584178209305, -0.06959362328052521, -0.1373445987701416, -0.07371202856302261, -0.057387590408325195, -0.010537144728004932, -0.12377440184354782, 0.07936722785234451, -0.07589637488126755, 0.001110017648898065, 0.10589069873094559, 0.08817145973443985, -0.010397757403552532, -0.053313594311475754, 0.309200257062912, -0.08464469015598297, -0.005120971705764532, 0.20729118585586548, 0.053657058626413345, 0.03648072108626366, 0.10218000411987305, -0.029411500319838524, -0.05436133220791817, 0.03784538432955742, 0.001930101541802287, -0.0638280138373375, -0.2222977727651596, 
-0.12384208291769028, -0.06627780199050903, 0.09493715316057205, 0.005626375786960125, 0.06208501756191254, -0.013098889030516148, 0.12105977535247803, -0.11972326785326004, 0.10311122983694077, -0.04629991576075554, 0.0981324091553688, 0.249899223446846, 0.028103215619921684, 0.04522290080785751, -0.05712409317493439, -0.10520651191473007, 0.11388741433620453, 0.027956489473581314, 0.07643954455852509, 0.028954189270734787, 0.05894933640956879, 0.06433473527431488, 0.11514189094305038, 0.028223350644111633, -0.08074511587619781, 0.01223727222532034, 0.005801938008517027, -0.041382838040590286, -0.09491149336099625, -0.07422524690628052, 0.07911461591720581, -0.07757149636745453, -0.05677369609475136, -0.004287612624466419, 0.029192306101322174, 0.07222242653369904, -0.0575786791741848, 0.0773007944226265, -0.20824392139911652, -0.056468892842531204, 0.10370224714279175, 0.010953426361083984, -0.07615461945533752, 0.0861336886882782, -0.02953285165131092, -0.09688877314329147, 0.12888509035110474, -0.03684104233980179, 0.07559654861688614, -0.01222056895494461, 0.056960977613925934, -0.09581358730792999, -0.11941162496805191, 0.024120453745126724, 0.10494840890169144, -0.336072713136673, 0.1743251532316208, 0.07138736546039581, 0.10303836315870285, -0.04691382497549057, -0.016520492732524872, 0.06038525328040123, 0.15576477348804474, 0.175887331366539, 0.048177268356084824, -0.07983891665935516, -0.011019903235137463, -0.039559341967105865, 0.03838836029171944, 0.10287878662347794, -0.03561253845691681, 0.052792225033044815, -0.018444666638970375, 0.04200414568185806, -0.07073596864938736, 0.10630222409963608, -0.10227177292108536, -0.21957574784755707, 0.012670730240643024, 0.13180001080036163, 0.0036316365003585815, 0.03168090805411339, -0.05566385015845299, -0.0256718248128891, 0.1198936179280281, -0.04632199555635452, -0.15097437798976898, -0.044378023594617844, -0.06440752744674683, -0.006632338743656874, -0.10416386276483536, -0.046276722103357315, -0.0191622544080019, -0.004969310015439987, -0.11289700865745544, -0.1578114926815033, 0.00681346096098423, -0.07281355559825897, -0.004518951289355755, 0.017836255952715874, 0.23624473810195923, -0.002267198171466589, 0.03667927533388138, 0.02548540011048317, 0.03233101963996887, -0.09579619020223618, -0.086149200797081, -0.0011635525152087212, -0.03803315758705139, 0.06467878073453903, -0.05173467472195625, 0.051470596343278885, -0.05280458554625511, -0.11862772703170776, -0.12447770684957504, 0.14709942042827606, 0.1816222369670868, 0.04014494642615318, 0.09973104298114777, 0.1319986879825592, -0.059409335255622864, -0.22009091079235077, -0.14471788704395294, -0.07655685395002365, 0.002281132386997342, 0.008650216273963451, -0.2120121717453003, 0.05143112689256668, -0.0987718403339386, 0.020047122612595558, 0.10059253871440887, -0.28702878952026367, -0.08832854777574539, 0.1545422524213791, -0.02814728207886219, 0.3078398108482361, -0.11467231065034866, -0.08858274668455124, -0.03906935453414917, -0.08644390106201172, 0.11285006254911423, -0.05544354394078255, 0.12194205820560455, -0.008375688455998898, 0.12240130454301834, -0.005110043566673994, 0.020091334357857704, 0.11949580162763596, 0.06728699058294296, 0.06066032499074936, -0.09724689275026321, -0.062170471996068954, 0.07724842429161072, 0.037980373948812485, -0.03742020204663277, -0.14731885492801666, 0.023058850318193436, -0.12547393143177032, -0.06461641937494278, -0.07495340704917908, 0.06999467313289642, 0.005903895478695631, -0.14848323166370392, -0.051642872393131256, 
0.0604986771941185, -0.024039631709456444, -0.011554768308997154, 0.17006312310695648, -0.004988296423107386, 0.06384450197219849, -0.0821836069226265, 0.09756723791360855, -0.08034748584032059, -0.044962845742702484, -0.06699736416339874, -0.03476686775684357, 0.10984104871749878, -0.11764275282621384, -0.03741388022899628, 0.10140097141265869, -0.020579595118761063, 0.0030297390185296535, 0.01984834484755993, -0.05747710540890694, 0.014515741728246212, 0.0831732451915741, -0.24140986800193787, -0.11737305670976639, -0.11143574863672256, 0.011778231710195541, 0.07016205042600632, 0.07834260165691376, 0.20585472881793976, -0.07127539813518524, -0.07148562371730804, -0.03449934720993042, -0.025782376527786255, 0.015750069171190262, 0.030179720371961594, -0.04390202462673187, -0.037466537207365036, -0.10620042681694031, 0.03817285969853401, 0.0957358106970787, -0.16534127295017242, 0.04299887642264366, 0.11437255144119263, -0.11715962737798691, -0.08818641304969788, -0.061521291732788086, 0.09942715615034103, -0.14345704019069672, -0.01099190954118967, -0.003474320052191615, -0.07131130993366241, -0.007656245492398739, 0.09776246547698975, 0.09783676266670227, 0.06313755363225937, -0.06657497584819794, -0.029454190284013748, 0.00021781852410640568, 0.02612515166401863, 0.06717035174369812, -0.0027400613762438297, -0.12224103510379791, 0.08121801167726517, -0.05864504724740982, 0.13494299352169037, -0.017685601487755775, -0.014952591620385647, -0.024487433955073357, 0.007894874550402164, -0.02525433711707592, 0.03759392350912094, -0.0942222997546196, -0.014835026115179062, -0.03550855070352554, 0.02991441823542118, -0.14341290295124054, -0.021030951291322708, -0.02746669016778469, 0.008221698924899101, -0.012533576227724552, 0.08519575744867325, -0.07571273297071457, -0.011238674633204937, 0.017430206760764122, -0.031238829717040062, 0.08032343536615372, 0.08810001611709595, 0.022608546540141106, 0.13696040213108063, -0.06618662178516388, -0.02486184425652027, 0.009654955938458443, 0.058509767055511475, -0.007856318727135658, 0.03866397216916084, -0.032850924879312515, 0.07837432622909546, 0.0709402859210968, 0.054727520793676376, 0.011781850829720497, -0.05439909175038338, -0.03692365065217018, 0.01071460172533989, -0.03313622251152992, -0.056806404143571854, 0.024675412103533745, 0.001113285543397069, 0.024940362200140953, 0.17556659877300262, -0.11565254628658295, -0.0005059895920567214, -0.08839146792888641, 0.050770629197359085, -0.010427364148199558, -0.1204632818698883, -0.15480466187000275, -0.026060717180371284, 0.021791938692331314, -0.008572475053369999, 0.14387871325016022, 0.05330386012792587, -0.060154978185892105, 0.04636874794960022, 0.121935173869133, 0.03813454881310463, -0.0927310585975647, 0.19097298383712769, 0.0951792448759079, -0.06598059833049774, -0.05035742372274399, -0.009255698882043362, 0.025889132171869278, 0.1622990071773529, 0.12135042995214462, 0.1017444059252739, 0.15020713210105896, 0.13239218294620514, -0.14760448038578033, -0.023753052577376366, -0.07211878150701523, -0.1432323157787323, 0.05023874342441559, 0.052752841264009476, -0.012886503711342812, 0.09869135916233063, 0.22327762842178345, -0.012733353301882744, -0.008778286166489124, -0.07617948204278946, -0.04771618917584419, -0.1710374355316162, -0.2515398859977722, -0.07507530599832535, -0.03660290688276291, 0.03538614511489868, -0.07116004824638367, 0.01174548827111721, 0.08330797404050827, 0.06535018980503082, -0.07683050632476807, 0.11251990497112274, 0.014385669492185116, -0.16004562377929688, 
0.06258013099431992, -0.07230155169963837, 0.03952017426490784, 0.04439736157655716, -0.03340704366564751, -0.03756493330001831, -0.14848682284355164, -0.0038842957001179457, 0.013202592730522156, 0.09098581224679947, 0.04504590854048729, -0.11211816966533661, -0.07441060245037079, -0.07881274074316025, 0.060169484466314316, 0.08376763761043549, 0.14159135520458221, -0.002044741064310074, -0.01657962054014206, 0.06695091724395752, 0.25347834825515747, 0.040846288204193115, -0.12554140388965607, 0.00039943106821738183, 0.2132708728313446, 0.02771434560418129, 0.026709433645009995, 0.0067489733919501305, -0.00530324038118124, 0.027323104441165924, 0.18661557137966156, 0.23706717789173126, -0.04971054196357727, 0.008874056860804558, -0.025580894201993942, 0.019990170374512672, 0.01398409716784954, 0.07814843207597733, 0.03134216368198395, 0.17695748805999756, -0.10211367905139923, -0.06184229254722595, -0.09129102528095245, -0.005421950947493315, -0.06398816406726837, 0.18247629702091217, 0.030756698921322823, 0.011996343731880188, 0.027593307197093964, 0.05820711702108383, -0.11264362931251526, 0.06315762549638748, -0.13483978807926178, -0.15565305948257446, -0.07527273148298264, 0.04246070608496666, 0.11328698694705963, 0.0684618130326271, 0.10473804175853729, 0.006456618197262287, -0.01600259728729725, -0.05335868149995804, 0.01946808397769928, -0.18765921890735626, 0.036446359008550644, 0.1345982551574707, -0.027333088219165802, 0.11493885517120361, -0.007686029188334942, 0.07261434197425842, 0.032997287809848785, 0.051039181649684906, 0.008943548426032066, 0.13045556843280792, 0.030718239024281502, 0.075676329433918, 0.06181972101330757, 0.04564574360847473, -0.01613738387823105, 0.01005468051880598, 0.0691947489976883, 0.016804415732622147, 0.05123094096779823, 0.2424250692129135, -0.048032887279987335, -0.056180838495492935, 0.11840619146823883, -0.14532750844955444, 0.07394111156463623, 0.03371764346957207, -0.002813582308590412, -0.01563071645796299, -0.025144170969724655, 0.012488177046179771, 0.043997153639793396, 0.04351493716239929, 0.04457763582468033, -0.06136072054505348, -0.0555904358625412, -0.05053706839680672, -0.00533894682303071, -0.22982515394687653, -0.018456146121025085, -0.09093134105205536, 0.05023860186338425, -0.16972441971302032, 0.09313415735960007, 0.00573156401515007, -0.052640266716480255, 0.009770467877388, -0.009427018463611603, 0.014638116583228111, -0.04617668315768242, -0.06651024520397186, -0.13094046711921692 ]
null
null
transformers
# hebrew_stories-gpt_neo-small

Hebrew story-text generation model, fine-tuned on [hebrew-gpt_neo-small](https://huggingface.co/Norod78/hebrew-gpt_neo-small), which was trained using [EleutherAI's gpt-neo](https://github.com/EleutherAI/gpt-neo).

## Dataset

Text from various Hebrew books
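The card itself gives no usage snippet, so below is a minimal sketch of how one might generate story text from this checkpoint with the standard transformers causal-LM API. The sampling parameters (`max_length`, `top_k`, `top_p`, `temperature`) are illustrative choices, not values documented by the author; the prompt is taken from the widget examples in the card metadata.

```python
# Minimal sketch: load the checkpoint and sample a short Hebrew story continuation.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "Norod78/hebrew_stories-gpt_neo-small"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "פעם אחת, לפני שנים רבות"  # one of the widget prompts from the card metadata
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,      # sample rather than greedy decode, for story-like variety
    max_length=100,      # illustrative value
    top_k=50,
    top_p=0.95,
    temperature=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```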
{"language": "he", "license": "mit", "thumbnail": "https://avatars1.githubusercontent.com/u/3617152?norod.jpg", "widget": [{"text": "\u05ea\u05e8\u05d9\u05e1\u05e8 \u05de\u05db\u05e9\u05e4\u05d5\u05ea \u05e1\u05d2"}, {"text": "\n\n\u05d4\u05d0\u05d9\u05e9 \u05d4\u05d0\u05d7\u05e8\u05d5\u05df \u05d1\u05e2\u05d5\u05dc\u05dd /"}, {"text": "\u05e4\u05e2\u05dd \u05d0\u05d7\u05ea, \u05dc\u05e4\u05e0\u05d9 \u05e9\u05e0\u05d9\u05dd \u05e8\u05d1\u05d5\u05ea"}, {"text": "\u05d4\u05e8\u05de\u05d9\u05d5\u05e0\u05d9 \u05d4\u05e1\u05ea\u05d9\u05e8\u05d4 \u05d0\u05ea"}, {"text": "\u05dc\u05e4\u05ea\u05e2, \u05d0\u05d5\u05e8 \u05d9\u05e8\u05d5\u05e7"}]}
text-generation
Norod78/hebrew_stories-gpt_neo-small
[ "transformers", "pytorch", "jax", "safetensors", "gpt_neo", "text-generation", "he", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "he" ]
TAGS #transformers #pytorch #jax #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #region-us
# hebrew_stories-gpt_neo-small Hebrew story-text generation model, fine-tuned on hebrew-gpt_neo-small, which was trained using EleutherAI's gpt-neo. ## Dataset Text from various Hebrew books
[ "# hebrew_stories-gpt_neo-small\n\nHebrew story-text generation model, fined tuned upon hebrew-gpt_neo-small which was trained using EleutherAI's gpt-neo.", "## Dataset\n\nText from various Hebrew books" ]
[ "TAGS\n#transformers #pytorch #jax #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #region-us \n", "# hebrew_stories-gpt_neo-small\n\nHebrew story-text generation model, fined tuned upon hebrew-gpt_neo-small which was trained using EleutherAI's gpt-neo.", "## Dataset\n\nText from various Hebrew books" ]
[ 54, 52, 9 ]
[ "passage: TAGS\n#transformers #pytorch #jax #safetensors #gpt_neo #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# hebrew_stories-gpt_neo-small\n\nHebrew story-text generation model, fined tuned upon hebrew-gpt_neo-small which was trained using EleutherAI's gpt-neo.## Dataset\n\nText from various Hebrew books" ]
[ -0.049285635352134705, 0.06700899451971054, -0.0037139456253498793, 0.07702337950468063, 0.06422193348407745, 0.043976154178380966, 0.24461698532104492, 0.07521187514066696, -0.010528549551963806, -0.09473501890897751, 0.1736890822649002, 0.05003129318356514, 0.010870653204619884, 0.11769156157970428, -0.01796419359743595, -0.20977653563022614, -0.042705174535512924, 0.05549290403723717, 0.13500092923641205, 0.10795246064662933, 0.09910214692354202, -0.017136476933956146, 0.07976531982421875, -0.018865065649151802, -0.030107004567980766, -0.0026319478638470173, 0.05018242076039314, -0.10187668353319168, 0.1450922191143036, 0.024648863822221756, 0.014684741385281086, 0.0022096869070082903, -0.020049892365932465, -0.13226249814033508, 0.045254409313201904, 0.009596679359674454, -0.022657854482531548, -0.028554042801260948, 0.045798204839229584, -0.09768424183130264, 0.2736815810203552, -0.02910616062581539, -0.073513463139534, -0.027113160118460655, -0.14523330330848694, -0.14138978719711304, -0.013074767775833607, 0.08540842682123184, 0.05275151878595352, 0.14653056859970093, -0.009143409319221973, 0.1733638346195221, -0.038831669837236404, 0.10174033790826797, 0.11260775476694107, -0.3239104449748993, -0.044271133840084076, 0.056133024394512177, -0.051532015204429626, 0.08203165978193283, -0.08639320731163025, 0.08380040526390076, 0.07667553424835205, 0.049534592777490616, -0.04983768239617348, -0.07131651788949966, -0.11815816909074783, 0.04146188870072365, -0.13176998496055603, -0.047845494002103806, 0.20360824465751648, -0.10627491027116776, 0.025302622467279434, -0.06675984710454941, -0.10827115923166275, 0.005210816860198975, 0.031064705923199654, 0.028012214228510857, -0.07306179404258728, 0.028309710323810577, 0.017248202115297318, -0.09576151520013809, -0.1026967242360115, -0.014324340038001537, -0.1264478713274002, 0.08314575999975204, 0.041761402040719986, 0.05548720434308052, -0.0939917340874672, 0.059529341757297516, 0.004431223962455988, -0.1070670336484909, -0.018780115991830826, -0.07672569155693054, 0.204867422580719, -0.0671781376004219, -0.02543569914996624, -0.004375269170850515, 0.04463048651814461, 0.11607321351766586, 0.03198417276144028, -0.006869457196444273, -0.02221699431538582, 0.05553460866212845, -0.017053455114364624, 0.05633080378174782, 0.15669536590576172, -0.07121752947568893, 0.07959339022636414, -0.00429918197914958, 0.1373845487833023, 0.01205122098326683, -0.12342485040426254, -0.00491290632635355, 0.010404369793832302, 0.044661007821559906, -0.08246666938066483, 0.13156820833683014, 0.09332896023988724, 0.003110079327598214, -0.03078603744506836, -0.09379061311483383, -0.05788044258952141, -0.03894583135843277, 0.03724540397524834, -0.002275758422911167, -0.051514286547899246, -0.03888876736164093, -0.08823560178279877, 0.0685911476612091, -0.09906034171581268, 0.006079602055251598, 0.012724620290100574, -0.020794294774532318, 0.028066158294677734, -0.034051183611154556, 0.09410055726766586, -0.2315550297498703, -0.26477310061454773, 0.03194195777177811, -0.010453934781253338, 0.041499409824609756, -0.0178227461874485, -0.06773196160793304, 0.012808329425752163, 0.012501794844865799, -0.07444724440574646, -0.07451196014881134, -0.10778137296438217, 0.08903500437736511, -0.008105101063847542, 0.01630907878279686, -0.0002086018503177911, -0.018192794173955917, -0.2139873504638672, -0.04356113448739052, -0.035606708377599716, 0.016354359686374664, 0.01584981195628643, 0.010786319151520729, -0.02277992106974125, -0.03125010058283806, 
0.07264646887779236, 0.020495181903243065, -0.07199691236019135, 0.27506858110427856, -0.14952422678470612, -0.13725703954696655, 0.23536798357963562, -0.14848017692565918, -0.17936839163303375, 0.1468934267759323, -0.009569717571139336, 0.051174264401197433, 0.20270214974880219, 0.30125701427459717, 0.028523625805974007, -0.005648380611091852, -0.12597717344760895, 0.0739603042602539, 0.05751638114452362, 0.012093666009604931, 0.059523895382881165, 0.04494832083582878, -0.09724549949169159, -0.009527341462671757, -0.07902996242046356, 0.06114897504448891, -0.012889026664197445, -0.09278840571641922, 0.0014030163874849677, -0.02318236045539379, 0.11438465118408203, 0.015864135697484016, 0.04659891501069069, -0.1138630136847496, -0.048669662326574326, -0.10992773622274399, 0.0950796827673912, -0.011648916639387608, 0.09696820378303528, -0.06845728307962418, 0.09068187326192856, -0.03112255595624447, 0.030980277806520462, -0.017133941873908043, -0.050345104187726974, -0.08580675721168518, 0.0753701776266098, 0.05976205691695213, -0.042335595935583115, 0.08388862013816833, 0.0009751502657309175, -0.09040923416614532, 0.024300269782543182, 0.04401479288935661, -0.035496171563863754, -0.033902350813150406, -0.09647100418806076, 0.09448955208063126, 0.01128383632749319, 0.0630410686135292, -0.1542249172925949, -0.0174191165715456, 0.006123602390289307, 0.055711302906274796, -0.011319468729197979, 0.05058402195572853, -0.04272804409265518, -0.015338829718530178, -0.04116439446806908, -0.12215668708086014, 0.0321829579770565, 0.05185101553797722, -0.12749527394771576, 0.10083680599927902, -0.17799730598926544, 0.20016437768936157, 0.19166751205921173, -0.08488734811544418, -0.014144180342555046, -0.02990552969276905, -0.03902055323123932, 0.029766060411930084, 0.013551642186939716, 0.04964258894324303, 0.13931581377983093, -0.035086434334516525, 0.11206760257482529, -0.049268320202827454, -0.022318992763757706, -0.012197266332805157, -0.0617571622133255, 0.03629012778401375, 0.10172870010137558, 0.06718729436397552, -0.20232005417346954, 0.2058285027742386, 0.2893863022327423, -0.03292117640376091, 0.2952875792980194, -0.003856595605611801, 0.023229122161865234, 0.05311262980103493, -0.06062290444970131, -0.008984693326056004, 0.042487964034080505, -0.140659362077713, -0.03372625634074211, 0.034614197909832, 0.00719717051833868, 0.08495413511991501, -0.09203243255615234, -0.06559804826974869, -0.05441765859723091, -0.014524677768349648, 0.02005375735461712, 0.12666870653629303, -0.07541103661060333, 0.07877743989229202, -0.05137656629085541, -0.011076039634644985, 0.06935667246580124, 0.06556285917758942, -0.09430143237113953, 0.18849539756774902, -0.11097396165132523, -0.2022469937801361, -0.068194180727005, -0.16101446747779846, -0.07392670214176178, 0.05485503375530243, 0.11575127393007278, -0.10575108230113983, -0.008489611558616161, -0.002133070956915617, 0.17639876902103424, -0.07846225798130035, 0.002699271310120821, -0.05853227153420448, -0.049577511847019196, -0.14474186301231384, -0.11544477194547653, -0.07535199075937271, -0.026173504069447517, -0.17392437160015106, 0.08379823714494705, -0.09226325899362564, -0.029287977144122124, 0.12275322526693344, 0.06901142001152039, 0.008876854553818703, -0.08866804093122482, 0.22513426840305328, -0.12533262372016907, -0.0252479687333107, 0.23264554142951965, 0.007116280030459166, 0.04038667678833008, 0.09280956536531448, -0.01582530327141285, -0.06242460384964943, 0.05707981437444687, -0.013481158763170242, -0.06542486697435379, 
-0.2566215395927429, -0.11643064767122269, -0.05987373739480972, 0.14515270292758942, 0.023772740736603737, 0.05548267066478729, 0.08075550198554993, 0.11054160445928574, -0.12595123052597046, 0.06507277488708496, -0.026469171047210693, 0.10274816304445267, 0.2233401983976364, 0.02014883980154991, 0.03599865734577179, -0.04389621689915657, -0.11288587749004364, 0.09270849078893661, 0.004684366751462221, 0.12419168651103973, 0.0325046144425869, 0.024335335940122604, 0.06793031096458435, 0.04162413999438286, 0.05852324888110161, -0.0018373839557170868, 0.009687719866633415, -0.005393095314502716, -0.024640198796987534, -0.06877202540636063, -0.07163143157958984, 0.053935859352350235, -0.10080814361572266, -0.01622006669640541, -0.007741634733974934, -0.025398971512913704, 0.10018011182546616, -0.04808378592133522, 0.0722232237458229, -0.1775069236755371, -0.05550471693277359, 0.10090100020170212, -0.02227621339261532, -0.1139618307352066, 0.10039671510457993, -0.038510579615831375, -0.13983923196792603, 0.1352134644985199, -0.008774815127253532, 0.07939626276493073, -0.04137957841157913, 0.06710754334926605, -0.08604609221220016, -0.1167633906006813, -0.018936313688755035, 0.11744292825460434, -0.3241819441318512, 0.21205544471740723, 0.06118069961667061, 0.07951639592647552, -0.0843876525759697, -0.01753288134932518, 0.0662464126944542, 0.18737974762916565, 0.16648750007152557, 0.0647876039147377, -0.0669865682721138, -0.025640642270445824, -0.014204313047230244, 0.020365355536341667, 0.07366499304771423, -0.05322200059890747, 0.029174109920859337, -0.034139618277549744, 0.03278201445937157, -0.06286032497882843, 0.057465117424726486, -0.034461989998817444, -0.17286601662635803, 0.024833211675286293, 0.09833656996488571, 0.04741953685879707, -0.007833204232156277, -0.07686550915241241, -0.05194646120071411, 0.0981326475739479, -0.058657899498939514, -0.16115312278270721, -0.05174848064780235, -0.04084604233503342, -0.018875498324632645, -0.08982260525226593, -0.016404667869210243, 0.013067230582237244, 0.03773649409413338, -0.11104734241962433, -0.15672965347766876, 0.02047325111925602, -0.0759056881070137, -0.059682950377464294, 0.010470570996403694, 0.23018193244934082, -0.024603448808193207, 0.03291986510157585, 0.019940998405218124, 0.009749475866556168, -0.09842883795499802, -0.10728292167186737, -0.008711711503565311, -0.0064998515881598, 0.08154641091823578, -0.03487903252243996, 0.06675314158201218, -0.020337730646133423, -0.09475446492433548, -0.14291194081306458, 0.14841003715991974, 0.17990145087242126, 0.043643634766340256, 0.07005774974822998, 0.1122753769159317, -0.056166570633649826, -0.24273452162742615, -0.10086700320243835, -0.09822042286396027, -0.03614751249551773, -0.03514498099684715, -0.1338638961315155, 0.07520997524261475, -0.06711344420909882, 0.022968394681811333, 0.12968479096889496, -0.2419639527797699, -0.06260567903518677, 0.17415539920330048, -0.032431744039058685, 0.2952966094017029, -0.12576600909233093, -0.049266938120126724, -0.058958522975444794, -0.077144555747509, 0.09298839420080185, -0.13618257641792297, 0.08691844344139099, -0.02121126838028431, 0.136260524392128, -0.005957395303994417, -0.0022066147066652775, 0.0922851637005806, 0.034125279635190964, 0.032028064131736755, -0.10975252836942673, -0.06630844622850418, 0.12365923821926117, 0.03887831047177315, -0.056486137211322784, -0.13209174573421478, -0.004209102131426334, -0.11477109789848328, -0.06319215148687363, -0.04120989516377449, 0.09719569981098175, 0.002702727448195219, 
-0.16132859885692596, -0.02688649669289589, 0.05225837603211403, -0.04059530794620514, -0.014323590323328972, 0.19303712248802185, -0.013575609773397446, 0.11861943453550339, -0.13187867403030396, 0.10357807576656342, -0.11872895807027817, -0.018949849531054497, -0.0811440572142601, -0.043010976165533066, 0.10079169273376465, -0.11921330541372299, -0.03708379715681076, 0.09371745586395264, -0.03216074779629707, 0.006593844387680292, 0.05333288386464119, -0.027034549042582512, 0.012275082059204578, 0.06454456597566605, -0.22364076972007751, -0.06349149346351624, -0.10929887741804123, 0.04746353253722191, 0.033629920333623886, 0.1257825493812561, 0.17934346199035645, -0.05097099766135216, -0.057872872799634933, -0.03757355734705925, -0.03536687046289444, -0.00633521331474185, 0.057771679013967514, -0.04927018657326698, -0.016500815749168396, -0.13171203434467316, 0.037393152713775635, 0.07499326020479202, -0.10424415022134781, 0.016262678429484367, 0.08274315297603607, -0.11795736104249954, -0.08737055212259293, -0.0319012813270092, 0.14776697754859924, -0.11557935923337936, -0.039509642869234085, -0.010734503157436848, -0.10988423973321915, 0.022452952340245247, 0.13264498114585876, 0.08782398700714111, 0.056555476039648056, -0.06052110716700554, -0.005540627054870129, 0.0029715457931160927, 0.022663870826363564, 0.02757764235138893, 0.009236491285264492, -0.10833700746297836, 0.1584581732749939, -0.050481121987104416, 0.12403413653373718, -0.025774355977773666, 0.008809174410998821, -0.08234265446662903, 0.050583623349666595, -0.0036186312790960073, 0.03180630877614021, -0.09194092452526093, 0.009231707081198692, -0.03711908683180809, 0.016541216522455215, -0.14026980102062225, -0.010668570175766945, -0.0523652583360672, 0.04871378093957901, 0.0337589755654335, 0.06464110314846039, -0.04908337444067001, -0.006349526345729828, 0.034208424389362335, -0.035727694630622864, 0.09547481685876846, 0.08847785741090775, 0.055911000818014145, 0.1801902800798416, -0.05056476965546608, -0.01792083866894245, 0.014379981905221939, 0.05343194678425789, -0.024475762620568275, 0.10303572565317154, -0.011492371559143066, 0.07085118442773819, 0.02081216499209404, 0.08435898274183273, -0.0694953128695488, -0.03715578839182854, 0.01727743074297905, 0.039047494530677795, -0.03461084142327309, -0.013192557729780674, -0.004823829513043165, 0.028218260034918785, -0.01379604171961546, 0.1541759967803955, -0.11506807059049606, 0.00710575096309185, -0.09194090217351913, 0.05313585326075554, 0.0012879109708592296, -0.1676369607448578, -0.06814625859260559, -0.05305938795208931, 0.03129122406244278, 0.01908082887530327, 0.23699459433555603, 0.06542308628559113, 0.0041618007235229015, 0.0320974737405777, 0.1428481638431549, 0.05413799360394478, -0.09364263713359833, 0.22824805974960327, 0.09101429581642151, -0.03987276554107666, -0.08392442017793655, -0.033000338822603226, 0.03987368196249008, 0.07998844981193542, 0.11898788809776306, 0.036280788481235504, 0.08941523730754852, 0.04311053082346916, -0.1366831660270691, -0.004483819007873535, -0.08603575825691223, -0.24532341957092285, 0.07384207099676132, 0.0038670538924634457, -0.00032835963065735996, 0.1069030687212944, 0.20516246557235718, 0.0011495365761220455, -0.022378064692020416, -0.0767231434583664, -0.04171105474233627, -0.18622037768363953, -0.22030781209468842, -0.05130068212747574, -0.09401580691337585, 0.05274442955851555, -0.08131870627403259, 0.035738907754421234, 0.09379054605960846, 0.08348240703344345, -0.08504483848810196, 0.12580525875091553, 
0.019122403115034103, -0.14314348995685577, 0.03378429263830185, -0.037209637463092804, 0.039963994175195694, -0.0336558073759079, -0.049632634967565536, -0.07476890087127686, -0.12613044679164886, -0.0076949018985033035, -0.025855891406536102, 0.03850680589675903, 0.03959246352314949, -0.09520743042230606, -0.054115284234285355, -0.06266364455223083, 0.01760573498904705, 0.0437622144818306, 0.10492333024740219, -0.005904471036046743, -0.00986285787075758, 0.0686035007238388, 0.22586821019649506, -0.0005039421957917511, -0.19645312428474426, 0.009353573434054852, 0.1500018984079361, 0.028627339750528336, 0.03950467333197594, -0.00730988709256053, 0.003162156092002988, 0.03457774221897125, 0.24588122963905334, 0.2786824405193329, -0.019749652594327927, 0.025726595893502235, -0.017836159095168114, 0.013109390623867512, 0.00809598807245493, 0.08532445132732391, 0.027369847521185875, 0.1358235478401184, -0.09748639166355133, -0.0441819466650486, -0.02492433786392212, -0.00026538994279690087, -0.07546284049749374, 0.14504477381706238, 0.010305344127118587, 0.012386085465550423, 0.02713898941874504, 0.06506093591451645, -0.11245977133512497, 0.057263635098934174, -0.12307119369506836, -0.17413491010665894, -0.0675610825419426, 0.018188316375017166, 0.15092121064662933, 0.049983371049165726, 0.07729388028383255, 0.02166021429002285, -0.024821018800139427, -0.05102954059839249, 0.032042764127254486, -0.20718437433242798, -0.07755466550588608, 0.14433683454990387, -0.04921405017375946, 0.14943289756774902, -0.035289764404296875, 0.005752847529947758, 0.036108970642089844, 0.022770464420318604, 0.004573888145387173, 0.11942005157470703, 0.029040640220046043, 0.010325787588953972, 0.05022048577666283, 0.020601240918040276, -0.0581883043050766, -0.04023819789290428, 0.05758414417505264, -0.022488735616207123, 0.03840970620512962, 0.22737635672092438, -0.03168005496263504, -0.058615487068891525, 0.0975910946726799, -0.09791579097509384, 0.07897999882698059, 0.032854314893484116, -0.01311227586120367, 0.002543419599533081, -0.039374131709337234, 0.0250981617718935, 0.0791415348649025, 0.04657025635242462, 0.031524792313575745, -0.05635363608598709, -0.08336637914180756, -0.041082557290792465, -0.023038974031805992, -0.1683196723461151, 0.0014885986456647515, -0.10161753743886948, 0.0375983864068985, -0.19330348074436188, 0.04203241690993309, 0.0353463813662529, -0.04371963441371918, 0.020810481160879135, -0.044766899198293686, 0.008197912946343422, -0.006189333274960518, -0.07483017444610596, -0.11827214062213898 ]
null
null
transformers
# hewiki-articles-distilGPT2py-il

## A tiny GPT2 model for generating Hebrew text

A distilGPT2 sized model. <br>
Training data was hewiki-20200701-pages-articles-multistream.xml.bz2 from https://dumps.wikimedia.org/hewiki/20200701/ <br>
XML has been converted to plain text using Wikipedia Extractor http://medialab.di.unipi.it/wiki/Wikipedia_Extractor <br>
I then added <|startoftext|> and <|endoftext|> markers and deleted empty lines. <br>

#### How to use

```python
import torch
import torch.nn as nn
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("Norod78/hewiki-articles-distilGPT2py-il")
model = GPT2LMHeadModel.from_pretrained("Norod78/hewiki-articles-distilGPT2py-il").eval()

bos_token = tokenizer.bos_token  # Beginning of sentence
eos_token = tokenizer.eos_token  # End of sentence

def generate_word(model, tokens_tensor, temperature=1.0):
    """
    Sample a word given a tensor of tokens of previous words from a model.
    Given the words we have, sample a plausible word. Temperature is used for controlling randomness.
    If temperature==0 we simply take the greedy arg max. Otherwise, we sample from a multinomial
    distribution over the temperature-scaled logits, which allows for more randomness and helps
    escape repetitions.
    """
    with torch.no_grad():
        outputs = model(tokens_tensor)
        predictions = outputs[0]

    if temperature > 0:
        # Make the distribution more or less skewed based on the temperature
        predictions = outputs[0] / temperature
        # Sample from the distribution
        softmax = nn.Softmax(dim=0)
        predicted_index = torch.multinomial(softmax(predictions[0, -1, :]), 1).item()
    else:
        # Simply take the arg-max of the distribution
        predicted_index = torch.argmax(predictions[0, -1, :]).item()

    # Decode the encoding to the corresponding word
    predicted_text = tokenizer.decode([predicted_index])
    return predicted_text

def generate_sentence(model, tokenizer, initial_text, temperature=1.0):
    """
    Generate a sentence given some initial text using a model and a tokenizer.
    Returns the new sentence.
    """
    # Text generated so far
    text = ""
    sentence = text
    # We avoid an infinite loop by setting a maximum range
    for i in range(0, 84):
        indexed_tokens = tokenizer.encode(initial_text + text)
        # Convert indexed tokens into a PyTorch tensor
        tokens_tensor = torch.tensor([indexed_tokens])
        new_word = generate_word(model, tokens_tensor, temperature=temperature)

        # The temperature is slowly increased with each generated word and capped at 0.996,
        # so sampling gradually loosens relative to the initial temperature while staying below 1.0.
        if temperature < (1 - 0.008):
            temperature += 0.008
        else:
            temperature = 0.996

        text = text + new_word

        # Stop generating new words when we have reached the end of the line or the poem
        if eos_token in new_word:
            # returns the new sentence and whether the poem is done
            return (text.replace(eos_token, "").strip(), True)
        elif '/' in new_word:
            return (text.strip(), False)
        elif bos_token in new_word:
            return (text.replace(bos_token, "").strip(), False)

    return (text, True)

for output_num in range(1, 5):
    init_text = "בוקר טוב"
    text = bos_token + init_text
    for i in range(0, 84):
        sentence = generate_sentence(model, tokenizer, text, temperature=0.9)
        text = init_text + sentence[0]
        print(text)
        if sentence[1]:
            break
```
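For readers who just want to try the checkpoint without the manual sampling loop above, the following sketch drives the same model through the high-level transformers pipeline API. The generation arguments are illustrative and not part of the original card; the prompt is one of the widget examples from the card metadata.

```python
# Sketch: the same checkpoint via the text-generation pipeline (illustrative settings).
from transformers import pipeline

generator = pipeline("text-generation", model="Norod78/hewiki-articles-distilGPT2py-il")

prompt = "<|startoftext|>ראש הממשלה בן גוריון"  # widget prompt from the card metadata
results = generator(prompt, max_length=100, do_sample=True, temperature=0.9, num_return_sequences=2)
for result in results:
    print(result["generated_text"])
```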
{"language": "he", "license": "mit", "thumbnail": "https://avatars1.githubusercontent.com/u/3617152?norod.jpg", "widget": [{"text": "<|startoftext|>\u05d4\u05d7\u05d5\u05e7 \u05d4\u05e9\u05e0\u05d9 \u05e9\u05dc \u05de\u05d5\u05e2\u05d3\u05d5\u05df \u05e7\u05e8\u05d1 \u05d4\u05d5\u05d0"}, {"text": "<|startoftext|>\u05e8\u05d0\u05e9 \u05d4\u05de\u05de\u05e9\u05dc\u05d4 \u05d1\u05df \u05d2\u05d5\u05e8\u05d9\u05d5\u05df"}, {"text": "<|startoftext|>\u05dc\u05de\u05d9\u05d3\u05ea \u05de\u05db\u05d5\u05e0\u05d4 (\u05e1\u05e8\u05d8)"}, {"text": "<|startoftext|>\u05de\u05e0\u05e9\u05d4 \u05e4\u05d5\u05de\u05e4\u05e8\u05e0\u05d9\u05e7\u05dc"}, {"text": "<|startoftext|>\u05d0\u05d9 \u05e9\u05d5\u05d5\u05d9\u05d5\u05df "}]}
text-generation
Norod78/hewiki-articles-distilGPT2py-il
[ "transformers", "pytorch", "tf", "jax", "safetensors", "gpt2", "text-generation", "he", "license:mit", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "he" ]
TAGS #transformers #pytorch #tf #jax #safetensors #gpt2 #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# hewiki-articles-distilGPT2py-il ## A tiny GPT2 model for generating Hebrew text A distilGPT2 sized model. <br> Training data was URL.bz2 from URL <br> XML has been converted to plain text using Wikipedia Extractor URL <br> I then added <|startoftext|> and <|endoftext|> markers and deleted empty lines. <br> #### How to use
[ "# hewiki-articles-distilGPT2py-il", "## A tiny GPT2 model for generating Hebrew text\n\nA distilGPT2 sized model. <br>\nTraining data was URL.bz2 from URL <br>\nXML has been converted to plain text using Wikipedia Extractor URL <br>\nI then added <|startoftext|> and <|endoftext|> markers and deleted empty lines. <br>", "#### How to use" ]
[ "TAGS\n#transformers #pytorch #tf #jax #safetensors #gpt2 #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# hewiki-articles-distilGPT2py-il", "## A tiny GPT2 model for generating Hebrew text\n\nA distilGPT2 sized model. <br>\nTraining data was URL.bz2 from URL <br>\nXML has been converted to plain text using Wikipedia Extractor URL <br>\nI then added <|startoftext|> and <|endoftext|> markers and deleted empty lines. <br>", "#### How to use" ]
[ 65, 15, 85, 5 ]
[ "passage: TAGS\n#transformers #pytorch #tf #jax #safetensors #gpt2 #text-generation #he #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# hewiki-articles-distilGPT2py-il## A tiny GPT2 model for generating Hebrew text\n\nA distilGPT2 sized model. <br>\nTraining data was URL.bz2 from URL <br>\nXML has been converted to plain text using Wikipedia Extractor URL <br>\nI then added <|startoftext|> and <|endoftext|> markers and deleted empty lines. <br>#### How to use" ]
[ -0.09180368483066559, 0.03675542026758194, -0.0014716375153511763, 0.09212670475244522, 0.1569482386112213, 0.009634044952690601, 0.20884586870670319, 0.14053979516029358, 0.039453860372304916, -0.021118098869919777, 0.1546362340450287, 0.07564833760261536, 0.017023460939526558, 0.10303780436515808, -0.04789156839251518, -0.1700480580329895, 0.0010546358535066247, 0.0298895426094532, 0.05475439503788948, 0.11366956681013107, 0.09824514389038086, -0.03393402323126793, 0.10736487060785294, -0.014310206286609173, -0.12896405160427094, 0.0289186779409647, 0.04840858280658722, -0.06954224407672882, 0.11219385266304016, 0.059839457273483276, -0.004914001561701298, 0.03363705053925514, 0.01971563696861267, -0.16869953274726868, 0.041457273066043854, 0.041441913694143295, -0.01846294477581978, 0.060586195439100266, 0.044756121933460236, -0.052366867661476135, 0.1308036893606186, -0.07425505667924881, -0.10117042809724808, 0.006889819633215666, -0.1077042818069458, -0.09016149491071701, -0.10217322409152985, 0.12285131961107254, 0.05753876641392708, 0.07973000407218933, 0.010502966120839119, 0.06706433743238449, -0.002822618931531906, 0.1119893416762352, 0.22357559204101562, -0.2874920070171356, -0.009559309110045433, 0.036608487367630005, -0.05309031531214714, 0.07199768722057343, -0.03841492906212807, 0.07501929998397827, 0.019084913656115532, 0.029677480459213257, 0.01384417712688446, -0.05185561999678612, -0.11336714774370193, 0.033969540148973465, -0.05938687548041344, -0.06769292801618576, 0.20594792068004608, -0.005391926970332861, -0.026343317702412605, -0.059329137206077576, -0.08131290227174759, -0.015680793672800064, -0.034761372953653336, 0.05292071774601936, -0.0621182918548584, 0.00520958099514246, 0.004710138309746981, -0.19344837963581085, -0.1272493153810501, -0.028785351663827896, -0.054451603442430496, 0.17963837087154388, 0.020257942378520966, 0.06968893855810165, -0.08308087289333344, 0.12397538125514984, -0.001672897720709443, -0.11058315634727478, -0.05265329033136368, -0.05033637955784798, 0.13739760220050812, -0.04406493157148361, -0.026501713320612907, -0.10847631096839905, 0.04985273256897926, 0.13869072496891022, -0.018275685608386993, 0.01897869072854519, -0.010007601231336594, 0.0731489434838295, -0.0676427036523819, 0.09836458414793015, -0.026223117485642433, -0.059313371777534485, 0.16846154630184174, 0.009262408129870892, 0.03926117345690727, -0.004734278190881014, -0.1518803834915161, -0.09750071167945862, 0.09254969656467438, 0.07103648036718369, -0.014067434705793858, 0.14037694036960602, 0.0533088743686676, -0.027513466775417328, 0.04204430431127548, -0.15213118493556976, -0.029819529503583908, 0.009806694462895393, 0.003514214651659131, -0.028590677306056023, 0.11464860290288925, -0.055178675800561905, -0.13078303635120392, 0.01563205011188984, -0.012004461139440536, 0.04387035220861435, -0.005708852782845497, -0.08607879281044006, -0.020127059891819954, -0.057939328253269196, 0.02692953124642372, -0.20178425312042236, -0.2892155349254608, 0.054284293204545975, 0.04687589406967163, 0.043118689209222794, 0.05839140713214874, -0.059493888169527054, 0.020548595115542412, -0.021707946434617043, -0.06570307165384293, -0.022382864728569984, -0.038330450654029846, 0.0939415767788887, -0.028098605573177338, 0.06311744451522827, -0.18730874359607697, 0.03156130760908127, -0.12938712537288666, 0.0037276125513017178, -0.08846456557512283, 0.09538275748491287, -0.011847637593746185, -0.02788681909441948, -0.1092718318104744, -0.03189707174897194, -0.014855763874948025, 
-0.01853388175368309, 0.07193364948034286, 0.21021196246147156, -0.12026901543140411, -0.07123098522424698, 0.20477421581745148, -0.08777813613414764, -0.17198888957500458, 0.11700399219989777, -0.05757366120815277, 0.20422320067882538, 0.13301971554756165, 0.2025277465581894, 0.029984278604388237, -0.023659998551011086, -0.02888425812125206, 0.10951746255159378, -0.006803473923355341, 0.02473117597401142, -0.0016956819454208016, -0.02911948971450329, -0.08596278727054596, 0.03536248207092285, -0.1229291558265686, 0.10128341615200043, -0.006736880633980036, -0.07386299967765808, -0.009172319434583187, -0.02027263678610325, 0.14201441407203674, -0.024617018178105354, 0.09232732653617859, -0.004636966623365879, -0.06587809324264526, 0.031365349888801575, 0.06089369207620621, -0.11748403310775757, 0.10860224813222885, -0.06305866688489914, 0.11617232114076614, -0.1386238932609558, 0.036041099578142166, -0.11363495886325836, -0.021748371422290802, -0.059684012085199356, 0.10513880848884583, 0.053281594067811966, -0.08889209479093552, 0.09680156409740448, 0.0030625094659626484, -0.06498205661773682, -0.034501150250434875, 0.03366165608167648, 0.032215315848588943, -0.09634276479482651, -0.09141707420349121, -0.04972650855779648, 0.006830924656242132, 0.0022401176393032074, -0.09711132943630219, 0.020291436463594437, -0.020387783646583557, 0.0219095591455698, 0.03879256173968315, 0.015195096842944622, -0.023933904245495796, -0.020979272201657295, -0.05733368918299675, -0.13552133738994598, 0.05031357333064079, 0.027556108310818672, -0.10525184869766235, 0.05476318299770355, -0.04673952981829643, 0.058118052780628204, 0.1507083624601364, -0.03262145817279816, -0.06301876902580261, -0.016756350174546242, -0.016023164615035057, 0.01267378218472004, 0.013775967061519623, -0.025184419006109238, 0.1120046004652977, 0.04827570170164108, 0.14553067088127136, -0.0730750635266304, -0.023950429633259773, -0.016860228031873703, -0.0836935043334961, 0.06501763314008713, 0.025421470403671265, 0.09583951532840729, -0.2401684820652008, 0.1419571042060852, 0.16378071904182434, -0.01629900559782982, 0.20134663581848145, 0.026782071217894554, -0.05473896116018295, 0.04876081272959709, 0.07351062446832657, 0.01184197049587965, 0.05272399261593819, -0.11255568265914917, -0.010600969195365906, 0.03122987411916256, -0.0047753117978572845, 0.09543315321207047, -0.09962724894285202, 0.00023873636382631958, 0.018451541662216187, -0.06717900186777115, 0.03010552190244198, 0.06782182306051254, -0.06612599641084671, 0.07191064953804016, 0.015113716013729572, 0.01691441424190998, 0.09365927428007126, 0.0404086671769619, -0.1170731782913208, 0.1745768040418625, -0.06167547032237053, -0.19689756631851196, -0.08470515161752701, -0.17504382133483887, -0.09994994848966599, 0.04564111307263374, 0.08420819789171219, -0.06063306704163551, -0.026303615421056747, -0.05515832453966141, 0.04119793698191643, 0.030326778069138527, 0.005550111178308725, -0.04012128710746765, -0.01555303018540144, -0.059062130749225616, -0.11365441232919693, -0.020174454897642136, 0.02784763090312481, -0.06777439266443253, 0.08792169392108917, -0.0940336361527443, 0.009803625755012035, 0.06675402820110321, -0.027910802513360977, 0.06571982055902481, -0.03741999343037605, 0.20500338077545166, -0.05703581124544144, 0.06337942183017731, 0.17780256271362305, -0.05626346170902252, 0.039383336901664734, 0.07503187656402588, 0.007432443089783192, -0.037621334195137024, -0.009065556339919567, -0.04215219244360924, -0.05817999690771103, -0.19791653752326965, 
-0.08372901380062103, -0.09070782363414764, 0.03857624530792236, 0.08940023928880692, 0.05025104433298111, 0.02719130739569664, 0.13110139966011047, -0.10890162736177444, 0.17347005009651184, 0.02822301536798477, 0.10665731132030487, 0.03839059919118881, 0.012584703043103218, 0.02116508036851883, -0.054528236389160156, -0.054918818175792694, 0.07992184162139893, 0.1178213581442833, 0.16465558111667633, -0.026539696380496025, 0.12769627571105957, 0.041396018117666245, 0.06772620975971222, 0.0003224005049560219, 0.1392257660627365, -0.04777051508426666, 0.02545107714831829, -0.00698607973754406, -0.09518828988075256, -0.03305227309465408, 0.03883545845746994, -0.13748662173748016, -0.01003444567322731, -0.004151485860347748, 0.011899844743311405, 0.095413938164711, 0.11159896850585938, 0.07445754110813141, -0.21699558198451996, -0.09804059565067291, 0.03455211967229843, -0.0012428179616108537, -0.12202157080173492, -0.008439741097390652, 0.055357154458761215, -0.14059774577617645, 0.04090026393532753, -0.03620970994234085, 0.0895349383354187, -0.003321942640468478, 0.03212439641356468, 0.0024532887618988752, -0.04954301938414574, -0.06362979859113693, 0.12045840919017792, -0.281177818775177, 0.14712153375148773, 0.08429024368524551, 0.07288756966590881, -0.11572540551424026, 0.014535615220665932, 0.07329045236110687, 0.1164528876543045, 0.15974733233451843, 0.03359498828649521, 0.04899616912007332, -0.10268901288509369, -0.04073905199766159, 0.04473267123103142, 0.0657215416431427, -0.014695544727146626, 0.05274534970521927, -0.012272002175450325, 0.012583576142787933, -0.025312969461083412, 0.0355369932949543, -0.11558351665735245, -0.19655293226242065, 0.02443500980734825, 0.05994119867682457, 0.03832513093948364, -0.03150558099150658, -0.02641334757208824, 0.012983186170458794, 0.2945783734321594, -0.03601572662591934, -0.1944073885679245, -0.1052936315536499, 0.04485713690519333, 0.0030909350607544184, -0.11864707618951797, 0.043891213834285736, 0.013272249139845371, 0.006875833962112665, -0.10481975227594376, -0.1917973756790161, 0.0523102730512619, -0.059545766562223434, -0.03670733422040939, 0.0013837219448760152, 0.13690713047981262, -0.022141804918646812, -0.04949125647544861, 0.05243053287267685, -0.05151854828000069, -0.03579157218337059, -0.13211709260940552, -0.02863764949142933, 0.014560079202055931, 0.014051507227122784, 0.028324628248810768, -0.034012336283922195, -0.060596346855163574, -0.06697388738393784, -0.04446827620267868, 0.2106882631778717, 0.10177379846572876, -0.028905663639307022, 0.07917926460504532, 0.09348998218774796, -0.050200145691633224, -0.23795869946479797, -0.034946225583553314, -0.011838771402835846, -0.005625816527754068, -0.00677100382745266, -0.14209042489528656, 0.04941284656524658, -0.07309746742248535, 0.00041777509613893926, 0.2346951961517334, -0.19069701433181763, -0.1057811751961708, 0.15655259788036346, 0.07031673192977905, 0.1601061373949051, -0.07486878335475922, -0.05049910768866539, -0.08157012611627579, -0.12186014652252197, 0.10606131702661514, -0.15627767145633698, 0.09830638021230698, -0.0016833672998473048, 0.15209291875362396, 0.01892932690680027, -0.033757731318473816, 0.0681832805275917, 0.03758895397186279, 0.055110082030296326, -0.05503160133957863, -0.04599182680249214, 0.12206371873617172, -0.028211476281285286, 0.08513455837965012, -0.10998217761516571, 0.04105958715081215, -0.07102139294147491, -0.03581143915653229, -0.08547474443912506, 0.08112183958292007, 0.002800267655402422, -0.13869725167751312, -0.004495088942348957, 
0.0014663900947198272, 0.0414087176322937, 0.028796125203371048, 0.08890499174594879, -0.004682390484958887, 0.014548218809068203, 0.08454256504774094, 0.04161179065704346, -0.04290780425071716, -0.011028922162950039, -0.020508425310254097, -0.06313427537679672, 0.11232051253318787, -0.15711019933223724, 0.047999124974012375, -0.013023019768297672, 0.013294154778122902, 0.07304000854492188, 0.05135340616106987, -0.07220396399497986, 0.03353999927639961, 0.028295021504163742, -0.2359514683485031, -0.025446640327572823, -0.09909878671169281, -0.04741409793496132, -0.03375910595059395, 0.09436725825071335, 0.15296724438667297, -0.07543040812015533, -0.04631635546684265, -0.03004199080169201, -0.004238995257765055, -0.0872078686952591, 0.06987294554710388, 0.06701831519603729, -0.017907431349158287, -0.10351889580488205, -0.019936716184020042, 0.028515245765447617, 0.023138009011745453, 0.03750922158360481, 0.07136276364326477, -0.15334907174110413, -0.09888960421085358, -0.04755166918039322, 0.12619279325008392, -0.0686853900551796, -0.028706081211566925, 0.011567390523850918, -0.013634773902595043, -0.04827665165066719, 0.04700370877981186, 0.052058372646570206, 0.0512080118060112, -0.02517206408083439, -0.04071936383843422, -0.06893876194953918, 0.07532734423875809, 0.040442947298288345, 0.07078898698091507, -0.06736636906862259, 0.10899651795625687, -0.0708639994263649, 0.12557782232761383, -0.03141641616821289, 0.03674115985631943, -0.08352382481098175, 0.023454295471310616, -0.15612146258354187, 0.05556212365627289, -0.07714622467756271, 0.010277161374688148, -0.02380233258008957, 0.036618951708078384, -0.02055678330361843, 0.027265166863799095, -0.07123371958732605, -0.008254455402493477, 0.001313915941864252, -0.034583400934934616, -0.08168882876634598, -0.0046886270865798, 0.011286533437669277, -0.0752774253487587, 0.13529281318187714, 0.06949988007545471, -0.034619495272636414, 0.10220518708229065, -0.20382943749427795, -0.06401737034320831, 0.029519716277718544, 0.019090542569756508, 0.02158481255173683, 0.035142362117767334, 0.045636825263500214, 0.019078221172094345, 0.03852317854762077, 0.0061521162278950214, 0.1056162565946579, -0.0868406817317009, 0.05488516762852669, -0.12138276547193527, 0.10488803684711456, -0.05286966636776924, 0.08941900730133057, -0.012080789543688297, 0.06786554306745529, 0.0904332622885704, -0.1123223528265953, 0.06416371464729309, -0.12480565905570984, 0.027335138991475105, 0.014949407428503036, -0.07713930308818817, -0.10023444890975952, -0.04062274843454361, 0.037151653319597244, -0.028243929147720337, 0.15465697646141052, 0.038019098341464996, 0.0014178925193846226, -0.023132765665650368, 0.007663407362997532, 0.024404376745224, -0.059840261936187744, 0.22355669736862183, -0.00015843765868339688, -0.026578038930892944, -0.10936502367258072, 0.012963536195456982, 0.0354379341006279, 0.03209126740694046, 0.14167320728302002, 0.06841353327035904, -0.06980916112661362, 0.08181017637252808, -0.10499420017004013, -0.02531426213681698, -0.06315715610980988, -0.1610013097524643, 0.04907476529479027, 0.030735032632946968, -0.03801972419023514, 0.02109028771519661, 0.2244812697172165, -0.08410901576280594, 0.0016104384558275342, 0.04471389949321747, -0.06534279137849808, -0.13998480141162872, -0.24476811289787292, -0.05248651280999184, -0.06001090630888939, 0.04357721656560898, -0.06605574488639832, -0.019704744219779968, 0.008620880544185638, 0.048895213752985, -0.06549567729234695, 0.1363365203142166, 0.08223709464073181, -0.13424474000930786, 
0.06788422167301178, -0.03364057093858719, -0.005447862669825554, 0.022837286815047264, -0.03612091764807701, 0.028020745143294334, -0.0803803876042366, 0.055847980082035065, 0.025675255805253983, 0.034779153764247894, 0.08507008850574493, -0.07828624546527863, -0.08028731495141983, -0.05843846872448921, 0.06321468204259872, 0.06211584433913231, 0.17833152413368225, 0.009700662456452847, -0.026780741289258003, 0.005858801305294037, 0.1331738978624344, -0.025981733575463295, -0.15937943756580353, -0.03514073044061661, 0.1240692064166069, 0.04367785155773163, 0.01944963075220585, -0.029946044087409973, -0.08945267647504807, 0.0578635074198246, 0.23278573155403137, 0.22675827145576477, -0.04268936812877655, 0.036403387784957886, -0.05907542258501053, 0.012880295515060425, 0.04054378345608711, 0.1179356724023819, -0.039405833929777145, 0.11623168736696243, -0.06194562464952469, -0.031790271401405334, 0.0034011814277619123, -0.030946344137191772, -0.1079101637005806, 0.1389375627040863, 0.00948975421488285, -0.03398365154862404, -0.02726144716143608, 0.04363575950264931, -0.10986235737800598, -0.10737667977809906, -0.00959057081490755, -0.059555962681770325, -0.06171364337205887, 0.007646801881492138, 0.06113320589065552, 0.03715543448925018, 0.05705927684903145, -0.020887702703475952, 0.009098767302930355, 0.08889558166265488, 0.02320898324251175, -0.16181394457817078, -0.0019020563922822475, 0.07782750576734543, -0.04024757072329521, 0.07419449090957642, 0.007729534525424242, 0.08894804120063782, 0.01990717463195324, 0.03492421656847, -0.08108692616224289, 0.10553498566150665, -0.014268034137785435, 0.08556754142045975, 0.08709120750427246, 0.04221809282898903, -0.03109847567975521, -0.07162421196699142, 0.0005580330034717917, -0.05006138235330582, 0.026560289785265923, 0.15834614634513855, -0.08260980993509293, -0.09507075697183609, -0.004654393065720797, -0.0453607551753521, 0.10646989941596985, 0.05891699716448784, -0.06721723079681396, 0.02105717919766903, -0.08334501832723618, 0.012811336666345596, 0.07462561130523682, -0.02300569787621498, -0.049235861748456955, -0.05329619348049164, -0.06399483233690262, -0.026291418820619583, -0.0006543791387230158, -0.24345506727695465, 0.05482693761587143, -0.10254403203725815, -0.02751361019909382, -0.13614706695079803, 0.11104270815849304, 0.12628206610679626, 0.004998593591153622, -0.003872639499604702, 0.00972045585513115, -0.017511917278170586, 0.020378386601805687, -0.07424018532037735, -0.09963516145944595 ]
null
null
transformers
# Lelouch DialoGPT model
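The card gives no usage example. Below is a sketch of the conventional DialoGPT-style chat loop, assuming this checkpoint follows the usual DialoGPT convention of separating turns with the tokenizer's EOS token; the example user turns and `max_length` value are arbitrary illustrative choices, not documented by the author.

```python
# Sketch of a DialoGPT-style multi-turn chat loop with this checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Nova/DialoGPT-medium-Lelouch"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# A few example user turns; in an interactive app these would come from input().
user_turns = ["Hello, who are you?", "What is your plan?", "Goodbye."]

chat_history_ids = None
for user_text in user_turns:
    # Encode the user turn and terminate it with EOS, per DialoGPT convention
    new_input_ids = tokenizer.encode(user_text + tokenizer.eos_token, return_tensors="pt")
    # Append the new turn to the running conversation history
    bot_input_ids = (
        torch.cat([chat_history_ids, new_input_ids], dim=-1)
        if chat_history_ids is not None
        else new_input_ids
    )
    chat_history_ids = model.generate(
        bot_input_ids,
        max_length=1000,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Decode only the newly generated bot turn
    reply = tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)
    print(f"User: {user_text}")
    print(f"Bot:  {reply}")
```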
{"tags": ["conversational"]}
text-generation
Nova/DialoGPT-medium-Lelouch
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
#Lelouch DialoGPT model
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.009697278961539268, 0.03208012506365776, -0.007204889785498381, 0.004809224978089333, 0.16726240515708923, 0.014898733235895634, 0.09765533357858658, 0.13672804832458496, -0.007841327227652073, -0.031050153076648712, 0.14490588009357452, 0.20411323010921478, -0.006439372431486845, 0.0661218985915184, -0.07572533935308456, -0.2683109939098358, 0.05759621039032936, 0.046649303287267685, 0.016515716910362244, 0.1200079694390297, 0.08573378622531891, -0.05473608896136284, 0.08714032918214798, -0.014583407901227474, -0.150366872549057, 0.017733458429574966, 0.043394338339567184, -0.12260226160287857, 0.11910516023635864, 0.05462685227394104, 0.07063519209623337, 0.014929565601050854, -0.07541623711585999, -0.1631229966878891, 0.03031250834465027, 0.01425902172923088, -0.0594632662832737, 0.04757995903491974, 0.059961482882499695, -0.10165371745824814, 0.10819483548402786, 0.09530027210712433, -0.013078106567263603, 0.06798283755779266, -0.16849711537361145, -0.020869607105851173, -0.01446688175201416, 0.009899779222905636, 0.05550243332982063, 0.09964893013238907, -0.03413357585668564, 0.10497362166643143, -0.09214533120393753, 0.11017382889986038, 0.10932035744190216, -0.32057443261146545, -0.005767723545432091, 0.09167823940515518, 0.039358653128147125, 0.07352814823389053, -0.04467793554067612, 0.06258884817361832, 0.018015462905168533, 0.017986174672842026, -0.014015024527907372, -0.07283061742782593, -0.11612214148044586, 0.04717336222529411, -0.08668071031570435, -0.059868961572647095, 0.2244078367948532, -0.05464440956711769, 0.06881742179393768, -0.05281897634267807, -0.10522868484258652, -0.04308144748210907, -0.029833965003490448, 0.00475557055324316, -0.07660607248544693, 0.08692064881324768, 0.00869679357856512, -0.09547875821590424, -0.1376667022705078, -0.02496783249080181, -0.1776352822780609, 0.16140350699424744, 0.02465328387916088, 0.05232657864689827, -0.2027255892753601, 0.09623090922832489, 0.017906051129102707, -0.08045592904090881, 0.022091427817940712, -0.10046248883008957, 0.029131146147847176, 0.013760408386588097, -0.04754498973488808, -0.061387211084365845, 0.0843690037727356, 0.11199145019054413, -0.01731434464454651, 0.025486016646027565, -0.039331406354904175, 0.08100687712430954, 0.03553595021367073, 0.09077847748994827, 0.007288969587534666, -0.028338588774204254, 0.025842782109975815, -0.13719046115875244, -0.003647835226729512, -0.07116208970546722, -0.16572439670562744, -0.021088803187012672, 0.02994808368384838, 0.08289173990488052, 0.015449047088623047, 0.11682453751564026, -0.03272046521306038, -0.025152435526251793, 0.03602350503206253, -0.047656361013650894, -0.012649794109165668, 0.016648368909955025, 0.013163427822291851, 0.12399329990148544, -0.0022096503525972366, 0.03235051408410072, -0.13653022050857544, 0.031423524022102356, -0.06793295592069626, -0.003740974934771657, -0.03486552834510803, -0.040637075901031494, 0.009043924510478973, -0.06862333416938782, 0.003486064961180091, -0.15030112862586975, -0.15063877403736115, 0.007587034720927477, -0.007836631499230862, -0.04107699543237686, -0.06370922178030014, -0.06952770054340363, -0.013550350442528725, 0.04251532256603241, -0.07093454152345657, -0.011352915316820145, -0.06403283774852753, 0.11004766076803207, -0.03197755664587021, 0.07921615242958069, -0.11953279376029968, 0.08390819281339645, -0.11260783672332764, -0.02386913076043129, -0.060801517218351364, 0.09317506104707718, -0.0006014376995153725, 0.09549830108880997, -0.006563255097717047, -0.017931854352355003, -0.07981178909540176, 
0.06445012241601944, -0.042872510850429535, 0.21701598167419434, -0.0615808479487896, -0.11181682348251343, 0.28781595826148987, -0.052628401666879654, -0.1370542049407959, 0.11647392809391022, 0.008682746440172195, 0.05777018144726753, 0.10703510791063309, 0.19733482599258423, -0.015276194550096989, 0.004040541127324104, 0.09471915662288666, 0.11263324320316315, -0.11276852339506149, -0.033160366117954254, 0.013019153848290443, -0.04081077128648758, -0.10867965966463089, 0.04689536616206169, 0.09810488671064377, 0.07090286910533905, -0.04786505550146103, -0.03377414867281914, -0.01366397924721241, 0.0052589005790650845, 0.08885077387094498, -0.007157256826758385, 0.10962837189435959, -0.05819983780384064, -0.03796621412038803, -0.029282379895448685, -0.012126247398555279, -0.03951939567923546, 0.03137664496898651, -0.043376367539167404, 0.10821941494941711, -0.011204327456653118, 0.06364280730485916, -0.16185984015464783, -0.07691477984189987, -0.017002692446112633, 0.1581239402294159, 0.024538565427064896, 0.09859629720449448, 0.0552486926317215, -0.040398042649030685, -0.0012767292791977525, 0.012792680412530899, 0.15581141412258148, -0.022091681137681007, -0.065607450902462, -0.052166227251291275, 0.08642971515655518, -0.05641226842999458, 0.04504093527793884, -0.05937713757157326, 0.012367865070700645, 0.05064384639263153, 0.10342344641685486, -0.00018274025933351368, 0.03323284164071083, -0.008164864964783192, 0.002145637758076191, -0.058205123990774155, 0.007405933458358049, 0.10799351334571838, 0.00036868182360194623, -0.07365862280130386, 0.22074243426322937, -0.17796069383621216, 0.1765957772731781, 0.1893044263124466, -0.299345999956131, 0.017949223518371582, -0.10759581625461578, -0.04561871662735939, 0.014407722279429436, 0.05567655712366104, -0.0454222597181797, 0.1703362911939621, -0.009871348738670349, 0.18874616920948029, -0.04946064203977585, -0.04464937001466751, -0.0200483538210392, -0.05118836089968681, -0.0024189651012420654, 0.07781197130680084, 0.10685696452856064, -0.13992026448249817, 0.1964332014322281, 0.1621224284172058, 0.048237916082143784, 0.19945049285888672, 0.015346456319093704, -0.011589210480451584, 0.0909530371427536, 0.005220826715230942, -0.058739423751831055, -0.07409929484128952, -0.2594851851463318, -0.030033592134714127, 0.07992640137672424, 0.0422382652759552, 0.1212305948138237, -0.11349532753229141, -0.038956157863140106, -0.01763172075152397, -0.023146281018853188, 0.021672505885362625, 0.0914369598031044, 0.06075398623943329, 0.13201528787612915, -0.001710098935291171, -0.007300339173525572, 0.10524573177099228, 0.01783694699406624, -0.09354141354560852, 0.18308524787425995, -0.13652534782886505, -0.37097251415252686, -0.13911493122577667, -0.18057456612586975, -0.05449081212282181, 0.05712554603815079, 0.11679314076900482, -0.12011238187551498, -0.018752124160528183, 0.01578843593597412, 0.10931742936372757, -0.08449502289295197, 0.0021454424131661654, -0.06880278885364532, 0.0321490578353405, -0.10310184955596924, -0.09194442629814148, -0.055416494607925415, -0.031392451375722885, -0.08001253753900528, 0.1423761546611786, -0.10777941346168518, 0.04476889222860336, 0.20262959599494934, 0.04653622955083847, 0.05625178664922714, -0.044105201959609985, 0.19377262890338898, -0.11264272034168243, -0.01661740615963936, 0.19215328991413116, -0.048360925167798996, 0.07476246356964111, 0.1232115849852562, -0.006348740309476852, -0.08765771239995956, 0.03011748194694519, -0.02085109055042267, -0.07988511025905609, -0.23219464719295502, 
-0.13938382267951965, -0.12429051846265793, 0.09477275609970093, 0.028005298227071762, 0.056365787982940674, 0.17219258844852448, 0.06577219814062119, -0.038416244089603424, 0.006410336587578058, 0.02959546446800232, 0.08237514644861221, 0.23417828977108002, -0.06035616248846054, 0.1364797055721283, -0.03420931473374367, -0.14982740581035614, 0.08169995993375778, 0.0713929831981659, 0.10213395953178406, 0.06678459793329239, 0.0804823637008667, 0.0149586396291852, 0.06188136339187622, 0.1311223804950714, 0.08191446959972382, 0.019586285576224327, -0.02480296604335308, -0.03388110175728798, -0.025523077696561813, -0.05937909707427025, 0.040128443390131, 0.06589099019765854, -0.16763372719287872, -0.039227183908224106, -0.09338314831256866, 0.09657008945941925, 0.0873042419552803, 0.06609832495450974, -0.1842060089111328, -0.008006223477423191, 0.08488986641168594, -0.03854905813932419, -0.13727426528930664, 0.09535189718008041, 0.01523482333868742, -0.15144726634025574, 0.03139317408204079, -0.04061909019947052, 0.12188644707202911, -0.07804752141237259, 0.09809603542089462, -0.08108244836330414, -0.07448557764291763, 0.02123199962079525, 0.1261177361011505, -0.30527687072753906, 0.20240111649036407, -0.0024993624538183212, -0.06486981362104416, -0.1243603527545929, -0.0032166161108762026, 0.002410882618278265, 0.07357452809810638, 0.10519039630889893, -0.007196315098553896, 0.001897757756523788, -0.06300821900367737, -0.01829923689365387, 0.032471053302288055, 0.13080233335494995, -0.0401318334043026, -0.021158374845981598, -0.050194524228572845, -0.001653497340157628, -0.03173094615340233, -0.06934895366430283, 0.02002747356891632, -0.19509181380271912, 0.08751901984214783, 0.04166261479258537, 0.09648149460554123, 0.029994789510965347, 0.004265148192644119, -0.09651939570903778, 0.24698667228221893, -0.07148019969463348, -0.10072879493236542, -0.10919588059186935, -0.046813901513814926, 0.03569883480668068, -0.05628936365246773, 0.04309194162487984, -0.0788632407784462, 0.028997479006648064, -0.06352769583463669, -0.19235502183437347, 0.12410202622413635, -0.09027006477117538, -0.04412810131907463, -0.02371402643620968, 0.2110891044139862, -0.05598580464720726, 0.010335659608244896, 0.02930437959730625, 0.01208863127976656, -0.11645778268575668, -0.09678568691015244, 0.031018631532788277, -0.007351789623498917, 0.050603240728378296, 0.041841957718133926, -0.05915454775094986, -0.017138581722974777, -0.052199993282556534, -0.022926922887563705, 0.3496883809566498, 0.14231905341148376, -0.043836336582899094, 0.19347235560417175, 0.12347975373268127, -0.07452994585037231, -0.3159443140029907, -0.1066238060593605, -0.10937739163637161, -0.04680149629712105, -0.07012093812227249, -0.2002030611038208, 0.06474938243627548, 0.00662544509395957, -0.013415241613984108, 0.12749312818050385, -0.2561831772327423, -0.07571036368608475, 0.15906259417533875, -0.017980827018618584, 0.3745945692062378, -0.1168576180934906, -0.10926306992769241, -0.03950892388820648, -0.14175476133823395, 0.16968177258968353, -0.01989765651524067, 0.11221715062856674, -0.009765521623194218, 0.14388824999332428, 0.05548359826207161, -0.023479344323277473, 0.08544106781482697, 0.004999885335564613, -0.03290518373250961, -0.10304180532693863, -0.05676887184381485, 0.007092386484146118, 0.02477436140179634, 0.018026655539870262, -0.041834570467472076, 0.02227151393890381, -0.11731979995965958, -0.04657655209302902, -0.08982590585947037, 0.04431166127324104, 0.03899754583835602, -0.07325074821710587, -0.002380647463724017, 
-0.07165111601352692, -0.012272949330508709, 0.022334342822432518, 0.20356793701648712, -0.08029330521821976, 0.16448934376239777, 0.09239562600851059, 0.12419285625219345, -0.14376309514045715, -0.00019283240544609725, -0.0762530043721199, -0.05611240118741989, 0.07737895101308823, -0.09433035552501678, 0.058893077075481415, 0.10901971161365509, -0.04567738622426987, 0.08828683942556381, 0.10377411544322968, 0.008936077356338501, 0.003213887568563223, 0.10916902124881744, -0.2667325437068939, -0.0296600554138422, -0.07532413303852081, 0.000883326749317348, 0.09092561900615692, 0.08562852442264557, 0.18840822577476501, 0.025361526757478714, -0.04293036088347435, -0.002770674182102084, 0.028597986325621605, -0.039021048694849014, 0.051667019724845886, 0.001123449532315135, 0.01947369985282421, -0.1530752182006836, 0.072522833943367, 0.01490565575659275, -0.15215420722961426, 0.021316176280379295, 0.16572684049606323, -0.11656328290700912, -0.1283872276544571, -0.06520111113786697, 0.08313824236392975, -0.11755692958831787, -0.01578943058848381, -0.03279297426342964, -0.13145680725574493, 0.07992171496152878, 0.12629036605358124, 0.05557859688997269, 0.0972496047616005, -0.06061713397502899, -0.020469192415475845, -0.018721895292401314, -0.014099318534135818, -0.012384648434817791, -0.007667020428925753, -0.055978111922740936, 0.0590752474963665, -0.026677248999476433, 0.1425808072090149, -0.09221141785383224, -0.1037059873342514, -0.16142144799232483, 0.0374140702188015, -0.11013076454401016, -0.08825794607400894, -0.08821134269237518, -0.050188567489385605, 0.002360827289521694, -0.019856395199894905, -0.04037635400891304, -0.05829505994915962, -0.12300454825162888, 0.0338277705013752, -0.040771447122097015, 0.024727050215005875, -0.07512269169092178, 0.015856385231018066, 0.08507686108350754, -0.03285100311040878, 0.15655414760112762, 0.1450488418340683, -0.1006515845656395, 0.10741901397705078, -0.14806775748729706, -0.09138492494821548, 0.11116421222686768, 0.015329592861235142, 0.0449691042304039, 0.09723787009716034, 0.013362943194806576, 0.0635865181684494, 0.032776717096567154, 0.05308786407113075, 0.027619892731308937, -0.11959987878799438, 0.06483134627342224, -0.03626115620136261, -0.14700546860694885, -0.049338050186634064, -0.05282869189977646, 0.01647452637553215, 0.013054544106125832, 0.09622690081596375, -0.05301849544048309, 0.10698331147432327, -0.04055701196193695, 0.0346808135509491, 0.017554637044668198, -0.1730053424835205, -0.03816922754049301, -0.08538098633289337, 0.03681723028421402, 0.014741539023816586, 0.25266793370246887, 0.030072299763560295, 0.012416383251547813, 0.032671261578798294, 0.08285367488861084, 0.03899408504366875, 0.010228337720036507, 0.17482228577136993, 0.1162426546216011, -0.06621865928173065, -0.10445023328065872, 0.0729617029428482, 0.016332454979419708, 0.01286179106682539, 0.13617953658103943, 0.008365051820874214, 0.005795429926365614, 0.08649782836437225, -0.016865963116288185, 0.009968153201043606, -0.10052056610584259, -0.13426925241947174, -0.022176474332809448, 0.05151832848787308, -0.04655967652797699, 0.11727844923734665, 0.1406494379043579, -0.01806013658642769, 0.03222079202532768, -0.021771740168333054, -0.05699979141354561, -0.1683429479598999, -0.1429590880870819, -0.06883849948644638, -0.13416796922683716, 0.00897989235818386, -0.11180389672517776, 0.05395037308335304, 0.06001098081469536, 0.06750501692295074, -0.06899319589138031, 0.10220931470394135, 0.04626858979463577, -0.11440542340278625, 0.06264589726924896, 
-0.0296088308095932, 0.09430401772260666, -0.02759445086121559, -0.019505485892295837, -0.09039592742919922, 0.014574515633285046, 0.011419114656746387, 0.06245238706469536, -0.04707273095846176, 0.007463190704584122, -0.14696238934993744, -0.08972041308879852, -0.0523175448179245, 0.0718572810292244, -0.050409089773893356, 0.14282815158367157, 0.00775480642914772, -0.0170906875282526, 0.039554283022880554, 0.22787313163280487, -0.07476283609867096, -0.04778539761900902, -0.05269690603017807, 0.20717895030975342, 0.02975541539490223, 0.1171872541308403, -0.022938819602131844, -0.006106364540755749, -0.0919521227478981, 0.3764844834804535, 0.30030161142349243, -0.09031439572572708, 0.011794124729931355, 0.02137952297925949, 0.04502861574292183, 0.1316293478012085, 0.1216534823179245, 0.10318691283464432, 0.3006802201271057, -0.07452366501092911, -0.04653361067175865, -0.012629742734134197, -0.023858042433857918, -0.09059546142816544, 0.1021224707365036, 0.04839762672781944, -0.06382183730602264, -0.03313443064689636, 0.0954432487487793, -0.25862133502960205, 0.1277991235256195, -0.12311873584985733, -0.17578600347042084, -0.06654827296733856, 0.009760108776390553, 0.10465722531080246, 0.015642458572983742, 0.0946015790104866, 0.007128213066607714, -0.11252258718013763, 0.06305865943431854, 0.03397420793771744, -0.22762253880500793, 0.0006893770187161863, 0.06642123311758041, -0.07006710022687912, -0.0024247700348496437, -0.026499588042497635, 0.05657242611050606, 0.0656052976846695, 0.054629553109407425, -0.00971333310008049, 0.03816632181406021, 0.0034184439573436975, -0.0585215799510479, 0.016623929142951965, 0.05121519789099693, 0.02472509816288948, -0.09763528406620026, 0.06927435845136642, -0.1574270874261856, 0.04766253009438515, -0.0030655991286039352, -0.04124255105853081, 0.006064958870410919, 0.008823691867291927, -0.06491616368293762, 0.05165379121899605, 0.07916834205389023, -0.0016257909592241049, -0.0062433634884655476, -0.057178743183612823, -0.02632102556526661, -0.027755750343203545, -0.09291748702526093, -0.10495562851428986, -0.14682936668395996, -0.11640441417694092, 0.09368976950645447, -0.01011267676949501, -0.1848134547472, 0.022154374048113823, -0.08606051653623581, 0.08319322764873505, -0.1670055389404297, 0.08040720224380493, 0.07041648775339127, 0.013038921169936657, -0.0031511052511632442, -0.02002427540719509, 0.054132770746946335, 0.086809903383255, -0.10407156497240067, -0.07400695979595184 ]
null
null
transformers
# My Awesome Model
{"tags": ["conversational"]}
text-generation
NovaChrono/twervy
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# My Awesome Model
[ "# My Awesome Model" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# My Awesome Model" ]
[ 51, 4 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# My Awesome Model" ]
[ -0.05259015038609505, 0.05521034821867943, -0.005910294596105814, 0.017722278833389282, 0.15250112116336823, 0.02286236733198166, 0.07657632976770401, 0.09513414651155472, -0.025391526520252228, -0.047348517924547195, 0.15119488537311554, 0.19781284034252167, -0.020334534347057343, 0.101333387196064, -0.04688440263271332, -0.3143521845340729, 0.06439975649118423, 0.05463787540793419, -0.015605635941028595, 0.12023304402828217, 0.09468326717615128, -0.0530015267431736, 0.08742043375968933, -0.012155864387750626, -0.1293085366487503, -0.0027921805158257484, -0.002384399762377143, -0.10180269181728363, 0.11194873601198196, 0.033712033182382584, 0.05166437849402428, 0.0182647667825222, -0.05843055993318558, -0.139859139919281, 0.03845210000872612, -0.015005595050752163, -0.05602653697133064, 0.05648263916373253, 0.059830192476511, -0.07164353132247925, 0.1669619083404541, 0.13275989890098572, -0.04237370565533638, 0.056127581745386124, -0.17620700597763062, 0.017941240221261978, 0.01800798624753952, 0.019184142351150513, 0.05306641012430191, 0.10830496996641159, -0.03932326287031174, 0.09217294305562973, -0.11410652846097946, 0.08313368260860443, 0.07800983637571335, -0.29151955246925354, -0.025312699377536774, 0.10440942645072937, 0.06437138468027115, 0.048375632613897324, -0.013386772945523262, 0.0621674507856369, 0.02149512618780136, 0.008602659218013287, 0.02225899137556553, -0.06727100163698196, -0.05789240449666977, 0.032748885452747345, -0.0967593789100647, -0.03634428232908249, 0.19753605127334595, -0.024647634476423264, 0.053590498864650726, -0.06265407055616379, -0.11300963163375854, -0.039751436561346054, -0.050429005175828934, -0.029761891812086105, -0.05090925097465515, 0.09489558637142181, 0.004352911841124296, -0.09534718841314316, -0.13405443727970123, -0.01370926946401596, -0.1618979275226593, 0.15892250835895538, 0.012579603120684624, 0.046201955527067184, -0.19210097193717957, 0.11465331166982651, -0.03857925534248352, -0.08259090781211853, 0.030513519421219826, -0.12010065466165543, 0.03160654753446579, -0.008132083341479301, -0.019599268212914467, -0.049325279891490936, 0.061037879437208176, 0.08101806789636612, 0.018783701583743095, 0.005755073390901089, 0.018167443573474884, 0.05343452841043472, 0.05891622602939606, 0.10033947974443436, -0.02891627699136734, -0.0625043511390686, 0.0025436533614993095, -0.12051084637641907, -0.01122665498405695, -0.05357983708381653, -0.18095199763774872, 0.002246231772005558, 0.02455340512096882, 0.05192234739661217, 0.011778532527387142, 0.09955989569425583, -0.028496338054537773, -0.026898741722106934, 0.06898727267980576, 0.002862759632989764, -0.015707949176430702, -0.005368964280933142, -0.010934269987046719, 0.11485416442155838, -0.023099146783351898, 0.04774846136569977, -0.12022071331739426, 0.020393015816807747, -0.07851235568523407, -0.0019349842332303524, -0.06214260309934616, -0.04864754155278206, -0.0019346009939908981, -0.06985589861869812, 0.021118074655532837, -0.14833110570907593, -0.17990200221538544, -0.005064866971224546, 0.021302316337823868, -0.052403319627046585, -0.09162671118974686, -0.0982397273182869, -0.02586611732840538, 0.03574685752391815, -0.05873546749353409, 0.013170980848371983, -0.06884536147117615, 0.06542801111936569, 0.0029820678755640984, 0.05682007595896721, -0.14085575938224792, 0.08719147741794586, -0.12582023441791534, -0.023288866505026817, -0.061977192759513855, 0.1109607070684433, 0.024780582636594772, 0.1267160177230835, 0.004311583004891872, -0.0033308975398540497, -0.08729329705238342, 
0.08271238207817078, -0.04243258014321327, 0.22770646214485168, -0.10479787737131119, -0.08809807151556015, 0.2632525563240051, -0.05423165112733841, -0.16432519257068634, 0.10179096460342407, -0.014350244775414467, 0.12198644131422043, 0.13850919902324677, 0.16080057621002197, 0.007628654129803181, 0.03313867375254631, 0.10115300863981247, 0.08631709218025208, -0.08573295921087265, -0.0611947737634182, 0.023627014830708504, -0.011463395319879055, -0.10670105367898941, 0.046802595257759094, 0.04794782027602196, 0.08188598603010178, -0.04982871189713478, -0.028600862249732018, -0.01972118206322193, -0.044152840971946716, 0.05264130234718323, 0.007675500120967627, 0.13217447698116302, -0.03674980252981186, -0.03692879155278206, -0.023745311424136162, 0.01699630729854107, -0.03115241602063179, 0.007061392068862915, -0.05687357112765312, 0.11091547459363937, -0.03406180441379547, 0.051789235323667526, -0.16953988373279572, -0.04873261600732803, -0.02087729424238205, 0.1402055323123932, 0.04973345249891281, 0.1329866498708725, 0.06287940591573715, -0.010758201591670513, 0.00859389640390873, 0.007998145185410976, 0.13181665539741516, 0.007865442894399166, -0.07660657912492752, -0.047718439251184464, 0.09176599979400635, -0.05973208695650101, 0.06147782504558563, -0.098741315305233, -0.004747362341731787, -0.01433002483099699, 0.08674649894237518, 0.006352655589580536, 0.029382232576608658, -0.006192679051309824, 0.003654100699350238, -0.06161240115761757, 0.017873648554086685, 0.12492607533931732, -0.01421504095196724, -0.07439801841974258, 0.22084392607212067, -0.15798072516918182, 0.18006981909275055, 0.18165533244609833, -0.3081994652748108, 0.024602634832262993, -0.08860466629266739, -0.036338552832603455, 0.03426366671919823, 0.0491504967212677, -0.034147560596466064, 0.16587987542152405, -0.016766328364610672, 0.201018825173378, -0.03547777235507965, -0.01287798210978508, -0.010399105958640575, -0.03656993433833122, -0.010632630437612534, 0.09065473079681396, 0.15122920274734497, -0.1677125245332718, 0.18270380795001984, 0.1660280078649521, 0.06873020529747009, 0.17776396870613098, 0.034313347190618515, -0.006856906693428755, 0.07112615555524826, -0.022670727223157883, -0.07675548642873764, -0.049287427216768265, -0.26302891969680786, -0.027947327122092247, 0.06471601128578186, 0.04510856419801712, 0.11924877762794495, -0.10971947014331818, -0.037208184599876404, 0.010892451740801334, -0.013165894895792007, 0.02132410928606987, 0.09682225435972214, 0.01171150617301464, 0.11804302036762238, -0.021027036011219025, -0.05209195241332054, 0.0898953229188919, 0.02727191150188446, -0.0787680521607399, 0.19168277084827423, -0.10074768215417862, -0.3233809769153595, -0.11354339867830276, -0.18166927993297577, -0.017843691632151604, 0.05878754332661629, 0.08049646019935608, -0.09228580445051193, -0.02625267766416073, -0.01639235019683838, 0.0758359357714653, -0.09145816415548325, -0.015880629420280457, -0.09367848187685013, 0.034986745566129684, -0.10827737301588058, -0.07011983543634415, -0.05141967162489891, -0.03368452936410904, -0.04457031562924385, 0.13157756626605988, -0.12242637574672699, 0.06396433711051941, 0.2076517641544342, 0.06227295100688934, 0.05622440204024315, -0.0229496993124485, 0.23288212716579437, -0.10842552781105042, 0.02383521944284439, 0.1717897206544876, -0.03566030040383339, 0.0727933868765831, 0.13435456156730652, 0.006721907295286655, -0.08144525438547134, 0.03465581312775612, -0.04592517390847206, -0.08630958944559097, -0.20441576838493347, -0.14156180620193481, 
-0.12814727425575256, 0.07913564145565033, 0.03285396471619606, 0.05478321388363838, 0.15024253726005554, 0.11386489123106003, 0.007987297140061855, 0.00976672861725092, -0.006888182368129492, 0.05438044294714928, 0.17482298612594604, -0.05838097631931305, 0.10041683167219162, -0.037591226398944855, -0.1924494504928589, 0.08022978901863098, 0.04309763014316559, 0.08280511945486069, 0.07474655658006668, 0.0856199786067009, 0.013537914492189884, 0.03723837807774544, 0.10897084325551987, 0.1165735274553299, 0.031679023057222366, -0.038079675287008286, -0.04882059991359711, -0.026300756260752678, -0.03285675123333931, 0.05745977535843849, 0.07790146768093109, -0.1608346849679947, -0.06348084658384323, -0.06350091099739075, 0.07662643492221832, 0.09017108380794525, 0.11811108142137527, -0.21219493448734283, 0.01579318381845951, 0.092556893825531, -0.0494147390127182, -0.1304239183664322, 0.07402537018060684, -0.00466050673276186, -0.1397053301334381, 0.037663187831640244, -0.014095795340836048, 0.1359514445066452, -0.0778401643037796, 0.10336452722549438, -0.08307972550392151, -0.06147889420390129, 0.03632286190986633, 0.1355396956205368, -0.30774354934692383, 0.2137020230293274, -0.022472934797406197, -0.05296783149242401, -0.10508129745721817, -0.011727629229426384, 0.020913105458021164, 0.09079049527645111, 0.10090240091085434, -0.0025442070327699184, 0.0061299679800868034, -0.0345483273267746, -0.053218815475702286, 0.024456629529595375, 0.07957815378904343, -0.08542889356613159, 0.0017540202243253589, -0.02361489273607731, -0.004407065454870462, -0.032844748347997665, -0.01189463958144188, -0.011617658659815788, -0.16786961257457733, 0.06556065380573273, -0.002625665394589305, 0.11129079759120941, 0.03491498529911041, 0.0024013579823076725, -0.1009332686662674, 0.19977013766765594, 0.01796281896531582, -0.08052749931812286, -0.08830537647008896, -0.03254766762256622, 0.03660419583320618, -0.06121435388922691, 0.027481911703944206, -0.06916457414627075, 0.033381566405296326, -0.06441576033830643, -0.18325145542621613, 0.1268530637025833, -0.10945470631122589, -0.03609596937894821, -0.04321056231856346, 0.18323224782943726, -0.00929707009345293, -0.0011623724130913615, 0.05866571143269539, 0.0032208464108407497, -0.1347510665655136, -0.10740556567907333, 0.020214511081576347, -0.015275230631232262, 0.009142245166003704, 0.05559912323951721, -0.009665844030678272, 0.00045268211397342384, -0.039558928459882736, -0.023234419524669647, 0.32348164916038513, 0.10732097923755646, -0.04944206401705742, 0.17007054388523102, 0.13087597489356995, -0.0827672928571701, -0.30699312686920166, -0.10971353948116302, -0.10529600828886032, -0.026918673887848854, -0.037983208894729614, -0.19617970287799835, 0.09504909813404083, -0.03528566658496857, -0.022136637941002846, 0.11253651231527328, -0.2759084105491638, -0.0770430713891983, 0.1826775223016739, 0.003314757253974676, 0.3998824954032898, -0.10265109688043594, -0.08777514100074768, -0.06741699576377869, -0.1120782196521759, 0.2033512443304062, -0.05560711398720741, 0.08663415163755417, -0.00517998356372118, 0.15513743460178375, 0.055607251822948456, -0.02176513522863388, 0.08932057023048401, -0.005811662413179874, -0.0546204075217247, -0.1219351515173912, -0.03444604203104973, -0.009159418754279613, 0.007239421829581261, 0.03589896112680435, -0.04242607578635216, 0.01279151439666748, -0.1399589478969574, -0.045490626245737076, -0.0764620453119278, 0.024699507281184196, 0.021008269861340523, -0.0652410089969635, -0.01643640361726284, -0.03945036977529526, 
-0.012804778292775154, 0.03164318576455116, 0.15236099064350128, -0.06478006392717361, 0.1476556956768036, 0.04904455319046974, 0.15412139892578125, -0.14745712280273438, -0.02258288487792015, -0.06896031647920609, -0.05498642474412918, 0.04900865629315376, -0.10053684562444687, 0.050061121582984924, 0.1202658861875534, -0.0742902010679245, 0.0987328365445137, 0.0922594666481018, -0.01938629150390625, 0.0012483424507081509, 0.1226617842912674, -0.2489612102508545, -0.07742628455162048, -0.10509459674358368, 0.013337249867618084, 0.10138551890850067, 0.06995654851198196, 0.17304721474647522, -0.0037713919300585985, -0.036284226924180984, -0.0064643872901797295, 0.025414984673261642, -0.03540204465389252, 0.05724727362394333, -0.002706433180719614, 0.016663886606693268, -0.15213344991207123, 0.060368724167346954, -0.00024176653823815286, -0.1438901126384735, -0.013603870756924152, 0.16073721647262573, -0.11208858340978622, -0.15145981311798096, -0.007263668347150087, 0.13685113191604614, -0.13171035051345825, -0.03302847594022751, -0.03708777576684952, -0.170182466506958, 0.07439173012971878, 0.1024777740240097, 0.08549231290817261, 0.08025266975164413, -0.06620611250400543, -0.00807863101363182, -0.011656313203275204, -0.026087598875164986, 0.031810320913791656, -0.023377234116196632, -0.09044221043586731, 0.03872343525290489, -0.026654237881302834, 0.13591371476650238, -0.09607382118701935, -0.09331836551427841, -0.135749951004982, 0.039314381778240204, -0.12405620515346527, -0.08138058334589005, -0.12200927734375, -0.0591500885784626, 0.00224387738853693, -0.0001289021165575832, -0.035674065351486206, -0.06687422841787338, -0.13582271337509155, 0.04366770386695862, -0.04484611004590988, 0.0013091047294437885, -0.040241483598947525, 0.04561002552509308, 0.06766383349895477, -0.03493715822696686, 0.13722217082977295, 0.11722734570503235, -0.07864081114530563, 0.08946478366851807, -0.16657429933547974, -0.0683990865945816, 0.08854512125253677, 0.008173754438757896, 0.06165994703769684, 0.06743349134922028, 0.033807408064603806, 0.06109451875090599, 0.04151686280965805, 0.03488299250602722, 0.01739438995718956, -0.09271225333213806, 0.015541021712124348, 0.022296719253063202, -0.1294609159231186, -0.04801803454756737, -0.029226921498775482, 0.00939185917377472, 0.008117396384477615, 0.11003357172012329, -0.0426274873316288, 0.09439733624458313, -0.05888751894235611, 0.036728594452142715, 0.016222506761550903, -0.16461637616157532, -0.020102784037590027, -0.11915475130081177, 0.028684545308351517, -0.0033096212428063154, 0.25625869631767273, 0.06346847862005234, 0.020517030730843544, 0.01250078622251749, 0.08567021042108536, 0.07241600006818771, 0.02562166005373001, 0.1956365555524826, 0.10854171961545944, -0.05020022392272949, -0.12334850430488586, 0.09686340391635895, 0.034720368683338165, 0.06432123482227325, 0.13385434448719025, -0.026959087699651718, 0.002498799469321966, 0.11019360274076462, 0.011678861454129219, 0.04961980879306793, -0.09859088063240051, -0.16400282084941864, -0.00994415208697319, 0.061864156275987625, -0.04559077322483063, 0.12240655720233917, 0.11382720619440079, -0.020697353407740593, 0.03180128335952759, -0.010503606870770454, -0.05694027617573738, -0.16998925805091858, -0.1630837321281433, -0.08357038348913193, -0.11794789135456085, -0.0027763545513153076, -0.11386270076036453, 0.013879159465432167, 0.06452289968729019, 0.0604364387691021, -0.09019444137811661, 0.08891061693429947, 0.0687386617064476, -0.11843101680278778, 0.08828350901603699, 
-0.033263903111219406, 0.07249268144369125, 0.0015160300536081195, 0.003872724948450923, -0.13800905644893646, 0.032393742352724075, -0.008493867702782154, 0.04159298539161682, -0.09244006127119064, 0.022458361461758614, -0.11297028511762619, -0.07659684121608734, -0.07971972227096558, 0.05093973129987717, -0.03541257977485657, 0.1390930563211441, 0.001295371213927865, -0.035233911126852036, 0.024190181866288185, 0.22729112207889557, -0.06350252777338028, -0.030667411163449287, -0.0618741400539875, 0.21414142847061157, 0.024466563016176224, 0.10703565180301666, -0.016775688156485558, 0.019240234047174454, -0.0764411985874176, 0.3689337372779846, 0.344390869140625, -0.1225387305021286, -0.0015968306688591838, 0.031062176451086998, 0.036916591227054596, 0.11621878296136856, 0.12602226436138153, 0.057955991476774216, 0.2995031177997589, -0.08396036922931671, -0.002026971662417054, -0.02688612788915634, -0.03624163940548897, -0.04409930482506752, 0.10547586530447006, 0.06835740804672241, -0.03330419585108757, -0.027012333273887634, 0.1376710683107376, -0.2966996431350708, 0.12323499470949173, -0.15714547038078308, -0.1487535685300827, -0.06873904913663864, -0.005042468197643757, 0.08589684963226318, 0.04748665541410446, 0.1069009080529213, -0.019124338403344154, -0.08203735202550888, 0.05766449123620987, 0.0320524163544178, -0.22844897210597992, 0.011852608993649483, 0.08361081779003143, -0.06153005734086037, 0.011767351068556309, -0.017906347289681435, 0.038472190499305725, 0.07790610194206238, 0.025976579636335373, -0.032770540565252304, 0.06325861811637878, -0.005814229138195515, -0.05033424496650696, 0.04302205145359039, 0.05059972032904625, 0.017107632011175156, -0.1511564701795578, 0.07320158183574677, -0.1762860119342804, 0.0566408596932888, -0.005331212189048529, -0.04948166385293007, 0.000018263708625454456, 0.01998119056224823, -0.06808236241340637, 0.05880929157137871, 0.0952666699886322, -0.012173139490187168, -0.002317852806299925, -0.056667573750019073, 0.007662574760615826, -0.0679154172539711, -0.0747012197971344, -0.10497893393039703, -0.1338900774717331, -0.11392296850681305, 0.10846775025129318, -0.011928223073482513, -0.19833622872829437, 0.02906924858689308, -0.11258108913898468, 0.04933213070034981, -0.13360801339149475, 0.08599711954593658, 0.1282832771539688, 0.021543797105550766, -0.01265349704772234, 0.04020093381404877, 0.01591683179140091, 0.08550478518009186, -0.09200563281774521, -0.10515180230140686 ]
null
null
transformers
# Genji-JP 6B Please check our blog post for more details, samples, evaluations and more: [Blogpost](https://blog.novelai.net/data-efficient-language-transfer-with-gpt-j-45daedaaf35a) ## Model Description Genji-JP 6B is a model finetuned on our Japanese storytelling dataset based on EleutherAI's GPT-J 6B model. This particular model is trained on Japanese web novels. | Hyperparameter | Value | |-------------------|--------| | n_parameters | 6,053,381,344 | | n_layers | 28* | | d_model | 4,096 | | d_ff | 16,384 | | n_heads | 16 | | d_head | 256 | | n_ctx | 2,048 | | n_vocab | 50,400 (same tokenizer as GPT-2/3) | | position encoding | [Rotary position encodings (RoPE)](https://arxiv.org/abs/2104.09864) | | RoPE dimensions | [64](https://github.com/kingoflolz/mesh-transformer-jax/blob/f2aa66e0925de6593dcbb70e72399b97b4130482/mesh_transformer/layers.py#L223) | `*` each layer consists of one feedforward block and one self attention block The model consists of 28 layers with a model dimension of 4096, and a feedforward dimension of 16384. The model dimension is split into 16 heads, each with a dimension of 256. Rotary position encodings (RoPE) was applied to 64 dimensions of each head. The model is trained with a tokenization vocabulary of 50257, using the same set of BPEs as GPT-2/GPT-3. ## Training data GPT-J 6B was pretrained on the [Pile](pile.eleuther.ai), a large scale curated dataset created by EleutherAI for the purpose of training this model. After the pre-training, it's finetuned on our Japanese storytelling dataset. Check our blog post for more details. ### How to use ``` from transformers import AutoTokenizer, AutoModelForCausalLM import torch tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B") model = AutoModelForCausalLM.from_pretrained("NovelAI/genji-jp", torch_dtype=torch.float16, low_cpu_mem_usage=True).eval().cuda() text = '''あらすじ:あなたは異世界に転生してしまいました。勇者となって、仲間を作り、異世界を冒険しよう! *** 転生すると、ある能力を手に入れていた。それは、''' tokens = tokenizer(text, return_tensors="pt").input_ids generated_tokens = model.generate(tokens.long().cuda(), use_cache=True, do_sample=True, temperature=1, top_p=0.9, repetition_penalty=1.125, min_length=1, max_length=len(tokens[0]) + 400, pad_token_id=tokenizer.eos_token_id) last_tokens = generated_tokens[0] generated_text = tokenizer.decode(last_tokens).replace("�", "") print("Generation:\n" + generated_text) ``` When run, produces output like this: ``` Generation: あらすじ:あなたは異世界に転生してしまいました。勇者となって、仲間を作り、異世界を冒険しよう! *** 転生すると、ある能力を手に入れていた。それは、『予知』だ。過去から未来のことを、誰も知らない出来事も含めて見通すことが出来る。 悪魔の欠片と呼ばれる小さな結晶を取り込んで、使役することが出来る。人を惹きつけ、堕落させる。何より、俺は男なんて居なかったし、女に興味もない。……そんなクズの片棒を担ぎ上げる奴が多くなると思うと、ちょっと苦しい。 だが、一部の人間には協力者を得ることが出来る。目立たない街にある寺の中で、常に家に引きこもっている老人。そんなヤツの魂をコントロールすることが出来るのだ。便利な能力だ。しかし、裏切り者は大勢いる。気を抜けば、狂う。だから注意が必要だ。 ――「やってやるよ」  アーロンは不敵に笑った。この ``` ## Acknowledgements This project was possible because of the compute provided by the [TPU Research Cloud](https://sites.research.google/trc/) Thanks [EleutherAI](https://eleuther.ai/) for pretraining the GPT-J 6B model. Thanks to everyone who contributed to this project! - [Finetune](https://github.com/finetuneanon) - [Aero](https://github.com/AeroScripts) - [Kurumuz](https://github.com/kurumuz)
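The hyperparameter table above lists rotary position encodings applied to 64 of the 256 dimensions in each head. As a rough illustration of what that operation does, here is a generic NumPy sketch of RoPE under those settings; the interleaved pairing convention is an assumption, and the actual mesh-transformer-jax implementation may differ in detail.

```python
import numpy as np

def apply_rope(x, positions, rotary_dims=64, base=10000.0):
    # x: (seq_len, head_dim) queries or keys for a single attention head.
    # Only the first `rotary_dims` dimensions are rotated; the rest pass through,
    # mirroring the "RoPE dimensions = 64" setting in the table.
    half = rotary_dims // 2
    freqs = base ** (-np.arange(half) / half)        # theta_i, shape (half,)
    angles = positions[:, None] * freqs[None, :]     # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x_rot, x_pass = x[:, :rotary_dims], x[:, rotary_dims:]
    x1, x2 = x_rot[:, 0::2], x_rot[:, 1::2]          # assumed interleaved pairing
    out = np.empty_like(x_rot)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return np.concatenate([out, x_pass], axis=-1)

q = np.random.randn(2048, 256).astype(np.float32)    # n_ctx = 2048, d_head = 256
q_rotated = apply_rope(q, np.arange(2048))
print(q_rotated.shape)                                # (2048, 256)
```

Each rotary pair is rotated by an angle proportional to the token position, so relative offsets between tokens show up directly in the query-key dot products.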
{"language": ["ja", "en"], "license": "apache-2.0", "tags": ["pytorch", "causal-lm"]}
text-generation
NovelAI/genji-jp
[ "transformers", "pytorch", "gptj", "text-generation", "causal-lm", "ja", "en", "arxiv:2104.09864", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2104.09864" ]
[ "ja", "en" ]
TAGS #transformers #pytorch #gptj #text-generation #causal-lm #ja #en #arxiv-2104.09864 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
Genji-JP 6B =========== Please check our blog post for more details, samples, evaluations and more: Blogpost Model Description ----------------- Genji-JP 6B is a model finetuned on our Japanese storytelling dataset based on EleutherAI's GPT-J 6B model. This particular model is trained on Japanese web novels. '\*' each layer consists of one feedforward block and one self attention block The model consists of 28 layers with a model dimension of 4096, and a feedforward dimension of 16384. The model dimension is split into 16 heads, each with a dimension of 256. Rotary position encodings (RoPE) was applied to 64 dimensions of each head. The model is trained with a tokenization vocabulary of 50257, using the same set of BPEs as GPT-2/GPT-3. Training data ------------- GPT-J 6B was pretrained on the Pile, a large scale curated dataset created by EleutherAI for the purpose of training this model. After the pre-training, it's finetuned on our Japanese storytelling dataset. Check our blog post for more details. ### How to use When run, produces output like this: Acknowledgements ---------------- This project was possible because of the compute provided by the TPU Research Cloud Thanks EleutherAI for pretraining the GPT-J 6B model. Thanks to everyone who contributed to this project! * Finetune * Aero * Kurumuz
[ "### How to use\n\n\nWhen run, produces output like this:\n\n\nAcknowledgements\n----------------\n\n\nThis project was possible because of the compute provided by the\nTPU Research Cloud\n\n\nThanks EleutherAI for pretraining the GPT-J 6B model.\n\n\nThanks to everyone who contributed to this project!\n\n\n* Finetune\n* Aero\n* Kurumuz" ]
[ "TAGS\n#transformers #pytorch #gptj #text-generation #causal-lm #ja #en #arxiv-2104.09864 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "### How to use\n\n\nWhen run, produces output like this:\n\n\nAcknowledgements\n----------------\n\n\nThis project was possible because of the compute provided by the\nTPU Research Cloud\n\n\nThanks EleutherAI for pretraining the GPT-J 6B model.\n\n\nThanks to everyone who contributed to this project!\n\n\n* Finetune\n* Aero\n* Kurumuz" ]
[ 68, 71 ]
[ "passage: TAGS\n#transformers #pytorch #gptj #text-generation #causal-lm #ja #en #arxiv-2104.09864 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n### How to use\n\n\nWhen run, produces output like this:\n\n\nAcknowledgements\n----------------\n\n\nThis project was possible because of the compute provided by the\nTPU Research Cloud\n\n\nThanks EleutherAI for pretraining the GPT-J 6B model.\n\n\nThanks to everyone who contributed to this project!\n\n\n* Finetune\n* Aero\n* Kurumuz" ]
[ -0.05668982118368149, -0.015098558738827705, 0.00003580578413675539, 0.09675341099500656, 0.07482470571994781, 0.02813628688454628, 0.0992436558008194, 0.07903452217578888, -0.05200038105249405, -0.007009000983089209, 0.19366028904914856, 0.1764100193977356, 0.005754760466516018, 0.11232306808233261, 0.008116519078612328, -0.2378089427947998, 0.037789829075336456, 0.05924413725733757, -0.007526607718318701, 0.11526435613632202, 0.13938872516155243, -0.032931797206401825, 0.1211838349699974, 0.0156297218054533, -0.13507603108882904, 0.00123754539526999, 0.001968957483768463, -0.09773267805576324, 0.14116770029067993, 0.10212327539920807, 0.004405033774673939, 0.044938042759895325, 0.0061226715333759785, -0.017934689298272133, 0.036058489233255386, 0.015251316130161285, -0.08209441602230072, 0.07086146622896194, 0.0013784003676846623, 0.053199652582407, 0.23894140124320984, 0.03395558148622513, -0.058187663555145264, -0.007120856083929539, -0.10130266845226288, -0.08387865871191025, -0.11390635371208191, 0.1099812462925911, 0.05510356277227402, 0.061023786664009094, 0.022885898128151894, 0.17909806966781616, -0.11653998494148254, 0.07011178880929947, 0.2239951342344284, -0.31211620569229126, -0.07504823803901672, 0.0777626633644104, 0.06859029084444046, -0.004552945494651794, 0.009414048865437508, 0.018302394077181816, 0.0791676938533783, -0.009622796438634396, -0.011005408130586147, -0.05742667242884636, -0.12249582260847092, 0.04015839099884033, -0.111216701567173, -0.08185405284166336, 0.29546815156936646, -0.02064795233309269, 0.0037130822893232107, 0.01579757034778595, -0.09998304396867752, 0.012402309104800224, 0.07933111488819122, -0.05018820986151695, -0.025678517296910286, 0.02526640146970749, 0.07793349027633667, -0.049100715667009354, -0.1309407502412796, -0.07269872725009918, -0.1208348274230957, 0.1935587376356125, 0.017516378313302994, 0.053967271000146866, -0.04205578938126564, 0.16523262858390808, -0.05632928013801575, -0.09934693574905396, 0.003231707261875272, -0.07817038148641586, 0.09741079062223434, 0.04378706216812134, -0.015550085343420506, -0.04352273792028427, 0.05388045310974121, 0.14218902587890625, 0.11285180598497391, -0.04416652396321297, 0.1266917586326599, 0.0813722237944603, -0.025022845715284348, 0.10936835408210754, -0.1261897087097168, -0.10226123034954071, 0.09157861024141312, -0.06478720903396606, 0.07406186312437057, -0.03567766398191452, -0.13738442957401276, -0.05896668881177902, 0.020159093663096428, 0.02191293239593506, 0.06777830421924591, 0.08135268092155457, -0.030043741688132286, -0.025672951713204384, 0.21115835011005402, -0.013102608732879162, -0.01358810719102621, -0.07193246483802795, -0.008672487922012806, -0.036008886992931366, 0.07441175729036331, 0.008044388145208359, -0.03149980306625366, 0.062003243714571, -0.06153690069913864, -0.04455619305372238, -0.05974718928337097, -0.05761481449007988, 0.030674131587147713, -0.10670984536409378, 0.058698415756225586, -0.17684312164783478, -0.10617276281118393, 0.004430289380252361, 0.06850483268499374, -0.0334906168282032, -0.14196045696735382, 0.06527547538280487, -0.03958879038691521, 0.06017430126667023, -0.04747586324810982, 0.1243918389081955, -0.06578207015991211, 0.04808260500431061, -0.0425821989774704, 0.09673089534044266, -0.1690666824579239, 0.044898759573698044, -0.0773695707321167, 0.01572558283805847, -0.09210164099931717, -0.05099327117204666, -0.042243242263793945, 0.011099458672106266, -0.0453912727534771, -0.048835910856723785, -0.0008637360297143459, -0.004173434805124998, 
0.08090566098690033, 0.08095322549343109, -0.1259799748659134, 0.008626158349215984, 0.015705382451415062, -0.08356684446334839, -0.16331425309181213, 0.10829931497573853, 0.03794929385185242, 0.09612835198640823, -0.00820021890103817, 0.10897606611251831, 0.06328804045915604, -0.06989569216966629, -0.01641959324479103, 0.06264396011829376, -0.09731350094079971, -0.10627973079681396, 0.0850340947508812, 0.07281934469938278, -0.14848478138446808, 0.03281986713409424, -0.02071308344602585, 0.08830532431602478, -0.02176040969789028, -0.03793869912624359, -0.0881287157535553, -0.06260606646537781, -0.014239651151001453, 0.010659538209438324, 0.04569403454661369, -0.034271903336048126, -0.05238283425569534, -0.10051587969064713, 0.08606017380952835, 0.02221139892935753, 0.031347014009952545, -0.05022101476788521, 0.13360509276390076, -0.10185514390468597, 0.03295344114303589, -0.07049932330846786, 0.013955295085906982, -0.00704225292429328, 0.07714595645666122, 0.010997531935572624, 0.12771037220954895, 0.023594222962856293, 0.000820219749584794, -0.025193430483341217, -0.022183898836374283, 0.05232531577348709, 0.030226297676563263, -0.09120771288871765, -0.11073309183120728, 0.020168758928775787, -0.03646696358919144, -0.04805099964141846, -0.03519583120942116, -0.006801469251513481, 0.02266116999089718, 0.06311435997486115, -0.04416878521442413, 0.0809393897652626, 0.0033472992945462465, 0.002800457878038287, -0.10419129580259323, -0.046618830412626266, 0.031161410734057426, 0.017869051545858383, -0.041725318878889084, 0.18722465634346008, -0.059606146067380905, 0.17749349772930145, 0.14173059165477753, -0.031474463641643524, -0.0015154278371483088, 0.10775227844715118, -0.06762427091598511, -0.04139377176761627, 0.05040378123521805, 0.027766551822423935, 0.0334891714155674, -0.0545060969889164, 0.06876800954341888, -0.06161065027117729, 0.006337375845760107, 0.022461501881480217, -0.05720356106758118, -0.015529877506196499, 0.09253542870283127, 0.1467811018228531, -0.11761844158172607, 0.13162857294082642, 0.26116904616355896, -0.03891662880778313, 0.106125108897686, 0.05813199654221535, -0.04266638681292534, -0.004487562924623489, -0.029794542118906975, -0.0030282868538051844, 0.14172816276550293, -0.1241835206747055, 0.004491762258112431, 0.09188757091760635, -0.0141685726121068, 0.059333302080631256, -0.13687089085578918, -0.047413941472768784, 0.0010940078645944595, 0.030419928953051567, 0.03567028418183327, 0.13003034889698029, -0.07190444320440292, 0.11565694212913513, -0.03926002234220505, -0.10473703593015671, 0.0331014022231102, 0.07073326408863068, -0.02036444842815399, 0.13229669630527496, -0.04486953094601631, -0.22174116969108582, -0.09230930358171463, -0.016321808099746704, -0.0782582014799118, 0.009158670902252197, 0.11737336963415146, -0.03383158519864082, -0.04137418791651726, -0.03802456706762314, 0.01328116562217474, 0.020459823310375214, 0.022455738857388496, -0.051428861916065216, -0.04314394295215607, -0.03681386262178421, -0.12739506363868713, -0.022043177857995033, -0.012731418944895267, -0.06543716043233871, 0.1442125290632248, -0.024388911202549934, 0.15051712095737457, 0.10975904762744904, -0.00908689759671688, -0.01659306138753891, 0.00034229832817800343, 0.22304163873195648, -0.08385935425758362, 0.07471616566181183, 0.2041480988264084, 0.013150415383279324, 0.04844866320490837, 0.11261541396379471, -0.01280270703136921, -0.05113689973950386, 0.031721051782369614, -0.06475309282541275, -0.09967900067567825, -0.17274734377861023, -0.06644664704799652, 
-0.10006017237901688, 0.0636688843369484, 0.00488685630261898, 0.07586243748664856, 0.03159995377063751, 0.14414535462856293, -0.008585629984736443, 0.12538209557533264, -0.08116985857486725, 0.0633329451084137, 0.07827389240264893, -0.0343688502907753, 0.10258790105581284, -0.07435046136379242, -0.04988737031817436, 0.15572617948055267, 0.06135321035981178, 0.10147766768932343, -0.027306819334626198, 0.002349614165723324, 0.08149898052215576, 0.2008669376373291, 0.020991923287510872, 0.14063198864459991, -0.03694469481706619, -0.02674078196287155, -0.08238009363412857, -0.02023433893918991, -0.042831532657146454, 0.07808829098939896, -0.0371038056910038, -0.04164142906665802, 0.03385019302368164, -0.007767106872051954, 0.0367623008787632, 0.127247154712677, 0.05476510152220726, -0.1748638153076172, -0.06251206994056702, 0.02549971081316471, 0.0064608268439769745, -0.08854330331087112, 0.025140462443232536, 0.04660169407725334, -0.07497099786996841, -0.005396844819188118, 0.012300598435103893, 0.08660086244344711, 0.008569721132516861, 0.016254903748631477, -0.022525573149323463, -0.021520860493183136, -0.00020717011648230255, 0.13862250745296478, -0.30997395515441895, 0.16300101578235626, -0.02367282658815384, -0.03954348713159561, -0.11650438606739044, -0.001230392837896943, 0.04160574451088905, 0.17077474296092987, 0.09457441419363022, 0.023605827242136, 0.012898867949843407, 0.009532979689538479, -0.11239969730377197, 0.05192188918590546, -0.02255193144083023, -0.032898545265197754, -0.016615279018878937, -0.004242243245244026, 0.03818488121032715, 0.010725725442171097, 0.16501638293266296, -0.14796552062034607, -0.08992277085781097, 0.1019781082868576, 0.04033662751317024, 0.03549521788954735, -0.04309235140681267, -0.06508223712444305, -0.07751739770174026, 0.15435945987701416, 0.04803978279232979, -0.039626654237508774, -0.07879404723644257, -0.09709611535072327, 0.10193033516407013, -0.10722562670707703, 0.009080147370696068, -0.038032401353120804, -0.04806051403284073, -0.01477457769215107, -0.09742017835378647, 0.1198529303073883, -0.13580282032489777, -0.12273599207401276, -0.01653463765978813, -0.007507838774472475, -0.055939849466085434, 0.03220035135746002, -0.002151170279830694, 0.003916190005838871, -0.18235798180103302, -0.1357191950082779, -0.010675568133592606, 0.010948645882308483, 0.06825751811265945, -0.0992782935500145, 0.030322369188070297, 0.03438292071223259, 0.016441909596323967, -0.09928862005472183, 0.14858384430408478, 0.15009166300296783, -0.09032509475946426, 0.10094079375267029, 0.07064507156610489, -0.032918814569711685, -0.27475279569625854, -0.1516982465982437, -0.07527733594179153, 0.009344692341983318, -0.04366372898221016, -0.06218230724334717, 0.0560280941426754, 0.039010826498270035, -0.05209087207913399, 0.10779468715190887, -0.18656356632709503, -0.10291621088981628, 0.15840354561805725, 0.037291258573532104, 0.33181247115135193, -0.09433627873659134, 0.014980136416852474, -0.06717182695865631, -0.16116835176944733, 0.08127668499946594, -0.06588716804981232, 0.0965728685259819, -0.0963607206940651, 0.030816322192549706, -0.01833755522966385, -0.0647437795996666, 0.12711061537265778, -0.08087323606014252, 0.02434946782886982, -0.14346487820148468, -0.05854935944080353, 0.04347463697195053, -0.04570821672677994, 0.08908392488956451, -0.0841948539018631, 0.04751238226890564, -0.13071459531784058, -0.02420652098953724, -0.08676783740520477, 0.06870212405920029, -0.015144525095820427, -0.10212886333465576, -0.06242835521697998, 0.0007460400811396539, 
-0.018081074580550194, -0.02543267048895359, 0.08781149983406067, 0.04009205847978592, -0.006936581339687109, 0.05088803544640541, 0.040367741137742996, -0.18833690881729126, 0.009038211777806282, 0.014291731640696526, -0.024421796202659607, 0.11185211688280106, -0.15899351239204407, -0.007494237739592791, 0.06642122566699982, -0.046534162014722824, 0.024098630994558334, 0.0693005621433258, -0.052352793514728546, 0.03876826912164688, 0.10296875983476639, -0.21305838227272034, 0.007700575981289148, -0.073513925075531, 0.030278468504548073, 0.08749621361494064, 0.07988335192203522, 0.09735803306102753, -0.10458954423666, -0.05171051621437073, -0.0054186442866921425, 0.012740050442516804, -0.07497912645339966, 0.07733047008514404, 0.09454593807458878, -0.01850023865699768, -0.08504622429609299, 0.03482728824019432, 0.036039333790540695, -0.09217078238725662, -0.042905330657958984, 0.009195427410304546, -0.09783429652452469, -0.12457859516143799, 0.0413396991789341, -0.010691836476325989, -0.12085038423538208, -0.08016125112771988, -0.03923166170716286, -0.03525248542428017, 0.07721468806266785, 0.006159518379718065, 0.08826526254415512, 0.03531661257147789, -0.02887679822742939, -0.018065715208649635, -0.015289363451302052, 0.025001484900712967, 0.017723379656672478, 0.09983228147029877, -0.14033184945583344, -0.02562340535223484, -0.005705897230654955, 0.1444445252418518, -0.06076318025588989, 0.0403619222342968, -0.11550852656364441, -0.02943195030093193, -0.06665173172950745, -0.04070965573191643, -0.09177573025226593, -0.01338671613484621, -0.010058561339974403, -0.07700324058532715, -0.05818282440304756, 0.0540955550968647, -0.09684791415929794, 0.001583061646670103, -0.02276572585105896, 0.0678514689207077, -0.0517888218164444, -0.02503669448196888, 0.08650609850883484, -0.01174849085509777, 0.12975700199604034, 0.02223587967455387, -0.023007862269878387, -0.00852882955223322, -0.09402771294116974, 0.002351837232708931, 0.04124664515256882, 0.038604553788900375, 0.0571536086499691, -0.05642002448439598, 0.010308980941772461, 0.03060022182762623, -0.007072689011693001, -0.01890401728451252, 0.045715343207120895, -0.08224648982286453, 0.0008831657469272614, -0.00801414530724287, -0.12353716790676117, 0.002952620619907975, -0.005778323393315077, 0.04432208836078644, 0.045359544456005096, 0.11210495978593826, 0.022371048107743263, -0.00024554194533266127, -0.07869032025337219, 0.01635289378464222, 0.0024311207234859467, -0.11948870867490768, -0.10520369559526443, -0.07940130680799484, -0.0007818862213753164, 0.02762090414762497, 0.1624779850244522, 0.07471518218517303, -0.1297704428434372, -0.035955775529146194, 0.11550407111644745, 0.032611194998025894, -0.049010585993528366, 0.14598716795444489, -0.027495061978697777, 0.04548973962664604, -0.10016247630119324, 0.09061136841773987, 0.02192394807934761, 0.03163466230034828, 0.06179362162947655, -0.009691519662737846, 0.07497832924127579, 0.08339184522628784, -0.05161150172352791, -0.052678659558296204, -0.12036988139152527, -0.1414167433977127, -0.04412156715989113, 0.08253514021635056, -0.040867824107408524, 0.021194513887166977, 0.1376202255487442, -0.039400745183229446, -0.0257728174328804, -0.023040635511279106, -0.015274632722139359, -0.11040326952934265, -0.1320256143808365, -0.07687754184007645, -0.131339892745018, 0.0018886225298047066, -0.0620887391269207, 0.03682105243206024, 0.10363861918449402, 0.016130579635500908, -0.03647620230913162, 0.10241460800170898, 0.05955135077238083, -0.04109090194106102, 0.006289663724601269, 
-0.0007257751422002912, -0.02033691667020321, -0.09281427413225174, 0.040994882583618164, -0.11222022771835327, -0.02549929916858673, 0.013303103856742382, 0.016204142943024635, -0.017742332071065903, 0.023392416536808014, -0.02580139972269535, 0.00044954041368328035, -0.06317248195409775, 0.023161275312304497, 0.0442158468067646, 0.09790320694446564, -0.010157301090657711, 0.04787955805659294, 0.057232681661844254, 0.1856873780488968, -0.048554323613643646, -0.12129859626293182, -0.047752127051353455, 0.1590631902217865, -0.0321538932621479, 0.026169786229729652, -0.027633201330900192, 0.00036607193760573864, 0.002231573686003685, 0.21894769370555878, 0.2366020679473877, -0.05097454786300659, 0.026313921436667442, 0.012684891931712627, 0.027087464928627014, 0.002214215463027358, 0.14521542191505432, 0.1375282257795334, 0.14590217173099518, -0.09387773275375366, -0.004952226299792528, -0.11756566166877747, 0.01658475399017334, -0.03469123691320419, 0.12562710046768188, 0.023824913427233696, -0.10735798627138138, -0.0313141755759716, 0.04340570420026779, -0.04407376050949097, 0.05374235287308693, -0.014545436948537827, -0.09965837746858597, -0.08273797482252121, 0.04527153819799423, 0.18071697652339935, -0.04169557988643646, 0.06571480631828308, -0.053945139050483704, 0.013160602189600468, 0.038008395582437515, 0.015214839950203896, -0.17293785512447357, 0.03943371772766113, 0.1108512356877327, -0.0034127088729292154, 0.16598089039325714, -0.036409202963113785, 0.07156344503164291, 0.08694480359554291, 0.04477699100971222, -0.14702551066875458, 0.05439836159348488, 0.012065423652529716, -0.09205339103937149, 0.026410045102238655, -0.06353854387998581, 0.01691700890660286, -0.1207619234919548, 0.018597206100821495, -0.07601414620876312, 0.0477612279355526, 0.13448086380958557, 0.07903659343719482, -0.07662015408277512, 0.053731609135866165, -0.10631134361028671, 0.1371060013771057, 0.0359620563685894, -0.055570345371961594, -0.03824241831898689, -0.11607544869184494, 0.0025126305408775806, -0.029826903715729713, 0.02320224605500698, -0.03334895893931389, -0.057407647371292114, -0.01148980576545, -0.013847796246409416, 0.04084252938628197, -0.06934865564107895, -0.03888343647122383, -0.060374666005373, -0.052764441817998886, -0.0718764066696167, 0.054264336824417114, 0.06482445448637009, -0.019503453746438026, -0.006883649621158838, -0.06399761885404587, -0.06900633871555328, 0.01996334083378315, -0.0833011195063591, -0.1075291559100151 ]
null
null
null
# Genji-python 6B For example usage, or to easily use the model, you can check our colab notebook: [Notebook](https://colab.research.google.com/drive/1PnWpx02IEUkY8jhLKd_NewUGEXahAska?usp=sharing) ## Model Description Genji is a transformer model finetuned on EleutherAI's GPT-J 6B model. This particular model is trained on Python-only code approaching 4GB in size. In the split model, the checkpoints are split into parts, which makes it use less system RAM while loading and makes it faster to load. This model needs more effort to set up, as you need to install git-lfs and pull the repo. | Hyperparameter | Value | |-------------------|--------| | n_parameters | 6,053,381,344 | | n_layers | 28* | | d_model | 4,096 | | d_ff | 16,384 | | n_heads | 16 | | d_head | 256 | | n_ctx | 2,048 | | n_vocab | 50,400 (same tokenizer as GPT-2/3) | | position encoding | [Rotary position encodings (RoPE)](https://arxiv.org/abs/2104.09864) | | RoPE dimensions | [64](https://github.com/kingoflolz/mesh-transformer-jax/blob/f2aa66e0925de6593dcbb70e72399b97b4130482/mesh_transformer/layers.py#L223) | `*` each layer consists of one feedforward block and one self-attention block The model consists of 28 layers with a model dimension of 4096, and a feedforward dimension of 16384. The model dimension is split into 16 heads, each with a dimension of 256. Rotary position encodings (RoPE) were applied to 64 dimensions of each head. The model is trained with a tokenization vocabulary of 50257, using the same set of BPEs as GPT-2/GPT-3. ## Training data GPT-J 6B was pretrained on the [Pile](https://pile.eleuther.ai), a large-scale curated dataset created by EleutherAI for the purpose of training this model. After the pre-training, it was finetuned on the Python code that was taken from the Pile. ## Training procedure Genji-python-6B is trained for 20k steps on around 655 million tokens with a learning rate of 2e-06. ## Intended Use This model is trained for assistance with writing Python code and for having fun trying weird stuff with it. ### How to use This model is only usable with our fork because GPT-J is not merged into the main transformers repo yet. When it's merged, we will make this model easily loadable. For now, you need to use this fork: [Fork](https://github.com/finetuneanon/transformers) To install with pip: ```bash pip install git+https://github.com/finetuneanon/transformers@gpt-neo-localattention3-rp-b ``` **git-lfs** also needs to be installed; on Ubuntu: ```bash apt install git-lfs ``` After it's installed, initialize git-lfs: ```bash git lfs install ``` Then clone this repo: ```bash git clone https://huggingface.co/NovelAI/genji-python-6B-split ``` Now we can load the model. We recommend using the model in FP16. That way, it fits in 16GB VRAM cards. 
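As a rough back-of-the-envelope sketch of why FP16 matters here (weights only; activations, buffers, and the KV cache add overhead on top), the parameter count from the table above implies roughly the following memory footprints:

```python
# Rough weight-memory estimate for a ~6B-parameter model; weights only,
# ignoring activations, optimizer state, and framework overhead.
n_params = 6_053_381_344                      # n_parameters from the table above
gib = 1024 ** 3
print(f"fp32: {n_params * 4 / gib:.1f} GiB")  # ~22.6 GiB -- does not fit on a 16GB card
print(f"fp16: {n_params * 2 / gib:.1f} GiB")  # ~11.3 GiB -- fits, with headroom left over
```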
How to use: ```python from transformers import ( AutoTokenizer, AutoModelForCausalLM, GPTNeoForCausalLM, ) model = AutoModelForCausalLM.from_pretrained("genji-python-6B-split/model").half().eval().cuda() tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-2.7B") text = '''def print_customer_name''' tokens = tokenizer(text, return_tensors="pt").input_ids generated_tokens = model.generate(tokens.long().cuda(), use_cache=True, do_sample=True, top_k=50, temperature=0.3, top_p=0.9, repetition_penalty=1.125, min_length=1, max_length=len(tokens[0]) + 400, pad_token_id=tokenizer.eos_token_id) last_tokens = generated_tokens[0][len(tokens[0]):] generated_text = tokenizer.decode(last_tokens) print("Generation:\n" + generated_text) ``` When run, this code generates: ```python Prompt: def print_customer_name Generation: (self, customer): """Print the name of a customer.""" if not self.is_valid(): return print("Customer: {}".format(customer)) ``` For example usage, you can see our colab notebook as well: [Notebook](https://colab.research.google.com/drive/1PnWpx02IEUkY8jhLKd_NewUGEXahAska?usp=sharing) ## Eval results TBD ## Acknowledgements This project was possible because of the compute provided by the [TPU Research Cloud](https://sites.research.google/trc/) and [EleutherAI](https://eleuther.ai/) for pretraining of the GPT-J 6B. Thanks to everyone who contributed to this project: - [Aero](https://github.com/AeroScripts) - [Finetune](https://github.com/finetuneanon) - [Kurumuz](https://github.com/kurumuz)
{"language": ["en"], "license": "apache-2.0", "tags": ["pytorch", "causal-lm"], "datasets": ["the Pile"]}
null
NovelAI/genji-python-6B-split
[ "pytorch", "causal-lm", "en", "arxiv:2104.09864", "license:apache-2.0", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2104.09864" ]
[ "en" ]
TAGS #pytorch #causal-lm #en #arxiv-2104.09864 #license-apache-2.0 #region-us
Genji-python 6B =============== For example usage, or to easily use the model, you can check our colab notebook: Notebook Model Description ----------------- Genji is a transformer model finetuned on EleutherAI's GPT-J 6B model. This particular model is trained on Python-only code approaching 4GB in size. In the split model, the checkpoints are split into parts, which makes it use less system RAM while loading and makes it faster to load. This model needs more effort to set up, as you need to install git-lfs and pull the repo. '\*' each layer consists of one feedforward block and one self-attention block The model consists of 28 layers with a model dimension of 4096, and a feedforward dimension of 16384. The model dimension is split into 16 heads, each with a dimension of 256. Rotary position encodings (RoPE) were applied to 64 dimensions of each head. The model is trained with a tokenization vocabulary of 50257, using the same set of BPEs as GPT-2/GPT-3. Training data ------------- GPT-J 6B was pretrained on the Pile, a large-scale curated dataset created by EleutherAI for the purpose of training this model. After the pre-training, it was finetuned on the Python code that was taken from the Pile. Training procedure ------------------ Genji-python-6B is trained for 20k steps on around 655 million tokens with a learning rate of 2e-06. Intended Use ------------ This model is trained for assistance with writing Python code and for having fun trying weird stuff with it. ### How to use This model is only usable with our fork because GPT-J is not merged into the main transformers repo yet. When it's merged, we will make this model easily loadable. For now, you need to use this fork: Fork To install with pip: git-lfs also needs to be installed; on Ubuntu: After it's installed, initialize git-lfs: Then clone this repo: Now we can load the model. We recommend using the model in FP16. That way, it fits in 16GB VRAM cards. How to use: When run, this code generates: For example usage, you can see our colab notebook as well: Notebook Eval results ------------ TBD Acknowledgements ---------------- This project was possible because of the compute provided by the TPU Research Cloud and EleutherAI for pretraining of the GPT-J 6B. Thanks to everyone who contributed to this project: * Aero * Finetune * Kurumuz
[ "### How to use\n\n\nThis model is only usable with our fork because GPT-J is not merged to the main transformers repo yet. When it's merged, we will make this model easily loadable.\nFor now, you need to use this fork:\nFork\n\n\nto install with pip:\n\n\ngit-lfs also needs to be installed, on ubuntu:\n\n\nafter it's installed, initialize git-lfs:\n\n\nthen clone this repo:\n\n\nNow we can load the model.\n\n\nWe recommend the usage of the model as FP16. That way, it fits in 16GB VRAM cards.\n\n\nHow to use:\n\n\nWhen ran, this code generates:\n\n\nFor example usage, you can see our colab notebook as well:\nNotebook\n\n\nEval results\n------------\n\n\nTBD\n\n\nAcknowledgements\n----------------\n\n\nThis project was possible because of the compute provided by the\nTPU Research Cloud and EleutherAI for pretraining of the GPT-J 6B.\n\n\nThanks to everyone who contributed to this project:\n\n\n* Aero\n* Finetune\n* Kurumuz" ]
[ "TAGS\n#pytorch #causal-lm #en #arxiv-2104.09864 #license-apache-2.0 #region-us \n", "### How to use\n\n\nThis model is only usable with our fork because GPT-J is not merged to the main transformers repo yet. When it's merged, we will make this model easily loadable.\nFor now, you need to use this fork:\nFork\n\n\nto install with pip:\n\n\ngit-lfs also needs to be installed, on ubuntu:\n\n\nafter it's installed, initialize git-lfs:\n\n\nthen clone this repo:\n\n\nNow we can load the model.\n\n\nWe recommend the usage of the model as FP16. That way, it fits in 16GB VRAM cards.\n\n\nHow to use:\n\n\nWhen ran, this code generates:\n\n\nFor example usage, you can see our colab notebook as well:\nNotebook\n\n\nEval results\n------------\n\n\nTBD\n\n\nAcknowledgements\n----------------\n\n\nThis project was possible because of the compute provided by the\nTPU Research Cloud and EleutherAI for pretraining of the GPT-J 6B.\n\n\nThanks to everyone who contributed to this project:\n\n\n* Aero\n* Finetune\n* Kurumuz" ]
[ 34, 225 ]
[ "passage: TAGS\n#pytorch #causal-lm #en #arxiv-2104.09864 #license-apache-2.0 #region-us \n### How to use\n\n\nThis model is only usable with our fork because GPT-J is not merged to the main transformers repo yet. When it's merged, we will make this model easily loadable.\nFor now, you need to use this fork:\nFork\n\n\nto install with pip:\n\n\ngit-lfs also needs to be installed, on ubuntu:\n\n\nafter it's installed, initialize git-lfs:\n\n\nthen clone this repo:\n\n\nNow we can load the model.\n\n\nWe recommend the usage of the model as FP16. That way, it fits in 16GB VRAM cards.\n\n\nHow to use:\n\n\nWhen ran, this code generates:\n\n\nFor example usage, you can see our colab notebook as well:\nNotebook\n\n\nEval results\n------------\n\n\nTBD\n\n\nAcknowledgements\n----------------\n\n\nThis project was possible because of the compute provided by the\nTPU Research Cloud and EleutherAI for pretraining of the GPT-J 6B.\n\n\nThanks to everyone who contributed to this project:\n\n\n* Aero\n* Finetune\n* Kurumuz" ]
[ -0.051404085010290146, 0.05016235634684563, -0.001415837905369699, 0.07780568301677704, 0.09218769520521164, 0.058333203196525574, 0.06260430067777634, 0.07607229053974152, 0.008082910440862179, 0.009329077787697315, 0.06762756407260895, 0.08352208882570267, 0.09021984040737152, 0.06775516271591187, 0.06654901802539825, -0.18516358733177185, -0.017150308936834335, 0.0035419429186731577, -0.023207569494843483, 0.10209468007087708, 0.10820551216602325, -0.051147621124982834, 0.08641061186790466, 0.03911719098687172, -0.11298318952322006, 0.023480452597141266, -0.029553940519690514, -0.04724879190325737, 0.12720680236816406, 0.11214699596166611, 0.04805253818631172, -0.019143568351864815, 0.07368319481611252, -0.10348725318908691, 0.03323620930314064, 0.06701013445854187, -0.03474695608019829, 0.07887604087591171, 0.0411992147564888, 0.0619765967130661, 0.21857710182666779, 0.04908519238233566, -0.009674830362200737, 0.02944152057170868, -0.0708177238702774, -0.09696042537689209, -0.13484811782836914, 0.0773036852478981, 0.08303569257259369, 0.054601818323135376, 0.034488219767808914, 0.18594832718372345, -0.004503690637648106, 0.06101463362574577, 0.1520586460828781, -0.18575362861156464, -0.08288676291704178, 0.09361650049686432, 0.048333197832107544, -0.004618666134774685, 0.010523679666221142, -0.01307199988514185, 0.06326309591531754, 0.05070578679442406, 0.009357525035738945, -0.03766436502337456, -0.09981165826320648, -0.010657447390258312, -0.13271090388298035, -0.11982094496488571, 0.23081648349761963, -0.031530726701021194, -0.039286356419324875, 0.0123726362362504, -0.10035991668701172, 0.03445465862751007, 0.07013468444347382, -0.05394637957215309, 0.034727200865745544, 0.018496783450245857, 0.15696069598197937, -0.13413137197494507, -0.10942338407039642, -0.08775458484888077, -0.014260652475059032, 0.13634473085403442, 0.06130554899573326, 0.08182991296052933, 0.0034616244956851006, 0.15420840680599213, -0.09192029386758804, -0.029036199674010277, -0.00470275804400444, -0.06548093259334564, 0.027837999165058136, 0.030018743127584457, 0.006297064945101738, -0.09296493977308273, 0.027160650119185448, 0.19441913068294525, -0.0030841855332255363, -0.02890731580555439, 0.11327557265758514, 0.053928524255752563, 0.006566249765455723, 0.10378158092498779, -0.13276807963848114, 0.007488598115742207, 0.11572360247373581, -0.07328583300113678, 0.06157630681991577, -0.00941123254597187, -0.071797214448452, -0.0029480329249054193, -0.0007098429487086833, 0.02871873788535595, 0.09065668284893036, 0.0071956319734454155, -0.03809862583875656, -0.07665438950061798, 0.270466148853302, -0.05211548134684563, -0.01142802368849516, -0.04511135071516037, -0.08207505196332932, -0.0520949512720108, 0.14629526436328888, 0.0064437673427164555, -0.07275792956352234, 0.012523047626018524, -0.04550271853804588, -0.05383705347776413, -0.09762687236070633, -0.08535169810056686, 0.045897047966718674, -0.08320021629333496, 0.04379260540008545, -0.18763120472431183, -0.17670902609825134, -0.036277610808610916, 0.07087258249521255, -0.03464676812291145, -0.07789662480354309, 0.12292615324258804, 0.006271501071751118, -0.02792852371931076, -0.014281945303082466, 0.09688661247491837, -0.03086080588400364, 0.019549202173948288, -0.04920171946287155, 0.042697276920080185, -0.09346342831850052, 0.018793610855937004, -0.04414977505803108, 0.04257531091570854, -0.16847488284111023, -0.025180883705615997, -0.05678067356348038, 0.008946803398430347, -0.10400454699993134, -0.032534632831811905, 0.04660491645336151, 
-0.047619011253118515, 0.10998886823654175, 0.05737212672829628, -0.11193796992301941, 0.04951077327132225, -0.026644248515367508, -0.12371513992547989, -0.08728157728910446, 0.09641151875257492, 0.06818453967571259, 0.004234697204083204, -0.010705937631428242, 0.0766734629869461, 0.1207146942615509, -0.12955842912197113, -0.035516589879989624, 0.038001060485839844, -0.05006992071866989, -0.08351633697748184, 0.07390035688877106, 0.041336555033922195, -0.16636991500854492, -0.004218353424221277, -0.05861080810427666, 0.07136398553848267, -0.03912365064024925, -0.031213462352752686, -0.07132516801357269, -0.08861909806728363, -0.06315816193819046, 0.023192288354039192, -0.024680206552147865, 0.012163195759057999, -0.05940111726522446, -0.10991902649402618, 0.14161311089992523, -0.023956840857863426, 0.026047062128782272, -0.03494967892765999, 0.1895405799150467, -0.10758590698242188, 0.005306032951921225, -0.07213730365037918, -0.07794006168842316, 0.07157538831233978, -0.031440529972314835, 0.0060452246107161045, 0.026330163702368736, 0.0030580523889511824, 0.08207593858242035, 0.002398478100076318, -0.01786048337817192, -0.04727695509791374, 0.004280690103769302, -0.06364543735980988, -0.042926836758852005, -0.05608690157532692, -0.010005617514252663, -0.036712612956762314, -0.06250884383916855, 0.006425016559660435, -0.015797700732946396, 0.07684989273548126, -0.05706947669386864, -0.009159241802990437, 0.040443506091833115, -0.034324437379837036, -0.03589159622788429, -0.06411663442850113, 0.023541560396552086, 0.03384862467646599, -0.0175107903778553, 0.08000769466161728, -0.04747992008924484, -0.0659518614411354, 0.06417492032051086, 0.020082730799913406, -0.028036989271640778, 0.043121229857206345, -0.06340327113866806, -0.06831086426973343, 0.014730317518115044, -0.01569822058081627, 0.14552579820156097, -0.012515338137745857, 0.06262846291065216, -0.0660417377948761, -0.024537744000554085, 0.02644902840256691, -0.040115371346473694, -0.00020821970247197896, 0.059223469346761703, 0.15225808322429657, 0.04069395735859871, 0.06140249967575073, 0.11817264556884766, -0.024128418415784836, 0.015814540907740593, 0.050925564020872116, -0.05775858089327812, -0.003965823445469141, -0.030413411557674408, -0.0037843440659344196, 0.1291828602552414, -0.04650384560227394, 0.04848238453269005, 0.07265106588602066, -0.020001880824565887, 0.06031404808163643, -0.14044329524040222, 0.01348582748323679, 0.019352521747350693, -0.023080743849277496, 0.06101938337087631, 0.03258642181754112, -0.09425006061792374, 0.05934419482946396, -0.031026942655444145, 0.002069649985060096, -0.017232991755008698, 0.015637919306755066, -0.06131402775645256, 0.14274679124355316, -0.08604943752288818, -0.1942317932844162, -0.12206070870161057, 0.10248272120952606, -0.03337549790740013, 0.007616519927978516, 0.04679910093545914, 0.017929961904883385, -0.08365961909294128, -0.0478237122297287, 0.0770455077290535, 0.03038075938820839, -0.04860711470246315, -0.06182770058512688, -0.07541832327842712, 0.00698648439720273, -0.1416962444782257, 0.014886406250298023, 0.027962813153862953, -0.10908681154251099, 0.09219485521316528, 0.03726491704583168, 0.13617944717407227, 0.060382191091775894, -0.06029936298727989, -0.047149550169706345, 0.022883163765072823, 0.25197070837020874, -0.08089309930801392, 0.1275128871202469, 0.22509697079658508, -0.007366931531578302, 0.039751749485731125, 0.016795415431261063, 0.008582798764109612, -0.05257547274231911, 0.011459372006356716, -0.03407328575849533, -0.07298643887042999, 
-0.27421265840530396, -0.08558432012796402, -0.02839045785367489, 0.02552628517150879, 0.02777790278196335, 0.07184718549251556, -0.037875112146139145, 0.12880927324295044, -0.046102043241262436, 0.09263241291046143, -0.09272482246160507, 0.05135466158390045, 0.03339258208870888, -0.023918060585856438, 0.014266258105635643, -0.030053088441491127, 0.060869377106428146, 0.16287919878959656, 0.11620239913463593, 0.09112520515918732, -0.06000771000981331, 0.08200389891862869, 0.08068432658910751, 0.1787489950656891, -0.013977523893117905, 0.13444730639457703, -0.08148138225078583, 0.03597993403673172, -0.03491503745317459, -0.01134101115167141, -0.06632537394762039, 0.04613906145095825, -0.0028387545607984066, 0.007643343415111303, 0.022747110575437546, 0.01444375328719616, 0.016642775386571884, 0.22061091661453247, -0.015028371475636959, -0.1578546017408371, -0.053686778992414474, -0.00280495616607368, -0.014895112253725529, -0.09919903427362442, 0.025920232757925987, 0.0971812903881073, -0.13111476600170135, 0.00772716524079442, 0.009871430695056915, 0.08458562940359116, -0.10870252549648285, -0.05439656227827072, 0.07170463353395462, 0.08434411883354187, 0.01410731766372919, 0.12973232567310333, -0.15866106748580933, 0.04889053851366043, 0.02078799530863762, -0.0059265755116939545, -0.08208520710468292, 0.04653727635741234, 0.01740841008722782, 0.10528376698493958, 0.11548753082752228, 0.026942448690533638, 0.09535998106002808, -0.0023881271481513977, -0.17307685315608978, 0.04236551746726036, -0.05680515617132187, -0.08219952881336212, -0.01889103092253208, 0.007184548769146204, 0.044280242174863815, -0.022300563752651215, 0.02997160330414772, -0.15610527992248535, -0.07530680298805237, 0.07463707029819489, -0.02927262894809246, -0.025507021695375443, -0.06435402482748032, -0.03167443722486496, -0.05761970579624176, 0.1079091727733612, -0.03176530450582504, -0.08424694836139679, -0.08607116341590881, -0.03821293264627457, 0.11445273458957672, -0.07735919207334518, 0.03634136915206909, -0.022672059014439583, 0.027536766603589058, -0.0548524372279644, -0.05748201534152031, 0.05877215415239334, -0.16388891637325287, -0.12982138991355896, -0.0007945403340272605, 0.018744109198451042, 0.0021259391214698553, 0.04930921271443367, 0.005925898440182209, 0.008918622508645058, -0.11258556693792343, -0.10883073508739471, -0.006516716908663511, 0.08012544363737106, 0.10942505300045013, -0.004384417552500963, 0.0166263896971941, 0.10491418093442917, -0.0028297787066549063, -0.08178646862506866, 0.08518454432487488, 0.14385826885700226, -0.06463761627674103, 0.04642675444483757, 0.1253657341003418, -0.08030150830745697, -0.2314290851354599, -0.08134178072214127, 0.005101436749100685, -0.0017896302742883563, -0.05538926646113396, -0.13411544263362885, 0.07613006234169006, 0.07467374205589294, -0.029670972377061844, 0.11719012260437012, -0.18701744079589844, -0.08069606125354767, 0.10906874388456345, 0.07823548465967178, 0.05184764415025711, -0.110598124563694, -0.02108045481145382, -0.017655527219176292, -0.1276201754808426, 0.0650644302368164, -0.11079291254281998, 0.03996006026864052, -0.06660862267017365, 0.010238701477646828, 0.004336107522249222, -0.059904854744672775, 0.13271141052246094, -0.060375239700078964, 0.01433418970555067, -0.09822219610214233, 0.033149559050798416, -0.019204409793019295, -0.04338478669524193, 0.14972633123397827, 0.04107420891523361, 0.07600384950637817, -0.09634780883789062, -0.018401993438601494, -0.06625087559223175, 0.08693552762269974, -0.030505184084177017, 
-0.09306658059358597, -0.07327818125486374, 0.01636163890361786, -0.01114443689584732, -0.01487051509320736, -0.052073728293180466, 0.07155236601829529, -0.04625090956687927, 0.1889180839061737, 0.036119211465120316, -0.06938430666923523, -0.06942632049322128, 0.02185535430908203, 0.006981435231864452, 0.107253797352314, -0.07799780368804932, -0.02882794290781021, 0.07043062895536423, -0.04562197998166084, 0.03545727953314781, -0.0013814896810799837, -0.14675931632518768, -0.0013691508211195469, 0.025382937863469124, -0.20280954241752625, -0.1072884276509285, -0.08218862116336823, 0.01892027258872986, 0.023044327273964882, -0.008040249347686768, 0.0980486050248146, -0.1238647997379303, -0.05840534716844559, 0.008716686628758907, 0.05502329766750336, -0.046974100172519684, 0.09788580983877182, 0.1343151181936264, -0.016511844471096992, -0.0663040354847908, 0.11908939480781555, 0.09536359459161758, -0.06894485652446747, -0.029277818277478218, 0.058515556156635284, -0.060887131839990616, -0.10418667644262314, -0.01884806714951992, 0.0036962581798434258, -0.0957040786743164, -0.11446598172187805, -0.006674730218946934, 0.011936555616557598, -0.002584663685411215, -0.01621590368449688, 0.042234063148498535, -0.025699758902192116, 0.011704355478286743, 0.021595140919089317, -0.06769935041666031, 0.06847421824932098, 0.007033813279122114, 0.0856507271528244, -0.10062000900506973, -0.0012361040571704507, 0.04000670090317726, 0.16255995631217957, -0.017211269587278366, 0.03675274923443794, -0.06304723024368286, -0.021295450627803802, -0.05748835206031799, 0.01605174131691456, -0.023345524445176125, 0.02661355398595333, 0.016283079981803894, -0.01788979396224022, -0.011829067021608353, 0.06958485394716263, -0.05345679819583893, -0.0067772772163152695, -0.021478703245520592, 0.0512026883661747, -0.08356276154518127, -0.03647226095199585, 0.06985941529273987, -0.01314766239374876, 0.11741691082715988, -0.013764111325144768, 0.008297941647469997, -0.022914182394742966, -0.05927693098783493, 0.050851915031671524, 0.004971402697265148, 0.04039362445473671, 0.03517433628439903, -0.1355309784412384, 0.011561433784663677, -0.01051981933414936, -0.08311863988637924, -0.07580981403589249, 0.06934001296758652, -0.11139701306819916, -0.02874085307121277, 0.048033952713012695, -0.0922609269618988, -0.02125890552997589, -0.011214506812393665, -0.0034924400970339775, 0.08192787319421768, 0.05862317234277725, 0.031334903091192245, 0.05083282291889191, -0.09951858967542648, 0.0002969808701891452, 0.009847044013440609, -0.013065716251730919, -0.05317497253417969, -0.018032867461442947, 0.023571204394102097, -0.004076671786606312, 0.13634301722049713, 0.12397211790084839, -0.009941300377249718, -0.03701150417327881, 0.09215818345546722, -0.028716908767819405, -0.04730984941124916, 0.01310376264154911, -0.07749143987894058, 0.07180196791887283, -0.053777534514665604, 0.05344805121421814, 0.03294070065021515, -0.045785706490278244, 0.060588423162698746, -0.0074679916724562645, 0.06644522398710251, 0.04989980533719063, 0.014518569223582745, -0.0747625008225441, -0.08096520602703094, -0.13297788798809052, 0.017941271886229515, 0.09707450121641159, -0.04236600175499916, 0.04999556764960289, 0.07875128835439682, -0.12461639940738678, 0.006528096739202738, -0.010434925556182861, -0.06316272914409637, -0.1089576855301857, -0.1115691289305687, -0.025808213278651237, -0.09938988089561462, -0.012630529701709747, -0.08831188082695007, 0.003188882488757372, 0.11400540918111801, -0.0051680016331374645, -0.04315033182501793, 
0.12939046323299408, 0.01891111209988594, -0.029616836458444595, -0.043894242495298386, 0.05633877217769623, 0.015441653318703175, -0.0869804099202156, -0.0009506064816378057, -0.0324660949409008, 0.0318598672747612, 0.057362012565135956, 0.021331286057829857, 0.04310740530490875, 0.06887796521186829, 0.03253404423594475, -0.03170163556933403, -0.042835772037506104, 0.0016009086975827813, 0.008753075264394283, 0.10276218503713608, 0.0011564878514036536, 0.06150589510798454, 0.024960150942206383, 0.1305571347475052, -0.030999256297945976, -0.038628727197647095, -0.07946574687957764, 0.13229678571224213, -0.08965978771448135, -0.026383347809314728, 0.05180159583687782, -0.08044598251581192, -0.00986366719007492, 0.21952256560325623, 0.15932804346084595, -0.0337340421974659, -0.01574867218732834, 0.03451477363705635, 0.005005608778446913, -0.010698717087507248, 0.1396578848361969, 0.10896418243646622, 0.07247638702392578, -0.08995159715414047, 0.05168525502085686, -0.110068678855896, 0.004868552088737488, -0.028177957981824875, 0.10741819441318512, -0.021620137616991997, -0.05253152176737785, 0.007229018956422806, 0.061356186866760254, 0.002472497057169676, -0.14755861461162567, 0.019957365468144417, -0.009218759834766388, -0.06357990950345993, 0.018905969336628914, 0.07192553579807281, -0.028008902445435524, 0.05541369691491127, -0.06485773622989655, 0.009951513260602951, 0.17451457679271698, 0.006094090174883604, -0.17387449741363525, -0.059958625584840775, 0.1270073503255844, -0.009719807654619217, 0.272535115480423, -0.022785505279898643, 0.0935555100440979, 0.04626196622848511, 0.022233959287405014, -0.19107505679130554, 0.015592977404594421, 0.04543599113821983, -0.09466201812028885, 0.028973864391446114, 0.027752121910452843, -0.015620949678122997, -0.04762775078415871, -0.021129686385393143, -0.0547846183180809, -0.0030609893146902323, 0.09935181587934494, 0.06467245519161224, -0.06608092784881592, 0.06732749938964844, -0.14041410386562347, 0.17631632089614868, 0.08096258342266083, -0.04907633736729622, -0.035409387201070786, -0.12898841500282288, 0.009785860776901245, -0.02058648131787777, 0.09395933896303177, 0.039257343858480453, -0.04319724440574646, 0.0014393066521734, -0.05704843997955322, 0.05697260797023773, -0.08743134140968323, -0.02054867520928383, -0.0078980578109622, -0.07005675882101059, -0.05823499336838722, 0.06392325460910797, 0.005710946395993233, 0.006225286982953548, -0.03172067180275917, 0.08150655031204224, -0.08288812637329102, 0.01543681975454092, -0.10219751298427582, -0.10124924778938293 ]
null
null
transformers
# Genji-python 6B

For example usage, or to easily use the model, you can check our colab notebook:
[Notebook](https://colab.research.google.com/drive/1PnWpx02IEUkY8jhLKd_NewUGEXahAska?usp=sharing)

## Model Description

Genji is a transformer model finetuned on EleutherAI's GPT-J 6B model. This particular model is trained on Python-only code approaching 4GB in size.

| Hyperparameter | Value |
|-------------------|--------|
| n_parameters | 6,053,381,344 |
| n_layers | 28* |
| d_model | 4,096 |
| d_ff | 16,384 |
| n_heads | 16 |
| d_head | 256 |
| n_ctx | 2,048 |
| n_vocab | 50,400 (same tokenizer as GPT-2/3) |
| position encoding | [Rotary position encodings (RoPE)](https://arxiv.org/abs/2104.09864) |
| RoPE dimensions | [64](https://github.com/kingoflolz/mesh-transformer-jax/blob/f2aa66e0925de6593dcbb70e72399b97b4130482/mesh_transformer/layers.py#L223) |

`*` each layer consists of one feedforward block and one self-attention block

The model consists of 28 layers with a model dimension of 4096, and a feedforward dimension of 16384. The model dimension is split into 16 heads, each with a dimension of 256. Rotary position encodings (RoPE) were applied to 64 dimensions of each head. The model is trained with a tokenization vocabulary of 50257, using the same set of BPEs as GPT-2/GPT-3.

## Training data

GPT-J 6B was pretrained on the [Pile](https://pile.eleuther.ai), a large-scale curated dataset created by EleutherAI for the purpose of training this model. After the pre-training, it was finetuned on the Python code taken from the Pile.

## Training procedure

Genji-python-6B is trained for 20k steps on around 655 million tokens with a learning rate of 2e-06.

## Intended Use

This model is trained to assist with writing Python code and for having fun trying weird stuff with it.

### How to use

This model is only usable with our fork because GPT-J is not merged into the main transformers repo yet. When it's merged, we will make this model easily loadable.
For now, you need to use this fork:
[Fork](https://github.com/finetuneanon/transformers)

To install with pip:

```bash
pip install git+https://github.com/finetuneanon/transformers@gpt-neo-localattention3-rp-b
```

This model takes more than 16 gigs of RAM to load. If you want more efficient and faster loading, please check our split model.
We recommend using the model as FP16. That way, it fits in 16GB VRAM cards.
How to use:

```python
from transformers import (
    AutoTokenizer,
    AutoModelForCausalLM,
    GPTNeoForCausalLM,
)

model = AutoModelForCausalLM.from_pretrained("NovelAI/genji-python-6B", use_auth_token=True).half().eval().cuda()
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-2.7B")

text = '''def print_customer_name'''

tokens = tokenizer(text, return_tensors="pt").input_ids
generated_tokens = model.generate(
    tokens.long().cuda(),
    use_cache=True,
    do_sample=True,
    top_k=50,
    temperature=0.3,
    top_p=0.9,
    repetition_penalty=1.125,
    min_length=1,
    max_length=len(tokens[0]) + 400,
    pad_token_id=tokenizer.eos_token_id,
)
last_tokens = generated_tokens[0][len(tokens[0]):]
generated_text = tokenizer.decode(last_tokens)
print("Generation:\n" + generated_text)
```

When run, this code generates:

```python
Prompt:
def print_customer_name
Generation:
(self, customer):
    """Print the name of a customer."""
    if not self.is_valid():
        return
    print("Customer: {}".format(customer))
```

For example usage, you can see our colab notebook as well:
[Notebook](https://colab.research.google.com/drive/1PnWpx02IEUkY8jhLKd_NewUGEXahAska?usp=sharing)

## Eval results
TBD

## Acknowledgements
This project was possible because of the compute provided by the [TPU Research Cloud](https://sites.research.google/trc/) and [EleutherAI](https://eleuther.ai/) for pretraining of the GPT-J 6B.

Thanks to everyone who contributed to this project!
- [Aero](https://github.com/AeroScripts)
- [Finetune](https://github.com/finetuneanon)
- [Kurumuz](https://github.com/kurumuz)
{"language": ["en"], "license": "apache-2.0", "tags": ["pytorch", "causal-lm"], "datasets": ["the Pile"]}
text-generation
NovelAI/genji-python-6B
[ "transformers", "pytorch", "gpt_neo", "text-generation", "causal-lm", "en", "arxiv:2104.09864", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2104.09864" ]
[ "en" ]
TAGS #transformers #pytorch #gpt_neo #text-generation #causal-lm #en #arxiv-2104.09864 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
Genji-python 6B =============== For example usage or to easily use the model you can check our colab notebook: Notebook Model Description ----------------- Genji is a transformer model finetuned on EleutherAI's GPT-J 6B model. This particular model is trained on python only code approaching 4GB in size. '\*' each layer consists of one feedforward block and one self attention block The model consists of 28 layers with a model dimension of 4096, and a feedforward dimension of 16384. The model dimension is split into 16 heads, each with a dimension of 256. Rotary position encodings (RoPE) was applied to 64 dimensions of each head. The model is trained with a tokenization vocabulary of 50257, using the same set of BPEs as GPT-2/GPT-3. Training data ------------- GPT-J 6B was pretrained on the Pile, a large scale curated dataset created by EleutherAI for the purpose of training this model. After the pre-training, it's finetuned on the python code that was taken from the Pile. Training procedure ------------------ Genji-python-6B is trained for 20k steps on around 655 million tokens with learning rate of 2e-06 Intended Use ------------ This model is trained for assistence on writing python code and having fun trying weird stuff with it. ### How to use This model is only usable with our fork because GPT-J is not merged to the main transformers repo yet. When it's merged, we will make this model easily loadable. For now, you need to use this fork: Fork to install with pip: This model takes more than 16 gigs of RAM to load. If you want more efficient and faster loading, please check our split model. We recommend the usage of the model as FP16. That way, it fits in 16GB VRAM cards. How to use: When ran, this code generates: For example usage, you can see our colab notebook as well: Notebook Eval results ------------ TBD Acknowledgements ---------------- This project was possible because of the compute provided by the TPU Research Cloud and EleutherAI for pretraining of the GPT-J 6B. Thanks to everyone who contributed to this project! * Aero * Finetune * Kurumuz
[ "### How to use\n\n\nThis model is only usable with our fork because GPT-J is not merged to the main transformers repo yet. When it's merged, we will make this model easily loadable.\nFor now, you need to use this fork:\nFork\n\n\nto install with pip:\n\n\nThis model takes more than 16 gigs of RAM to load. If you want more efficient and faster loading, please check our split model.\nWe recommend the usage of the model as FP16. That way, it fits in 16GB VRAM cards.\n\n\nHow to use:\n\n\nWhen ran, this code generates:\n\n\nFor example usage, you can see our colab notebook as well:\nNotebook\n\n\nEval results\n------------\n\n\nTBD\n\n\nAcknowledgements\n----------------\n\n\nThis project was possible because of the compute provided by the\nTPU Research Cloud\n\n\nand EleutherAI for pretraining of the GPT-J 6B.\n\n\nThanks to everyone who contributed to this project!\n\n\n* Aero\n* Finetune\n* Kurumuz" ]
[ "TAGS\n#transformers #pytorch #gpt_neo #text-generation #causal-lm #en #arxiv-2104.09864 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "### How to use\n\n\nThis model is only usable with our fork because GPT-J is not merged to the main transformers repo yet. When it's merged, we will make this model easily loadable.\nFor now, you need to use this fork:\nFork\n\n\nto install with pip:\n\n\nThis model takes more than 16 gigs of RAM to load. If you want more efficient and faster loading, please check our split model.\nWe recommend the usage of the model as FP16. That way, it fits in 16GB VRAM cards.\n\n\nHow to use:\n\n\nWhen ran, this code generates:\n\n\nFor example usage, you can see our colab notebook as well:\nNotebook\n\n\nEval results\n------------\n\n\nTBD\n\n\nAcknowledgements\n----------------\n\n\nThis project was possible because of the compute provided by the\nTPU Research Cloud\n\n\nand EleutherAI for pretraining of the GPT-J 6B.\n\n\nThanks to everyone who contributed to this project!\n\n\n* Aero\n* Finetune\n* Kurumuz" ]
[ 67, 213 ]
[ "passage: TAGS\n#transformers #pytorch #gpt_neo #text-generation #causal-lm #en #arxiv-2104.09864 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n### How to use\n\n\nThis model is only usable with our fork because GPT-J is not merged to the main transformers repo yet. When it's merged, we will make this model easily loadable.\nFor now, you need to use this fork:\nFork\n\n\nto install with pip:\n\n\nThis model takes more than 16 gigs of RAM to load. If you want more efficient and faster loading, please check our split model.\nWe recommend the usage of the model as FP16. That way, it fits in 16GB VRAM cards.\n\n\nHow to use:\n\n\nWhen ran, this code generates:\n\n\nFor example usage, you can see our colab notebook as well:\nNotebook\n\n\nEval results\n------------\n\n\nTBD\n\n\nAcknowledgements\n----------------\n\n\nThis project was possible because of the compute provided by the\nTPU Research Cloud\n\n\nand EleutherAI for pretraining of the GPT-J 6B.\n\n\nThanks to everyone who contributed to this project!\n\n\n* Aero\n* Finetune\n* Kurumuz" ]
[ -0.06078480929136276, -0.04854094982147217, -0.0025415371637791395, 0.08422054350376129, 0.07592429220676422, -0.001247523701749742, 0.015790555626153946, 0.11682441830635071, -0.08018714934587479, 0.059230174869298935, 0.05173303186893463, 0.08350856602191925, 0.06607528775930405, 0.1603967398405075, 0.05430390685796738, -0.17726561427116394, 0.014291197061538696, 0.011004358530044556, -0.00878820102661848, 0.07468073070049286, 0.1116095557808876, -0.0874626561999321, 0.07211533933877945, 0.06844668090343475, -0.08129507303237915, 0.022926369681954384, -0.06034276261925697, -0.06715375185012817, 0.08558697253465652, 0.1169951930642128, 0.03866232559084892, 0.01838809624314308, 0.033864445984363556, -0.07296603918075562, 0.025312943384051323, 0.0722350925207138, 0.012264602817595005, 0.07207537442445755, 0.005001375451683998, 0.10194536298513412, 0.18238532543182373, -0.03677833452820778, -0.034302935004234314, 0.04352101683616638, -0.0560486763715744, -0.1528691053390503, -0.13711108267307281, 0.02251925878226757, 0.0845489576458931, 0.01718280091881752, 0.000015205270756268874, 0.14559830725193024, -0.0029117094818502665, 0.040192585438489914, 0.17692245543003082, -0.25903528928756714, -0.0891994833946228, 0.029879046604037285, 0.07990015298128128, -0.0369836650788784, -0.0005230855313129723, 0.025177814066410065, 0.061558254063129425, 0.040961623191833496, 0.064898282289505, -0.003924118354916573, -0.09935328364372253, 0.013546362519264221, -0.14536337554454803, -0.08490049839019775, 0.16689354181289673, -0.0029204979073256254, -0.03700097277760506, -0.08828471601009369, -0.14825741946697235, -0.03309006243944168, 0.02078985795378685, -0.09819227457046509, -0.0009093150147236884, -0.0031749794725328684, 0.0688701719045639, -0.07068733125925064, -0.12351938337087631, -0.063746877014637, -0.0681384801864624, 0.03304806351661682, 0.07173088937997818, 0.059571098536252975, 0.06966228038072586, 0.16044282913208008, -0.037634484469890594, -0.05316562578082085, -0.02682957425713539, -0.05247170850634575, -0.00505512859672308, 0.025264032185077667, 0.005980009213089943, -0.09519681334495544, -0.004961453843861818, 0.15289686620235443, 0.09466270357370377, -0.023743024095892906, 0.13489975035190582, 0.007726107724010944, -0.03231655806303024, 0.13245616853237152, -0.09837799519300461, -0.0888461023569107, 0.0698946937918663, -0.009222797118127346, 0.07555323094129562, 0.008997855708003044, -0.06141170859336853, -0.05419649928808212, 0.02463061921298504, 0.020920008420944214, 0.021243199706077576, 0.017217082902789116, -0.03139759600162506, -0.04911544546484947, 0.30836907029151917, -0.06891719251871109, -0.002317433711141348, -0.05615365505218506, -0.09994853287935257, -0.023577289655804634, 0.12064572423696518, -0.03926500678062439, -0.07041320949792862, 0.012476925738155842, -0.014331811107695103, -0.06626806408166885, -0.09673287719488144, -0.08272788673639297, 0.02261265739798546, -0.02419274114072323, -0.01941414549946785, -0.16722631454467773, -0.21878071129322052, 0.03143171966075897, 0.07574168592691422, -0.0786292776465416, -0.050008684396743774, 0.08842457830905914, 0.0026383879594504833, 0.014760510995984077, -0.031006326898932457, 0.16662703454494476, -0.05586474388837814, 0.017883475869894028, 0.06905313581228256, 0.10550287365913391, -0.018308348953723907, 0.029481180012226105, -0.029069051146507263, 0.03925880417227745, -0.1743638813495636, 0.038224078714847565, -0.061096131801605225, -0.009511943906545639, -0.10299848020076752, -0.02276715263724327, -0.034406449645757675, 
-0.006363121792674065, 0.06948469579219818, 0.04057719185948372, -0.13761837780475616, 0.0013975106412544847, 0.0022661969996988773, -0.08770205825567245, -0.07536230981349945, 0.1292320340871811, 0.03606834635138512, 0.014397463761270046, 0.05096563324332237, 0.02149863727390766, 0.11112527549266815, -0.14693008363246918, -0.10552116483449936, 0.04168110340833664, -0.057898204773664474, -0.08450020849704742, 0.07810507714748383, 0.0630497932434082, 0.005914808716624975, 0.03690339997410774, 0.047616977244615555, 0.07904663681983948, -0.025243492797017097, -0.03725067898631096, -0.0798637643456459, -0.08798881620168686, -0.026793505996465683, 0.008155041374266148, 0.02418106235563755, -0.019467109814286232, -0.11182142794132233, -0.06344233453273773, 0.12761645019054413, -0.01187958288937807, -0.006622430868446827, -0.07418206334114075, 0.1724843680858612, -0.15898677706718445, 0.02811170555651188, -0.06802749633789062, -0.06607428938150406, 0.07596777379512787, -0.07324862480163574, -0.03241737186908722, 0.05108753591775894, 0.031231427565217018, 0.10011493414640427, 0.001956880558282137, 0.024975895881652832, 0.02845670096576214, -0.0178234800696373, -0.09311047196388245, 0.02518065646290779, -0.05857500433921814, 0.013681188225746155, 0.008731001988053322, -0.08406495302915573, 0.03542191535234451, 0.06862162053585052, 0.11847641319036484, -0.03878598287701607, 0.001156445243395865, 0.010063461028039455, -0.06308911740779877, -0.0338737778365612, -0.08205046504735947, 0.034556884318590164, 0.02526090294122696, 0.018735401332378387, 0.07231004536151886, -0.06471984833478928, -0.09609571844339371, 0.07856644690036774, 0.1307101845741272, -0.050919532775878906, 0.05200204625725746, -0.07587440311908722, -0.06760265678167343, -0.03942841663956642, -0.059686996042728424, 0.08080755174160004, 0.007219458930194378, 0.12152658402919769, -0.056199729442596436, -0.07879770547151566, 0.05305379629135132, 0.017179474234580994, 0.03062189556658268, 0.07213636487722397, 0.08694422245025635, 0.0010813751723617315, 0.0567106194794178, 0.14297156035900116, -0.005363814998418093, 0.09525861591100693, 0.04353277385234833, -0.0812889039516449, 0.019499624148011208, 0.0054871197789907455, 0.014087204821407795, 0.1435023695230484, -0.0025177649222314358, 0.04854018986225128, 0.09164921939373016, -0.024589717388153076, 0.06147428974509239, -0.1521061360836029, 0.06326664984226227, 0.031047331169247627, -0.03165515139698982, 0.1632460355758667, 0.07170617580413818, -0.06485007703304291, 0.027705419808626175, -0.028173798695206642, 0.0011317392345517874, -0.01600528135895729, 0.039241351187229156, -0.006036664359271526, 0.12261094152927399, -0.017422249540686607, -0.1912822425365448, -0.153860941529274, 0.026205996051430702, -0.025618167594075203, 0.010474370792508125, 0.0692402645945549, -0.009997020475566387, -0.10144037753343582, -0.06325561553239822, 0.1030380055308342, 0.06388022750616074, -0.026163430884480476, -0.002380402060225606, -0.03608998283743858, 0.0348367877304554, -0.1341545581817627, 0.006269210018217564, 0.043983276933431625, -0.06502988189458847, 0.07443208992481232, 0.04329455643892288, 0.11422029882669449, 0.023164818063378334, -0.04294592887163162, -0.05865888297557831, 0.01969243958592415, 0.18653923273086548, -0.08344066143035889, 0.10905849933624268, 0.21335026621818542, 0.051262617111206055, 0.05129650980234146, 0.025959612801671028, -0.020714279264211655, -0.034156303852796555, 0.024055657908320427, -0.006753343157470226, -0.05305911973118782, -0.18953850865364075, 
-0.07560890167951584, -0.05840844660997391, 0.05082052946090698, 0.06392175704240799, 0.05322223901748657, -0.0862388163805008, 0.11564140766859055, -0.07388200610876083, 0.16402633488178253, -0.07766206562519073, 0.0480659194290638, 0.17395104467868805, 0.006461338605731726, 0.07407184690237045, -0.064877949655056, -0.05151771381497383, 0.1441102921962738, 0.06875021755695343, 0.12881600856781006, -0.05887991935014725, 0.11265334486961365, 0.02865532785654068, 0.12818604707717896, 0.02764192223548889, 0.11897487193346024, -0.04686632379889488, -0.015575321391224861, -0.0549352690577507, -0.025192031636834145, -0.021681133657693863, 0.04434655234217644, -0.02865004539489746, -0.017950313165783882, 0.031993985176086426, 0.0667588859796524, 0.06344076246023178, 0.19225172698497772, 0.04098665341734886, -0.21139554679393768, -0.08967716246843338, 0.013632719404995441, -0.05707709863781929, -0.10369846969842911, -0.028992505744099617, 0.07290571182966232, -0.11621010303497314, 0.022174594923853874, -0.025355445221066475, 0.08193589746952057, -0.06649135798215866, -0.017313756048679352, 0.05114636570215225, 0.12144448608160019, 0.05771879479289055, 0.13285532593727112, -0.08450721949338913, -0.003796759992837906, 0.013653529807925224, 0.05355295166373253, -0.08680079132318497, 0.07163020968437195, 0.019586116075515747, 0.00851710606366396, 0.09315583854913712, -0.02127760648727417, 0.03776554763317108, 0.006562447175383568, -0.2195800393819809, 0.03565502166748047, -0.03288478031754494, -0.03720279783010483, 0.06072475016117096, -0.02037000097334385, 0.009447175078094006, -0.017913933843374252, 0.09902825206518173, -0.10592059791088104, -0.16142642498016357, 0.07038088142871857, -0.0370427705347538, -0.03415374830365181, -0.05509132891893387, -0.05608842894434929, -0.09839008003473282, 0.16087284684181213, -0.020825104787945747, -0.08131363242864609, -0.06702831387519836, 0.015651477500796318, 0.135658860206604, -0.07277195155620575, 0.051836200058460236, -0.029490726068615913, 0.08489377796649933, -0.04806789755821228, -0.0610441155731678, 0.05019441619515419, -0.11144322156906128, -0.13996990025043488, -0.011157495900988579, 0.00585908954963088, -0.051202964037656784, 0.05755259469151497, -0.00008943498687585816, 0.024338005110621452, -0.12061162292957306, -0.08067963272333145, -0.017847798764705658, 0.07417318969964981, 0.07507425546646118, -0.062403805553913116, -0.01278464961796999, 0.007048286963254213, -0.007686237338930368, -0.08740285038948059, 0.09707468003034592, 0.21758481860160828, -0.08112502098083496, 0.06321404129266739, 0.07641800493001938, -0.012715470977127552, -0.23325702548027039, -0.13800109922885895, 0.02497052401304245, 0.04265093058347702, -0.07167481631040573, -0.08694702386856079, 0.11566628515720367, 0.11413340270519257, -0.034851428121328354, 0.1295420378446579, -0.19899524748325348, -0.1287345141172409, 0.07233808934688568, 0.008664330467581749, 0.047726135700941086, -0.11439785361289978, -0.031974513083696365, -0.06379695981740952, -0.14930382370948792, 0.029739728197455406, -0.06510691344738007, 0.10225747525691986, -0.08115874975919724, -0.006649447605013847, -0.025469107553362846, -0.05719923600554466, 0.15198375284671783, -0.10210645943880081, 0.013995545916259289, -0.08885473012924194, 0.07071513682603836, 0.038842834532260895, -0.07673914730548859, 0.1633734107017517, -0.02533317729830742, 0.040651340037584305, -0.10997191071510315, -0.04502003639936447, -0.03875337168574333, 0.051708072423934937, -0.0009037222480401397, -0.05906219780445099, 
-0.10699183493852615, 0.0054793511517345905, -0.014828512445092201, -0.05971506983041763, -0.038472987711429596, 0.0868082195520401, -0.055879995226860046, 0.14004328846931458, -0.01543484628200531, -0.10579075664281845, -0.0771200880408287, -0.009074769914150238, 0.02989855967462063, 0.0908888727426529, -0.09143228083848953, -0.007843499071896076, 0.09109174460172653, -0.09352646768093109, 0.04593127220869064, 0.01782168447971344, -0.11423735320568085, 0.020951025187969208, 0.08084051311016083, -0.17579014599323273, -0.10932791233062744, -0.052247438579797745, 0.04397788271307945, -0.05296626687049866, -0.011410209350287914, 0.08660059422254562, -0.11877158284187317, -0.059211697429418564, 0.015949010848999023, 0.021503305062651634, -0.02499900385737419, 0.17693501710891724, 0.08976604044437408, 0.020124563947319984, -0.05064869299530983, 0.09051870554685593, 0.07731996476650238, -0.0865582823753357, -0.04093704745173454, 0.05263045057654381, -0.09070979803800583, -0.0828380361199379, 0.02555481344461441, 0.012053443118929863, -0.05987941101193428, -0.03744063526391983, -0.06690478324890137, -0.008045926690101624, 0.03176255151629448, -0.03780166432261467, 0.05331908538937569, -0.015061942860484123, -0.02473987452685833, 0.042637910693883896, -0.04314231872558594, 0.1199791207909584, 0.06593523919582367, 0.10209325700998306, -0.11559046804904938, 0.028737349435687065, -0.016704734414815903, 0.10585316270589828, -0.029209986329078674, 0.05972098559141159, -0.02078336663544178, -0.040924832224845886, -0.12433522194623947, 0.0033267030958086252, -0.0398467481136322, -0.01515833381563425, 0.004834210500121117, 0.02845323644578457, -0.038615062832832336, 0.056610364466905594, -0.03529079630970955, -0.027583692222833633, -0.04910682141780853, 0.024526583030819893, -0.08023783564567566, -0.005933769512921572, 0.05253642052412033, -0.053838495165109634, 0.05669461563229561, -0.01974085159599781, 0.02749614790081978, -0.020830895751714706, 0.009484885260462761, -0.05022067949175835, 0.023473134264349937, 0.07827305793762207, 0.048912808299064636, -0.07098258286714554, 0.020449025556445122, 0.020374851301312447, -0.012120191939175129, -0.059231918305158615, 0.047944262623786926, -0.110024593770504, -0.05571121722459793, -0.008378147147595882, -0.03482813015580177, -0.03698157146573067, -0.004018805921077728, 0.007095462176948786, 0.11683321744203568, 0.11153361946344376, -0.005279164761304855, -0.025956112891435623, -0.1480550765991211, -0.004740898497402668, -0.01825641095638275, -0.09025444090366364, -0.014774973504245281, -0.05142200365662575, 0.044501520693302155, 0.012998263351619244, 0.16119317710399628, 0.0647360160946846, -0.02606266736984253, -0.014197559095919132, 0.008430513553321362, -0.02042579837143421, -0.035314660519361496, 0.06490960717201233, 0.013587966561317444, 0.07462279498577118, -0.05671552196145058, 0.04975772276520729, 0.07268521189689636, 0.038811422884464264, 0.04388069733977318, 0.034605637192726135, -0.03206906095147133, 0.13255687057971954, -0.03221597522497177, -0.08524377644062042, -0.10336068272590637, -0.0605180487036705, -0.07397734373807907, 0.1199791207909584, -0.048469506204128265, 0.028242118656635284, 0.08703603595495224, -0.13214945793151855, 0.0006313726771622896, -0.05505624786019325, -0.052235957235097885, -0.11243778467178345, -0.06041639670729637, -0.06826824694871902, -0.1043429896235466, -0.00816575437784195, -0.06066088750958443, 0.0018736767815425992, 0.1077449843287468, 0.011854329146444798, 0.012465296313166618, 0.19550809264183044, 
0.025875071063637733, -0.04064848646521568, -0.017320964485406876, -0.0007726061157882214, -0.011545998975634575, -0.03287361562252045, 0.03783384710550308, -0.01750120148062706, 0.029227348044514656, 0.08023787289857864, 0.015303099527955055, 0.023040059953927994, 0.04546986147761345, 0.0191066674888134, -0.03935713320970535, -0.0537218376994133, 0.005376552231609821, 0.012075657956302166, 0.048237934708595276, 0.0036862080451101065, 0.06079639494419098, 0.013306417502462864, 0.116356261074543, -0.035520296543836594, -0.1256856471300125, -0.1044500470161438, 0.09763811528682709, -0.0194095466285944, 0.00002964323721244, 0.016157761216163635, -0.0743451714515686, -0.01743723265826702, 0.22336535155773163, 0.12299548089504242, -0.02587375044822693, -0.03226211667060852, 0.05148473381996155, -0.009049853309988976, -0.04284598305821419, 0.09796332567930222, 0.10192438215017319, 0.14815030992031097, -0.08198503404855728, 0.05093573033809662, -0.07674510031938553, -0.05755505710840225, -0.047047100961208344, 0.11907269805669785, -0.03667404502630234, 0.04053064435720444, 0.014181294478476048, 0.026537494733929634, 0.07213575392961502, -0.1302746832370758, -0.03718467429280281, -0.021176667883992195, -0.09294648468494415, 0.03235558420419693, 0.10612360388040543, -0.04670467600226402, 0.049501460045576096, -0.07651985436677933, 0.05591564252972603, 0.11423218250274658, -0.0005095506785437465, -0.1434711068868637, 0.010437005199491978, 0.10395948588848114, -0.03252483531832695, 0.1590757668018341, -0.0037268793676048517, 0.06715212017297745, 0.09345705062150955, -0.014743873849511147, -0.18621519207954407, 0.05212220922112465, 0.014390422962605953, -0.08635065704584122, 0.07204576581716537, 0.07399548590183258, -0.00653989240527153, -0.06692000478506088, 0.002188033889979124, 0.012049133889377117, 0.013413732871413231, 0.08896921575069427, 0.07358858734369278, -0.06725269556045532, 0.041212406009435654, -0.1046021580696106, 0.13971994817256927, 0.12757907807826996, -0.046871770173311234, -0.01605052687227726, -0.13470986485481262, 0.03328079730272293, -0.011269602924585342, 0.1167573556303978, 0.014839028008282185, -0.12161652743816376, 0.01232044119387865, 0.019305575639009476, 0.05197986587882042, -0.11978854238986969, -0.06743663549423218, -0.0009264973341487348, -0.084453284740448, -0.02951107919216156, 0.11863548308610916, 0.067482590675354, 0.02026902697980404, -0.02676035650074482, -0.022605285048484802, -0.10515202581882477, 0.0013483598595485091, -0.11702115833759308, -0.08553779125213623 ]
null
null
transformers
# bert-base-multilingual-uncased-sentiment

This is a bert-base-multilingual-uncased model finetuned for sentiment analysis on product reviews in six languages: English, Dutch, German, French, Spanish and Italian. It predicts the sentiment of the review as a number of stars (between 1 and 5).

This model is intended for direct use as a sentiment analysis model for product reviews in any of the six languages above, or for further finetuning on related sentiment analysis tasks.

## Training data

Here is the number of product reviews we used for finetuning the model:

| Language | Number of reviews |
| -------- | ----------------- |
| English  | 150k |
| Dutch    | 80k  |
| German   | 137k |
| French   | 140k |
| Italian  | 72k  |
| Spanish  | 50k  |

## Accuracy

The finetuned model obtained the following accuracy on 5,000 held-out product reviews in each of the languages:

- Accuracy (exact) is the exact match on the number of stars.
- Accuracy (off-by-1) is the percentage of reviews where the number of stars the model predicts differs by at most 1 from the number given by the human reviewer.

| Language | Accuracy (exact) | Accuracy (off-by-1) |
| -------- | ---------------- | ------------------- |
| English  | 67% | 95% |
| Dutch    | 57% | 93% |
| German   | 61% | 94% |
| French   | 59% | 94% |
| Italian  | 59% | 95% |
| Spanish  | 58% | 95% |

## Contact

In addition to this model, [NLP Town](https://www.nlp.town) offers custom, monolingual sentiment models for many languages and an improved multilingual model through [RapidAPI](https://rapidapi.com/nlp-town-nlp-town-default/api/multilingual-sentiment-analysis2/). Feel free to contact us for questions, feedback and/or requests for similar models.
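For quick experimentation, the model can be called through the `transformers` text-classification pipeline. The following is a minimal sketch rather than an official example: it assumes the finetuned weights are published under this card's repository id, and the sample reviews and printed output are illustrative.

```python
from transformers import pipeline

# Assumes the finetuned weights are available under this repository id.
sentiment = pipeline("sentiment-analysis", model="Noxel/sentiments_multilenguaje")

reviews = [
    "I absolutely love this product, it works perfectly!",  # English
    "Het product ging al na een week kapot.",               # Dutch: it broke after a week
    "Le produit est correct, sans plus.",                   # French: it's okay, nothing more
]

for review, result in zip(reviews, sentiment(reviews)):
    # Each result is a dict such as {"label": ..., "score": ...};
    # for this model the label is the predicted number of stars (1-5).
    print(f"{result['label']:>8}  {review}")
```

Given the off-by-1 accuracy figures reported above, downstream applications may want to treat predictions that miss the human rating by a single star as acceptable.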
{"language": ["en", "nl", "de", "fr", "it", "es"], "license": "mit"}
text-classification
Noxel/sentiments_multilenguaje
[ "transformers", "bert", "text-classification", "en", "nl", "de", "fr", "it", "es", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "en", "nl", "de", "fr", "it", "es" ]
TAGS #transformers #bert #text-classification #en #nl #de #fr #it #es #license-mit #autotrain_compatible #endpoints_compatible #region-us
bert-base-multilingual-uncased-sentiment ======================================== This a bert-base-multilingual-uncased model finetuned for sentiment analysis on product reviews in six languages: English, Dutch, German, French, Spanish and Italian. It predicts the sentiment of the review as a number of stars (between 1 and 5). This model is intended for direct use as a sentiment analysis model for product reviews in any of the six languages above, or for further finetuning on related sentiment analysis tasks. Training data ------------- Here is the number of product reviews we used for finetuning the model: Accuracy -------- The finetuned model obtained the following accuracy on 5,000 held-out product reviews in each of the languages: * Accuracy (exact) is the exact match on the number of stars. * Accuracy (off-by-1) is the percentage of reviews where the number of stars the model predicts differs by a maximum of 1 from the number given by the human reviewer. Language: English, Accuracy (exact): 67%, Accuracy (off-by-1): 95% Language: Dutch, Accuracy (exact): 57%, Accuracy (off-by-1): 93% Language: German, Accuracy (exact): 61%, Accuracy (off-by-1): 94% Language: French, Accuracy (exact): 59%, Accuracy (off-by-1): 94% Language: Italian, Accuracy (exact): 59%, Accuracy (off-by-1): 95% Language: Spanish, Accuracy (exact): 58%, Accuracy (off-by-1): 95% Contact ------- In addition to this model, NLP Town offers custom, monolingual sentiment models for many languages and an improved multilingual model through RapidAPI. Feel free to contact us for questions, feedback and/or requests for similar models.
[]
[ "TAGS\n#transformers #bert #text-classification #en #nl #de #fr #it #es #license-mit #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 49 ]
[ "passage: TAGS\n#transformers #bert #text-classification #en #nl #de #fr #it #es #license-mit #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 0.019098790362477303, -0.006608160678297281, -0.006332166492938995, 0.034716494381427765, 0.13021038472652435, 0.03185926750302315, 0.10685770213603973, 0.08441891521215439, 0.08366866409778595, -0.04194153472781181, 0.1653517186641693, 0.23215410113334656, -0.037362225353717804, 0.0924299955368042, -0.11594684422016144, -0.2599201202392578, 0.08408937603235245, 0.05006737262010574, -0.04147963225841522, 0.07322480529546738, 0.12399184703826904, -0.03493504598736763, 0.09009789675474167, -0.029728876426815987, -0.09035694599151611, 0.05425327643752098, 0.071589395403862, -0.11097581684589386, 0.11143564432859421, 0.1008056178689003, 0.11168835312128067, 0.054474640637636185, -0.014172574505209923, -0.21346160769462585, 0.013110124506056309, -0.033940501511096954, -0.12967577576637268, 0.02504892461001873, 0.018815014511346817, -0.065290167927742, 0.07767629623413086, 0.03141350671648979, -0.0011326803360134363, 0.07219315320253372, -0.14467237889766693, -0.10952651500701904, -0.019667362794280052, 0.022501187399029732, 0.05530897527933121, 0.038002077490091324, 0.00802164152264595, 0.06887900084257126, -0.11920934170484543, 0.08515465259552002, 0.06714928150177002, -0.3122214078903198, 0.003991900943219662, 0.136386901140213, 0.04914708808064461, -0.00816352665424347, -0.06640353053808212, 0.10184629261493683, 0.09406745433807373, -0.018102845177054405, -0.011506307870149612, -0.07661014050245285, 0.03294030949473381, 0.05961339548230171, -0.04824918508529663, -0.0263046994805336, 0.21257062256336212, -0.014695710502564907, 0.03555767610669136, -0.013045094907283783, -0.04274507239460945, -0.0521390438079834, -0.018867528066039085, -0.027208739891648293, 0.004736802540719509, 0.11912640929222107, 0.06690521538257599, 0.004948311485350132, -0.11076030880212784, 0.02201937697827816, -0.22596551477909088, 0.21056556701660156, -0.01283198595046997, 0.032442133873701096, -0.1232067421078682, 0.027657397091388702, -0.06778208166360855, -0.08285633474588394, -0.023610137403011322, -0.07025817781686783, 0.05412857607007027, -0.050815753638744354, -0.041994526982307434, 0.024821046739816666, 0.08853912353515625, 0.2102942168712616, 0.03319663554430008, 0.03527676314115524, -0.0689801573753357, 0.10562849044799805, 0.012069816701114178, 0.09863616526126862, 0.07381487637758255, -0.06731031090021133, -0.00337744178250432, -0.18216747045516968, 0.013893830589950085, -0.04364829137921333, -0.22085899114608765, -0.019922707229852676, -0.022080564871430397, 0.06849047541618347, -0.025435371324419975, 0.08175257593393326, -0.06243855878710747, 0.03721500188112259, 0.07593896985054016, -0.03835144266486168, 0.04832753166556358, -0.0022348915226757526, 0.04433926194906235, 0.06092542037367821, -0.0233171209692955, 0.017351899296045303, 0.006299211177974939, 0.14631778001785278, -0.062453944236040115, -0.017207061871886253, -0.03388301283121109, -0.0890890508890152, 0.050058379769325256, -0.10608949512243271, 0.06830912828445435, -0.1610526293516159, -0.10871255397796631, 0.0141063891351223, 0.045653391629457474, -0.0463610403239727, 0.00817886833101511, -0.06038527935743332, -0.02924148552119732, 0.08594899624586105, -0.05709238350391388, -0.10373852401971817, -0.09951194375753403, 0.05709151551127434, -0.03112025558948517, 0.05691615492105484, -0.21284319460391998, 0.042287442833185196, -0.07973884791135788, 0.008601258508861065, -0.05907581374049187, 0.0158234816044569, -0.07768087834119797, 0.14910395443439484, 0.00039828935405239463, -0.028661519289016724, -0.05866623669862747, 0.08290412276983261, 
-0.0791892483830452, 0.1469934731721878, -0.10868669301271439, -0.08938153833150864, 0.15352481603622437, -0.10798224806785583, -0.1368929147720337, 0.0704289898276329, -0.02347037009894848, 0.08970563858747482, 0.08932503312826157, 0.19750788807868958, 0.06659703701734543, -0.05506283789873123, 0.0734696239233017, 0.16623157262802124, -0.09251074492931366, -0.08751678466796875, 0.02714231237769127, -0.01677609607577324, -0.14658023416996002, 0.04534946382045746, 0.04918380454182625, 0.06710626184940338, -0.029872890561819077, -0.04678918421268463, -0.006189384497702122, 0.013743886724114418, 0.07490340620279312, 0.0031939989421516657, 0.07388383895158768, -0.09675823152065277, -0.008335767313838005, 0.07270564138889313, -0.013259322382509708, 0.058678630739450455, 0.0148479538038373, -0.08601293712854385, 0.04037963226437569, 0.021230027079582214, 0.02117392234504223, -0.13053710758686066, -0.03350374847650528, -0.02965385653078556, 0.07393502444028854, 0.02046986296772957, 0.16894389688968658, 0.04508047550916672, -0.06629368662834167, -0.022441817447543144, 0.021992627531290054, 0.15954115986824036, 0.06732829660177231, -0.027894876897335052, -0.13868235051631927, 0.08497854322195053, -0.06312033534049988, -0.02867812104523182, -0.14952611923217773, -0.0012675321195274591, 0.13686807453632355, 0.09844925999641418, -0.011694968678057194, 0.1100759506225586, -0.11484985798597336, 0.03675876557826996, -0.10179060697555542, 0.030348049476742744, 0.09640107303857803, -0.004533072467893362, -0.0958821102976799, 0.17405849695205688, -0.09261278063058853, 0.285637229681015, 0.19640851020812988, -0.2376057207584381, -0.02827908843755722, -0.03328266739845276, -0.0014851999003440142, 0.015797557309269905, 0.0599459744989872, -0.03419489413499832, 0.03865634277462959, -0.025838468223810196, 0.17787234485149384, -0.03194170445203781, -0.03537493571639061, 0.005294604226946831, -0.059720516204833984, -0.09754317998886108, 0.06562802195549011, 0.12090159952640533, -0.2571698725223541, 0.20269419252872467, 0.2854750156402588, 0.06649893522262573, 0.1753838062286377, -0.014997216872870922, 0.048700761049985886, 0.0010058424668386579, -0.04072165489196777, -0.006540230941027403, 0.01673506386578083, -0.1592755913734436, -0.025027688592672348, 0.05422480031847954, 0.0454850047826767, 0.052259329706430435, -0.11230285465717316, -0.04627643898129463, 0.03155972436070442, 0.005773371551185846, -0.029563402757048607, 0.07636291533708572, -0.011928783729672432, 0.09701158106327057, 0.020695675164461136, -0.148484006524086, 0.1381395310163498, -0.0052210488356649876, -0.08572722971439362, 0.17006343603134155, -0.14914080500602722, -0.2574155628681183, -0.17705138027668, -0.22750258445739746, -0.027133846655488014, 0.0729188621044159, 0.11284283548593521, -0.0719115361571312, -0.07809890061616898, 0.024774672463536263, 0.018489954993128777, -0.09176361560821533, 0.0223783478140831, -0.07599133998155594, 0.07185211032629013, -0.06914772838354111, -0.08710826188325882, -0.09548036009073257, -0.0009848515037447214, 0.0010008732788264751, 0.08032012730836868, -0.14777356386184692, 0.08910512179136276, 0.12941902875900269, -0.014747104607522488, 0.05394603684544563, -0.053918540477752686, 0.1733364760875702, -0.09113050997257233, -0.012475712224841118, 0.08573240041732788, -0.03901981934905052, 0.0313584990799427, 0.21100470423698425, 0.03991503641009331, -0.1001131609082222, 0.016404688358306885, -0.03244461119174957, -0.09246121346950531, -0.19755327701568604, -0.12973909080028534, -0.10419918596744537, 
0.08004903793334961, 0.0622277557849884, 0.08173613250255585, 0.18482622504234314, 0.045681215822696686, 0.05515227094292641, 0.04124251753091812, 0.02616248093545437, 0.08027129620313644, 0.3296092748641968, -0.029579434543848038, 0.1291239857673645, -0.09427057951688766, -0.10695288330316544, 0.11766363680362701, 0.03592890501022339, 0.0325426384806633, 0.19167058169841766, 0.07703719288110733, 0.04608068987727165, 0.0008462841506116092, 0.11756133288145065, 0.10027387738227844, 0.06373988091945648, -0.049937207251787186, -0.033466510474681854, -0.005706919822841883, -0.016325676813721657, 0.07723966985940933, 0.018157467246055603, -0.14644192159175873, -0.04073638841509819, -0.1324317455291748, 0.07908399403095245, 0.016108524054288864, 0.07500835508108139, -0.17697575688362122, 0.0021570813842117786, 0.0926530510187149, -0.008754726499319077, -0.07456725835800171, 0.09838584065437317, -0.009194353595376015, -0.0937904417514801, 0.12131855636835098, 0.007018435746431351, 0.11204150319099426, -0.02388172037899494, 0.09302282333374023, -0.0010973928729072213, -0.133881613612175, 0.030954396352171898, 0.10725150257349014, -0.333345890045166, 0.23748847842216492, 0.016103370115160942, -0.04281587898731232, -0.04181251302361488, -0.03017064929008484, 0.020029341802001, 0.2778557538986206, 0.1251339167356491, 0.008109362795948982, -0.19251589477062225, -0.14524799585342407, 0.038489289581775665, -0.013548145070672035, 0.12673871219158173, -0.016679279506206512, -0.025278640910983086, -0.07866635173559189, 0.0069559249095618725, 0.019060947000980377, 0.02715812623500824, -0.011192452162504196, -0.17253448069095612, 0.04253034293651581, 0.05515716224908829, 0.09222199022769928, -0.0005365133401937783, -0.01203143410384655, -0.13680042326450348, 0.21154534816741943, -0.14854472875595093, -0.006839205045253038, -0.12797771394252777, -0.07946323603391647, -0.03744235634803772, -0.04268476366996765, 0.0405149832367897, -0.06656874716281891, 0.012895217165350914, -0.0782627984881401, -0.154671311378479, 0.16328011453151703, -0.09711083769798279, -0.014021671377122402, -0.06704864650964737, 0.09389542788267136, -0.07382860034704208, 0.0286591574549675, 0.06330008059740067, 0.028118833899497986, -0.03994571045041084, -0.11696118861436844, 0.0027922848239541054, -0.03628350794315338, -0.032650332897901535, -0.020145578309893608, -0.10417565703392029, -0.021338006481528282, 0.01717652939260006, -0.03371661156415939, 0.23100227117538452, 0.22183392941951752, -0.08331090211868286, 0.1501511037349701, 0.12825565040111542, -0.11237205564975739, -0.35870620608329773, -0.0683722123503685, -0.17396117746829987, -0.041114676743745804, -0.04771203547716141, -0.12673230469226837, 0.0990249440073967, 0.013675510883331299, -0.03954394906759262, 0.08779395371675491, -0.12424422055482864, -0.10184463113546371, 0.19025948643684387, -0.05674605444073677, 0.3859144151210785, -0.08767647296190262, -0.11268400400876999, -0.1150486096739769, -0.14345350861549377, 0.16382496058940887, -0.020133253186941147, 0.07198762148618698, 0.010377411730587482, 0.008069411851465702, 0.010300940833985806, -0.012720710597932339, 0.14716792106628418, 0.006660634186118841, 0.06999629735946655, -0.12328995764255524, -0.10236269235610962, 0.046097345650196075, -0.0046213469468057156, 0.0010701501742005348, -0.07417812198400497, -0.015098982490599155, -0.10829479247331619, -0.04973753169178963, -0.02610822394490242, 0.10451573133468628, -0.007800407707691193, -0.07864569872617722, -0.06573151797056198, -0.004526602569967508, 
0.015285013243556023, -0.03001699596643448, 0.32083389163017273, -0.058258961886167526, 0.12274623662233353, 0.11822163313627243, 0.13296107947826385, -0.13798965513706207, 0.08839385211467743, -0.030444592237472534, -0.08543239533901215, 0.057136185467243195, -0.11257053166627884, 0.05663180351257324, 0.12738560140132904, -0.06179102137684822, 0.11403772234916687, 0.1059693694114685, 0.037522632628679276, -0.04508484527468681, 0.1716376394033432, -0.14389343559741974, -0.04607774689793587, -0.030529337003827095, -0.052812617272138596, 0.12502539157867432, 0.021045520901679993, 0.13065914809703827, 0.0049469200894236565, 0.023865526542067528, 0.020907893776893616, -0.009458817541599274, -0.0738338902592659, 0.014003636315464973, 0.03798598051071167, 0.008122759871184826, -0.10703413188457489, 0.055704016238451004, 0.01725657284259796, -0.14580632746219635, 0.004064914304763079, 0.052065540105104446, -0.14050014317035675, -0.11850494146347046, -0.050304729491472244, 0.16408884525299072, -0.18754808604717255, -0.09950229525566101, -0.05138158053159714, -0.17415869235992432, 0.07301383465528488, 0.2138710916042328, 0.10926154255867004, 0.11173400282859802, -0.0235701035708189, -0.03778129070997238, 0.018226001411676407, 0.0019125016406178474, -0.08964530378580093, 0.028878556564450264, -0.09415813535451889, -0.016691507771611214, -0.02498309314250946, 0.08989764004945755, -0.08392511308193207, -0.03828836977481842, -0.19143259525299072, 0.009332017041742802, -0.11481951177120209, -0.024448299780488014, -0.07573544234037399, -0.020549777895212173, 0.05400250107049942, -0.0656658485531807, -0.0671086385846138, -0.06588862836360931, -0.12250944972038269, 0.012290109880268574, 0.02054733596742153, 0.0978413000702858, -0.0708303153514862, -0.038357652723789215, 0.0957302525639534, -0.015076510608196259, 0.05771274119615555, 0.05031917616724968, -0.04209275171160698, 0.10994651913642883, -0.20102961361408234, -0.04241598770022392, 0.12215075641870499, 0.014638763852417469, 0.057068414986133575, 0.03673385828733444, 0.017334995791316032, 0.10211128741502762, -0.00503113679587841, 0.07950539141893387, -0.028857531026005745, -0.10298755764961243, 0.06137651950120926, -0.049019020050764084, -0.1667880117893219, -0.004919488914310932, -0.02560497634112835, 0.0939222201704979, -0.02940257079899311, 0.19116105139255524, -0.06935888528823853, 0.028582332655787468, 0.008114442229270935, 0.018418535590171814, -0.009558253921568394, -0.1728295236825943, -0.08057502657175064, -0.11023139953613281, -0.04013517126441002, -0.010423344559967518, 0.30323129892349243, 0.07326645404100418, -0.041712936013936996, 0.0949232429265976, 0.02794552966952324, -0.04116090387105942, 0.02867046184837818, 0.2360372692346573, 0.10548676550388336, -0.026100989431142807, -0.14084672927856445, 0.06194138154387474, 0.02334718033671379, -0.053473059087991714, 0.09164490550756454, 0.09085378050804138, -0.054049428552389145, 0.09612438827753067, 0.024232562631368637, -0.004333114251494408, -0.1016736775636673, -0.0684235468506813, 0.0028753201477229595, 0.06099637225270271, 0.009583834558725357, 0.050581641495227814, 0.11739051342010498, -0.07317277789115906, 0.045753609389066696, -0.06142802536487579, -0.02356823720037937, -0.20728221535682678, -0.11137764155864716, -0.09830649197101593, -0.12419864535331726, 0.023397177457809448, -0.04714140295982361, 0.04094739258289337, 0.039046287536621094, 0.06615988910198212, -0.045690469443798065, 0.0254428181797266, -0.1330518126487732, -0.03907012566924095, 0.04184113070368767, 
-0.05329237878322601, 0.011321362107992172, -0.11696355044841766, -0.026673827320337296, -0.1504492610692978, -0.04652032628655434, -0.06236709654331207, 0.05531827732920647, 0.00949032325297594, -0.021140839904546738, -0.127137690782547, -0.07555141299962997, -0.016394637525081635, 0.07465636730194092, -0.06431366503238678, 0.13604921102523804, -0.00563928484916687, -0.003066364908590913, 0.07854371517896652, 0.1595800220966339, -0.051068637520074844, -0.08424512296915054, -0.03580842539668083, 0.2022244930267334, 0.08902614563703537, 0.17964838445186615, -0.05376878380775452, 0.0004702890873886645, -0.0625644326210022, 0.25407397747039795, 0.2996257245540619, -0.012568561360239983, 0.04563609138131142, 0.010801585391163826, 0.04117583483457565, 0.15152546763420105, 0.09815054386854172, 0.039381835609674454, 0.20692189037799835, -0.04305197671055794, -0.04831409081816673, -0.02779778651893139, -0.005823324900120497, -0.04820584878325462, 0.09917251765727997, 0.05146384984254837, -0.04514024779200554, -0.07556963711977005, 0.09715629369020462, -0.16532690823078156, 0.08950228989124298, 0.06025141105055809, -0.13605466485023499, -0.020713694393634796, -0.008054060861468315, 0.10415829718112946, -0.004107328597456217, 0.06664928048849106, -0.021958407014608383, -0.1304907649755478, 0.0253512654453516, 0.0021840145345777273, -0.2355223447084427, 0.007279625628143549, 0.03693337365984917, 0.017842972651124, 0.08278704434633255, -0.01572347991168499, 0.017072644084692, 0.09845101833343506, 0.0560404472053051, -0.012524357065558434, 0.07772362977266312, 0.0226216409355402, -0.0225543025881052, -0.0071214293129742146, -0.08920023590326309, 0.0007821321487426758, -0.021915331482887268, 0.07883857935667038, -0.17440299689769745, 0.08404375612735748, -0.03217814117670059, -0.15668627619743347, -0.04002344235777855, 0.05181194096803665, -0.07741820812225342, 0.0664617270231247, 0.07519728690385818, 0.019779182970523834, -0.03727012872695923, -0.038485221564769745, -0.0058923205360770226, 0.029327992349863052, -0.12245302647352219, -0.0741138830780983, -0.032266318798065186, -0.04216976463794708, 0.1330777406692505, 0.008275037631392479, -0.1854766309261322, -0.03441683575510979, -0.06112590804696083, 0.08186025172472, -0.12031108886003494, 0.08650004863739014, 0.09458015859127045, 0.027939448133111, -0.04141281545162201, -0.17268934845924377, 0.052752990275621414, 0.050273120403289795, -0.04033505171537399, -0.09938579797744751 ]
null
null
transformers
#EmbeddingSimilarityEvaluator: Evaluating the model on STS.en-en.txt dataset in epoch 2 after 26000 steps:

| Type      | Pearson | Spearman |
| --------- | ------- | -------- |
| Cosine    | 0.7650  | 0.8095   |
| Euclidean | 0.8089  | 0.8010   |
| Cosine    | 0.8075  | 0.7999   |
| Euclidean | 0.7531  | 0.7680   |
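The log line above appears to come from sentence-transformers' EmbeddingSimilarityEvaluator run on STS sentence pairs. As a minimal sketch (not the authors' evaluation script), the checkpoint named in this record can be loaded as a plain XLM-R encoder and two sentences compared by cosine similarity. Mean pooling over token embeddings is an assumption here, since the card does not state the pooling strategy used during training.

```python
# Minimal sketch: embed two sentences with the checkpoint from this record and
# compare them with cosine similarity. Mean pooling is an assumption.
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "NtDNlp/sentence-embedding-vietnamese"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

def embed(sentences):
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state         # (batch, seq_len, dim)
    mask = batch["attention_mask"].unsqueeze(-1).float()  # zero out padding positions
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)   # mean pooling

emb = embed(["A man is playing a guitar.", "Someone is playing an instrument."])
score = torch.nn.functional.cosine_similarity(emb[0], emb[1], dim=0)
print(f"cosine similarity: {score.item():.4f}")
```

Scores produced this way can be sanity-checked against the Pearson/Spearman correlations reported in the table, keeping in mind that those were computed over the full STS evaluation set.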
{}
feature-extraction
NtDNlp/sentence-embedding-vietnamese
[ "transformers", "pytorch", "xlm-roberta", "feature-extraction", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #xlm-roberta #feature-extraction #endpoints_compatible #region-us
#EmbeddingSimilarityEvaluator: Evaluating the model on URL dataset in epoch 2 after 26000 steps: Type: Cosine, Pearson: 0.7650, Spearman: 0.8095 Type: Euclidean, Pearson: 0.8089, Spearman: 0.8010 Type: Cosine, Pearson: 0.8075, Spearman: 0.7999 Type: Euclidean, Pearson: 0.7531, Spearman: 0.7680
[]
[ "TAGS\n#transformers #pytorch #xlm-roberta #feature-extraction #endpoints_compatible #region-us \n" ]
[ 33 ]
[ "passage: TAGS\n#transformers #pytorch #xlm-roberta #feature-extraction #endpoints_compatible #region-us \n" ]
[ -0.09157028794288635, -0.014434642158448696, -0.008438770659267902, 0.008568478748202324, 0.15428805351257324, 0.027176732197403908, -0.012182911857962608, 0.1135665774345398, 0.05065551772713661, -0.005177127197384834, 0.07933509349822998, 0.276661217212677, -0.02807416208088398, 0.045276034623384476, -0.08575616031885147, -0.23575440049171448, 0.0658026859164238, 0.09898689389228821, -0.0791340246796608, 0.08058097958564758, 0.06315360218286514, -0.09723333269357681, 0.07765084505081177, -0.02933439612388611, -0.1736798882484436, 0.04365864768624306, 0.02934623323380947, -0.08028046041727066, 0.09301559627056122, 0.037817176431417465, 0.13537608087062836, 0.018912747502326965, -0.09373614937067032, -0.16753792762756348, 0.02858855202794075, -0.017754023894667625, -0.06875209510326385, 0.029476162046194077, 0.06861914694309235, -0.11640989780426025, 0.047109007835388184, 0.057239461690187454, -0.011941466480493546, 0.02849241904914379, -0.14742204546928406, -0.16978125274181366, -0.08342111110687256, 0.037297967821359634, 0.01943832077085972, 0.10050526261329651, 0.02927526645362377, 0.17522893846035004, -0.13446080684661865, 0.1072564348578453, 0.24374951422214508, -0.2945135533809662, 0.003086104290559888, 0.07994884997606277, 0.15814892947673798, 0.02678254060447216, 0.009328767657279968, 0.060988884419202805, -0.0028941952623426914, 0.009946653619408607, -0.0014480497920885682, -0.1112838163971901, -0.03887246921658516, 0.077644944190979, -0.09693463146686554, -0.07392911612987518, 0.20173045992851257, -0.03195658326148987, 0.04930413141846657, 0.010199666023254395, -0.0888778418302536, -0.07041224092245102, -0.032620541751384735, 0.01763264834880829, -0.009467972442507744, 0.030963297933340073, 0.007224694360047579, -0.025397326797246933, -0.08577924966812134, 0.016078226268291473, -0.19281281530857086, 0.2674907147884369, 0.008604285307228565, 0.0870567038655281, -0.19132637977600098, 0.0442470908164978, -0.049770861864089966, -0.09369100630283356, -0.004138564690947533, -0.07667695730924606, 0.007322416175156832, 0.025517625734210014, -0.0624912828207016, -0.004670046269893646, 0.07623907923698425, 0.10568064451217651, -0.022045617923140526, 0.015914158895611763, 0.009878050535917282, 0.09981605410575867, 0.03406756743788719, 0.15131109952926636, -0.00609434861689806, -0.03809468820691109, 0.04368637874722481, -0.12904028594493866, -0.023145556449890137, -0.048145849257707596, -0.12358739227056503, -0.04894574359059334, 0.01668778993189335, 0.1252707540988922, 0.043346989899873734, 0.014602301642298698, -0.06446899473667145, -0.03432104364037514, 0.052449826151132584, -0.07974566519260406, 0.014087524265050888, -0.01577346958220005, 0.023837346583604813, 0.16563239693641663, -0.019986482337117195, -0.02894638106226921, -0.05215878039598465, 0.0349564328789711, -0.06953684240579605, 0.04432932287454605, -0.05924021080136299, -0.07949719578027725, 0.0280245803296566, -0.16228719055652618, 0.053883954882621765, -0.15369302034378052, -0.10276494175195694, 0.017045188695192337, 0.03498244285583496, -6.322925401036628e-7, 0.01669502817094326, 0.009836068376898766, -0.025286341086030006, -0.0075867450796067715, -0.05386420711874962, -0.05796147510409355, -0.05189244821667671, 0.09509318321943283, 0.005521012470126152, 0.05858064815402031, -0.09715135395526886, 0.09467590600252151, -0.07579662650823593, 0.04191499203443527, -0.15167520940303802, 0.025377949699759483, -0.0378120094537735, 0.16226524114608765, 0.010663220658898354, -0.07644085586071014, -0.07609470188617706, 
0.0503087155520916, -0.03763769567012787, 0.10555906593799591, -0.06293600797653198, -0.1200820580124855, 0.21545106172561646, -0.09735628962516785, -0.1822645366191864, 0.036747854202985764, -0.007191497832536697, -0.03305189311504364, 0.04239990934729576, 0.18002162873744965, 0.10842454433441162, -0.060018397867679596, 0.0676974505186081, 0.12016487121582031, -0.15359073877334595, -0.16805720329284668, 0.017982179298996925, -0.007843355648219585, -0.04883991554379463, 0.048394229263067245, -0.028009921312332153, 0.09140530228614807, -0.0701826810836792, -0.041819822043180466, -0.04769382253289223, -0.014392212964594364, 0.025493822991847992, 0.04651696979999542, 0.0906708836555481, -0.030273128300905228, 0.020094960927963257, 0.004497519228607416, 0.009590672329068184, -0.016986701637506485, 0.04359466955065727, -0.06628887355327606, 0.19542917609214783, -0.11182569712400436, 0.008203069679439068, -0.25517815351486206, -0.06376253068447113, 0.007905710488557816, 0.05069703608751297, -0.03662382438778877, 0.160138800740242, 0.06576819717884064, -0.06421902775764465, 0.039779625833034515, -0.037469763308763504, 0.07995011657476425, 0.018009260296821594, -0.03348680958151817, -0.041255924850702286, -0.0016580584924668074, -0.08449344336986542, -0.081076480448246, -0.003972908016294241, -0.009337268769741058, 0.03782891482114792, 0.06345246732234955, 0.016424067318439484, 0.04593527689576149, -0.051834095269441605, 0.07485006004571915, -0.03570108115673065, 0.006550376303493977, 0.08792943507432938, -0.005329515784978867, -0.05973991006612778, 0.16642674803733826, -0.12699344754219055, 0.34188342094421387, 0.19397006928920746, -0.28197991847991943, 0.0243122186511755, -0.024003464728593826, -0.011948001570999622, 0.03574485331773758, 0.08093426376581192, 0.006935557816177607, 0.07543458044528961, 0.0029287729412317276, 0.1465660035610199, -0.03894364833831787, -0.03705717995762825, 0.016377225518226624, -0.03784716874361038, -0.03934844955801964, 0.08012449741363525, 0.1126195639371872, -0.14468204975128174, 0.14134728908538818, 0.2064598947763443, 0.035313695669174194, 0.09887310862541199, -0.062338776886463165, -0.025687647983431816, -0.005918433424085379, 0.008060744032263756, -0.01725246012210846, 0.03825047239661217, -0.20588810741901398, -0.040142692625522614, 0.06791659444570541, 0.010934161953628063, 0.10956470668315887, -0.12803934514522552, -0.05218097195029259, 0.04629281163215637, -0.000018143975466955453, -0.09527771174907684, 0.08708560466766357, 0.05882610008120537, 0.06448328495025635, -0.0049484954215586185, -0.0526200495660305, 0.0783906802535057, 0.006376967765390873, -0.053735002875328064, 0.169996976852417, -0.11987955868244171, -0.2900390028953552, -0.12114602327346802, -0.1384693682193756, -0.010461380705237389, 0.0038338948506861925, 0.08462625741958618, -0.06082240864634514, -0.03980620577931404, 0.05897420272231102, 0.007668035104870796, -0.13416461646556854, 0.024728452786803246, -0.05857979133725166, 0.059553422033786774, -0.07520200312137604, -0.09249021857976913, -0.06867515295743942, -0.07081010192632675, -0.010653057135641575, 0.09957116097211838, -0.09654723107814789, 0.1354060173034668, 0.1395338624715805, 0.02545938454568386, 0.087888702750206, -0.0013984832912683487, 0.15354430675506592, -0.06029384955763817, -0.08724482357501984, 0.22722448408603668, -0.006049639079719782, 0.08341653645038605, 0.06873347610235214, 0.03364452347159386, -0.06565151363611221, -0.056719910353422165, -0.0708695501089096, -0.11662285774946213, -0.20178399980068207, 
-0.10964824259281158, -0.1620374321937561, -0.010827051475644112, -0.00958555843681097, 0.04914753884077072, 0.11961749941110611, 0.06335868686437607, 0.057046450674533844, -0.04477566480636597, -0.07448995113372803, 0.028996406123042107, 0.16418427228927612, 0.00033708522096276283, 0.09680595248937607, -0.06854651123285294, -0.08545834571123123, 0.06448549777269363, 0.06218265742063522, 0.29802972078323364, 0.08491279929876328, 0.040251828730106354, 0.06908250600099564, 0.16041521728038788, 0.13597674667835236, 0.15052038431167603, -0.00807341281324625, -0.037844058126211166, 0.0038100078236311674, 0.0011868904111906886, -0.027004439383745193, 0.004542199894785881, 0.1636507511138916, -0.10906607657670975, -0.09618408232927322, -0.15939804911613464, 0.0891093835234642, 0.07183791697025299, -0.015728969126939774, -0.17417675256729126, 0.027822207659482956, 0.05624439939856529, 0.01354428380727768, -0.02834417298436165, 0.03783726692199707, -0.03630293533205986, -0.12338398396968842, 0.03002735786139965, -0.0739092081785202, 0.11485813558101654, 0.01040785200893879, 0.0458032600581646, -0.014696195721626282, -0.08275720477104187, 0.06992755085229874, 0.06168152019381523, -0.17755208909511566, 0.28818225860595703, 0.0010776768904179335, -0.03309065103530884, -0.04832009598612785, 0.018684428185224533, 0.014944087713956833, 0.14009101688861847, 0.1266094297170639, 0.02193388342857361, -0.12298452854156494, -0.16515903174877167, 0.0377223938703537, 0.038609277456998825, 0.12266754359006882, -0.025210237130522728, 0.02144950069487095, -0.016437385231256485, -0.019283611327409744, -0.028228741139173508, 0.031099479645490646, 0.08500470221042633, -0.16053608059883118, 0.057867228984832764, -0.055192604660987854, -0.010253611020743847, -0.008276020176708698, -0.0015975366113707423, -0.0973401591181755, 0.19371484220027924, -0.0482957623898983, -0.050088025629520416, -0.10911639779806137, -0.1016123965382576, 0.11538902670145035, -0.0992407277226448, 0.09097609668970108, -0.0557364784181118, -0.026009967550635338, -0.05919942259788513, -0.21704941987991333, 0.10827456414699554, -0.10972194373607635, 0.05626622587442398, -0.016415603458881378, 0.17537477612495422, -0.0855942815542221, 0.002047858666628599, 0.028744900599122047, 0.01700819469988346, -0.11257088929414749, -0.10291898250579834, -0.007983984425663948, 0.024425728246569633, 0.045349933207035065, 0.09674669057130814, -0.05415003374218941, 0.03042156994342804, -0.013968405313789845, 0.04235779866576195, 0.238027885556221, 0.13260039687156677, -0.081050343811512, 0.12023235112428665, 0.061863165348768234, -0.046752579510211945, -0.2678137719631195, -0.05102682113647461, -0.1245017722249031, -0.024344833567738533, 0.005983533803373575, -0.09754978865385056, 0.11970007419586182, 0.04178072512149811, 0.0061156717129051685, 0.12806802988052368, -0.3023538291454315, -0.05469188839197159, 0.10397587716579437, 0.029445966705679893, 0.40126821398735046, -0.11451883614063263, -0.08166846632957458, 0.012355034239590168, -0.2298608273267746, 0.09162206202745438, 0.008213513530790806, 0.09003595262765884, -0.04353130981326103, 0.03724097087979317, 0.03336569294333458, -0.07317467033863068, 0.12736670672893524, 0.02876785583794117, 0.0666193962097168, -0.04467674344778061, -0.11722929775714874, 0.06057114899158478, -0.02094455063343048, 0.0161824282258749, 0.024035388603806496, 0.029045162722468376, -0.15672096610069275, -0.023032085970044136, -0.11144962906837463, 0.07227961719036102, 0.04478616639971733, -0.01266569085419178, -0.018691617995500565, 
-0.04695841297507286, 0.011912993155419827, 0.020060235634446144, 0.2160942554473877, -0.038461290299892426, 0.10543473064899445, 0.03610407933592796, 0.04496058076620102, -0.19266198575496674, -0.1895495057106018, -0.06662838906049728, -0.028375262394547462, 0.09013116359710693, -0.047899872064590454, 0.06039371341466904, 0.14011627435684204, -0.012957427650690079, 0.029076676815748215, 0.1274128407239914, 0.027251092717051506, 0.012349811382591724, 0.1174062117934227, -0.1547870934009552, -0.05593477562069893, -0.05171322450041771, -0.0983380675315857, 0.0967085063457489, 0.08383660018444061, 0.08954761922359467, 0.06399985402822495, -0.01915697008371353, -0.03851188346743584, -0.013776220381259918, -0.07221516966819763, 0.07092584669589996, 0.05034618824720383, 0.035641156136989594, -0.15711838006973267, 0.030383078381419182, -0.0195783544331789, -0.23101873695850372, -0.0542815700173378, 0.09496515244245529, -0.10178972780704498, -0.11261942237615585, -0.052628111094236374, 0.12744297087192535, -0.20438304543495178, -0.030553048476576805, -0.08389927446842194, -0.12247119098901749, 0.07140205055475235, 0.1996716409921646, 0.09291296452283859, 0.09381451457738876, -0.026995431631803513, -0.025332586839795113, -0.027394961565732956, -0.045219358056783676, 0.03323706239461899, 0.028400475159287453, -0.12072642892599106, 0.028618331998586655, -0.009208157658576965, 0.17973999679088593, -0.07451965659856796, -0.055330779403448105, -0.12891292572021484, 0.06127446889877319, -0.0688638687133789, -0.06085875257849693, -0.12701676785945892, -0.05312678962945938, 0.012904442846775055, -0.03842158243060112, -0.0587276890873909, 0.007930555380880833, -0.13261939585208893, 0.02393973432481289, -0.010352144949138165, -0.01354971807450056, -0.04743451252579689, -0.03934432566165924, 0.07245753705501556, -0.05704532936215401, 0.06459584832191467, 0.1876150369644165, -0.059510353952646255, 0.0971030443906784, -0.14307598769664764, -0.1627776175737381, 0.10507141053676605, 0.04869471862912178, 0.08942874521017075, 0.03158745542168617, 0.057064224034547806, 0.08430922031402588, -0.01589985005557537, 0.02428661286830902, -0.05962018668651581, -0.13984975218772888, -0.010094855912029743, -0.041708141565322876, -0.13131128251552582, -0.04694812372326851, -0.05041968822479248, 0.14894016087055206, 0.05147732049226761, 0.10631061345338821, 0.0025581233203411102, 0.12549357116222382, -0.06492336839437485, -0.012910562567412853, -0.011393092572689056, -0.17079751193523407, 0.017316443845629692, -0.05593439191579819, 0.021276040002703667, 0.004668292123824358, 0.2749950587749481, 0.022813264280557632, 0.05829356983304024, 0.0015209615230560303, 0.03751463443040848, 0.08761266618967056, 0.022042136639356613, 0.24564595520496368, 0.10393764823675156, -0.01967843435704708, -0.06262717396020889, 0.10663989931344986, 0.000004718029686046066, -0.004579533822834492, 0.12577253580093384, 0.13863039016723633, 0.07185038179159164, 0.09731139987707138, 0.06331025063991547, 0.013051485642790794, -0.090315081179142, -0.21163173019886017, -0.018207048997282982, 0.06831132620573044, 0.03271098807454109, 0.027676064521074295, 0.14665599167346954, -0.04393741115927696, 0.10100267827510834, 0.015544063411653042, -0.02847309410572052, -0.13193419575691223, -0.04813774675130844, -0.07593260705471039, -0.12523342669010162, 0.0017317291349172592, -0.08008270710706711, 0.00977280456572771, 0.09443707019090652, 0.001697806059382856, -0.02114385925233364, 0.0878748670220375, 0.04817131906747818, -0.05238121375441551, 0.0421157106757164, 
-0.022509947419166565, 0.003310480620712042, 0.06426757574081421, 0.032480064779520035, -0.09893295168876648, -0.10218894481658936, -0.025455856695771217, 0.034367628395557404, -0.08296329528093338, 0.04235530644655228, -0.1319851130247116, -0.10602080076932907, -0.037051331251859665, 0.04463726654648781, -0.05368301272392273, 0.13046328723430634, 0.0097217271104455, -0.013151773251593113, 0.0070860618725419044, 0.15246345102787018, -0.06242436170578003, -0.030061941593885422, -0.028347868472337723, 0.1624174565076828, 0.11403258889913559, 0.08360500633716583, -0.0077412910759449005, 0.0212560947984457, -0.04098131135106087, 0.28996819257736206, 0.271434485912323, -0.020641043782234192, 0.05410045385360718, 0.05744726583361626, 0.03310631588101387, 0.06775276362895966, 0.09417791664600372, 0.11430279910564423, 0.3603788912296295, -0.06611234694719315, -0.06718692183494568, -0.06855439394712448, 0.005274269729852676, -0.11255810409784317, 0.01809309422969818, 0.06907325983047485, -0.07307307422161102, -0.02384614385664463, 0.10789079964160919, -0.15020914375782013, 0.15259124338626862, 0.12083744257688522, -0.19798941910266876, -0.05029390752315521, -0.06107236072421074, 0.14985784888267517, 0.00792041327804327, 0.10508867353200912, -0.05493280664086342, -0.12400973588228226, 0.07309694588184357, 0.04096504673361778, -0.2857780456542969, -0.06028957664966583, 0.09112837165594101, 0.01440730132162571, -0.01960510015487671, -0.035745758563280106, 0.010477501899003983, 0.06880848854780197, 0.09663966298103333, -0.015710733830928802, 0.06643185764551163, 0.020357772707939148, -0.11587066948413849, -0.014265206642448902, 0.023404046893119812, 0.003186179092153907, -0.1041482463479042, 0.0247186329215765, -0.13606661558151245, 0.05573420971632004, -0.04557572677731514, -0.029489165171980858, 0.002504227217286825, -0.04794379696249962, -0.051298849284648895, 0.05269921198487282, 0.08508112281560898, 0.01495441421866417, -0.010993449948728085, -0.04659959673881531, 0.005371680948883295, 0.06713096797466278, -0.025611242279410362, -0.16179391741752625, -0.06893502920866013, -0.06976944953203201, 0.08596295118331909, -0.04104150831699371, -0.06852199882268906, -0.05451328679919243, -0.0590645931661129, 0.018176807090640068, -0.10612662136554718, 0.04986186325550079, 0.07486963272094727, 0.03801770135760307, 0.012747821398079395, -0.03861638158559799, 0.03760795667767525, 0.08371565490961075, -0.10995818674564362, -0.07640618085861206 ]
null
null
transformers
# Quran Speech Recognizer

This application will listen to the user's Quran recitation and take the user to the position in the Quran from which s/he recited. You can also take a look at our [presentation slides](https://docs.google.com/presentation/d/1dbbVYHi3LQRiggH14nN36YV2A-ddUAKg67aX5MWi0ys/edit?usp=sharing).

# Methodology

We used transfer learning to build our application. We fine-tuned the pretrained model available at https://huggingface.co/elgeish/wav2vec2-large-xlsr-53-arabic using the data available at https://www.kaggle.com/c/quran-asr-challenge/data. Our model can be found at https://huggingface.co/Nuwaisir/Quran_speech_recognizer.

# Usage

Run all the cells of run_ui.ipynb. The last cell will record your recitation for 5 seconds (changeable) from the time you run it, convert your speech to Arabic text, and show the most probable corresponding parts of the 30th juz' (Surah 78 - 114) of the Quran based on edit distance.

Currently, we search from Surah 78 to Surah 114, as the search algorithm needs some time to cover the whole Quran. This range can be changed in the 6th cell of the notebook.
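The card itself ships no code, so the following is a rough sketch of the pipeline it describes, assuming the checkpoint loads as a standard Wav2Vec2ForCTC model: transcribe a short recording, then rank candidate verses from Surah 78-114 by edit distance. The file name `recitation.wav`, the `candidates` dictionary, and the difflib-based scoring are placeholders and assumptions, not the notebook's actual code.

```python
# Sketch of the described pipeline: ASR with the fine-tuned wav2vec2 checkpoint,
# then nearest-verse lookup by string distance. Placeholders are marked below.
import difflib
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "Nuwaisir/Quran_speech_recognizer"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# 5 seconds of recitation, resampled to 16 kHz (placeholder file name)
speech, _ = librosa.load("recitation.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
transcript = processor.batch_decode(torch.argmax(logits, dim=-1))[0]

candidates = {
    "78:1": "...",  # verse id -> Arabic verse text; fill from any Quran text source
    "78:2": "...",
}

def distance(a, b):
    # similarity ratio as a stand-in for the notebook's edit-distance scoring
    return 1.0 - difflib.SequenceMatcher(None, a, b).ratio()

best = min(candidates, key=lambda vid: distance(transcript, candidates[vid]))
print(transcript, "->", best)
```

The real notebook records from the microphone for 5 seconds and searches all verses of Surah 78-114; swapping in a proper Levenshtein implementation would match the card's "edit distance value" more literally.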
{}
automatic-speech-recognition
Nuwaisir/Quran_speech_recognizer
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #endpoints_compatible #has_space #region-us
# Quran Speech Recognizer This application will listen to the user's Quran recitation and take the user to the position in the Quran from which s/he recited. You can also take a look at our presentation slides. # Methodology We used transfer learning to build our application. We fine-tuned the pretrained model available at URL using the data available at URL. Our model can be found at URL. # Usage Run all the cells of run_ui.ipynb. The last cell will record your recitation for 5 seconds (changeable) from the time you run it, convert your speech to Arabic text, and show the most probable corresponding parts of the 30th juz' (Surah 78 - 114) of the Quran based on edit distance. Currently, we search from Surah 78 to Surah 114, as the search algorithm needs some time to cover the whole Quran. This range can be changed in the 6th cell of the notebook.
[ "# Quran Speech Recognizer\nThis application will listen to the user's Quran recitation, and take the \nuser to the position of the Quran from where the s/he had recited.\nYou can also take a look at our presentation slides.", "# Methodology\nWe used transfer learning to make our application. We fine-tuned the pretrained\nmodel available at URL\nusing the data available at URL\nOur model can be found at URL", "# Usage\nRun all the cells of run_ui.ipynb. The last cell will hear your\nrecitation for 5 seconds (changeable) from the time you run that cell. And then convert your\nspeech to Arabic text and show the most probable corresponding parts of 30th juzz\n(Surah 78 - 114) of the Quran as the output based on edit distance value.\n\nCurrently, we are searching from Surah 78 to Surah 114 as the searching\nalgorithm needs some time to search the whole Quran. This range can be changed\nin the 6th cell of the notebook." ]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #endpoints_compatible #has_space #region-us \n", "# Quran Speech Recognizer\nThis application will listen to the user's Quran recitation, and take the \nuser to the position of the Quran from where the s/he had recited.\nYou can also take a look at our presentation slides.", "# Methodology\nWe used transfer learning to make our application. We fine-tuned the pretrained\nmodel available at URL\nusing the data available at URL\nOur model can be found at URL", "# Usage\nRun all the cells of run_ui.ipynb. The last cell will hear your\nrecitation for 5 seconds (changeable) from the time you run that cell. And then convert your\nspeech to Arabic text and show the most probable corresponding parts of 30th juzz\n(Surah 78 - 114) of the Quran as the output based on edit distance value.\n\nCurrently, we are searching from Surah 78 to Surah 114 as the searching\nalgorithm needs some time to search the whole Quran. This range can be changed\nin the 6th cell of the notebook." ]
[ 41, 51, 38, 119 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #endpoints_compatible #has_space #region-us \n# Quran Speech Recognizer\nThis application will listen to the user's Quran recitation, and take the \nuser to the position of the Quran from where the s/he had recited.\nYou can also take a look at our presentation slides.# Methodology\nWe used transfer learning to make our application. We fine-tuned the pretrained\nmodel available at URL\nusing the data available at URL\nOur model can be found at URL# Usage\nRun all the cells of run_ui.ipynb. The last cell will hear your\nrecitation for 5 seconds (changeable) from the time you run that cell. And then convert your\nspeech to Arabic text and show the most probable corresponding parts of 30th juzz\n(Surah 78 - 114) of the Quran as the output based on edit distance value.\n\nCurrently, we are searching from Surah 78 to Surah 114 as the searching\nalgorithm needs some time to search the whole Quran. This range can be changed\nin the 6th cell of the notebook." ]
[ -0.014495238661766052, -0.02467707358300686, -0.0014711113180965185, -0.07258910685777664, 0.00033524527680128813, -0.048160482197999954, 0.0017960863187909126, 0.10214661806821823, -0.09148456156253815, 0.06686311960220337, 0.02448047138750553, -0.051348961889743805, 0.13454052805900574, -0.05722960829734802, 0.038345348089933395, -0.11695335805416107, 0.10802637040615082, -0.00567608792334795, 0.15303967893123627, 0.13442856073379517, 0.060165077447891235, -0.03483301028609276, -0.0029704980552196503, 0.0380047969520092, -0.06348663568496704, -0.007159021217375994, -0.16005131602287292, -0.11765824258327484, 0.07803124934434891, 0.009916466660797596, 0.011834613978862762, 0.037303823977708817, 0.025429241359233856, -0.048047397285699844, 0.041632845997810364, 0.08134987950325012, 0.02132425457239151, -0.04318922013044357, 0.042274367064237595, 0.049566157162189484, 0.13275720179080963, 0.10185258835554123, -0.09497659653425217, -0.005494275130331516, -0.06981221586465836, -0.049466826021671295, -0.07926531881093979, 0.012032102793455124, 0.2242409586906433, 0.0643196851015091, -0.03209734335541725, 0.06346108019351959, -0.07058582454919815, 0.07192105054855347, 0.29914796352386475, -0.3548283576965332, 0.003062300616875291, -0.024839086458086967, 0.24416513741016388, 0.048644497990608215, -0.08986140042543411, 0.08877362310886383, -0.004491645842790604, -0.042476117610931396, -0.02599886804819107, -0.06886959820985794, -0.09343212842941284, -0.029429525136947632, -0.12364813685417175, -0.06801346689462662, 0.15535995364189148, -0.009401761926710606, -0.027997087687253952, 0.027767067775130272, -0.059412021189928055, -0.0019460208714008331, -0.04503604397177696, -0.018659230321645737, -0.05917851999402046, -0.023404436185956, 0.07218052446842194, 0.04063287004828453, -0.07215199619531631, -0.18686911463737488, -0.16036531329154968, 0.03148626163601875, 0.0494442842900753, -0.005483162589371204, -0.0735919401049614, 0.046526845544576645, 0.04004896804690361, -0.023975925520062447, -0.09251201897859573, 0.040885865688323975, -0.11483296006917953, 0.005688066594302654, -0.06810705363750458, -0.11017166823148727, -0.08560016006231308, 0.07016727328300476, 0.06997272372245789, 0.0804140716791153, 0.0074259755201637745, 0.05712618678808212, -0.0231766514480114, 0.032133687287569046, -0.07275348156690598, -0.08281931281089783, -0.08209969848394394, -0.028172018006443977, -0.03796094283461571, -0.011503459885716438, 0.006268182769417763, -0.08272431790828705, 0.02636178396642208, 0.03480326384305954, -0.012416421435773373, 0.08053787797689438, 0.08098608255386353, -0.02372681349515915, 0.12846536934375763, -0.11544687300920486, -0.018213849514722824, 0.036766357719898224, -0.08362844586372375, -0.09104383736848831, 0.0674566775560379, -0.036981306970119476, -0.20551469922065735, -0.06688614934682846, 0.024368245154619217, 0.02223825268447399, -0.026958050206303596, -0.03460855409502983, -0.008134860545396805, 0.03520819544792175, -0.0830417349934578, -0.18333353102207184, -0.1832713633775711, -0.02373623102903366, -0.026804538443684578, -0.04899243637919426, 0.11014793813228607, 0.009633616544306278, -0.014923645183444023, -0.01824445091187954, -0.08850187808275223, 0.07736843824386597, -0.06542336195707321, 0.1420089602470398, 0.002940176520496607, 0.1700194627046585, 0.08869948238134384, 0.0818331241607666, -0.03308122605085373, -0.005107794422656298, -0.043356962502002716, 0.08873485028743744, -0.10345225781202316, -0.014505415223538876, -0.14392124116420746, -0.01862778700888157, 
-0.1750982105731964, 0.017805766314268112, 0.10031019151210785, 0.02014222927391529, -0.16116271913051605, 0.012342190369963646, 0.18217328190803528, -0.025161439552903175, -0.10524207353591919, 0.11915594339370728, 0.03906803950667381, 0.02589355781674385, -0.010388973169028759, 0.27238327264785767, 0.024033382534980774, 0.004077034071087837, -0.07531557232141495, 0.032136980444192886, -0.06986935436725616, -0.047897957265377045, 0.06282909214496613, 0.01822345331311226, 0.09879960864782333, -0.029763204976916313, 0.10714956372976303, 0.0332852266728878, 0.0061928327195346355, -0.0867709144949913, 0.06515498459339142, -0.029861586168408394, -0.022688280791044235, 0.03635890409350395, 0.11076290905475616, -0.0017620960716158152, 0.012655671685934067, -0.0316188670694828, 0.09810079634189606, -0.023557627573609352, 0.07969553023576736, -0.10287421941757202, 0.11852896958589554, -0.07523064315319061, 0.14111807942390442, -0.15462297201156616, 0.20417654514312744, -0.009081186726689339, 0.04312574863433838, -0.0482521653175354, -0.0422477200627327, 0.03988246992230415, 0.004822694230824709, 0.02377401851117611, -0.005712658166885376, 0.1463501751422882, 0.009470994584262371, -0.13269191980361938, 0.07440152019262314, -0.03558465093374252, -0.029823921620845795, -0.06405624002218246, -0.02716212533414364, 0.0014754069270566106, 0.016695713624358177, 0.08764857053756714, -0.08270776271820068, -0.0008052782504819334, 0.0846930667757988, 0.03758670389652252, -0.006642238702625036, -0.03010784089565277, 0.007518288679420948, 0.006798821967095137, 0.060896873474121094, 0.18536314368247986, -0.17056617140769958, -0.008532471023499966, 0.06398320943117142, -0.10874221473932266, -0.041673172265291214, 0.09403683245182037, 0.011220588348805904, -0.0808824971318245, 0.008232491090893745, -0.059994980692863464, 0.08053810149431229, 0.049801524728536606, 0.12292765080928802, -0.06382235139608383, -0.05102524161338806, -0.007667262572795153, -0.03938286378979683, 0.008220529183745384, 0.08898290991783142, 0.009569712914526463, -0.20009709894657135, 0.01800462417304516, 0.08076558262109756, 0.011211500503122807, -0.0019015241414308548, 0.002646382199600339, -0.09770456701517105, 0.08537507802248001, 0.0958409309387207, 0.01919993944466114, -0.02105795033276081, -0.04586660861968994, -0.11394886672496796, 0.03285543620586395, 0.021469615399837494, -0.0016303503653034568, -0.0379491001367569, 0.0735522210597992, -0.006365874316543341, -0.051622044295072556, -0.05857419595122337, 0.014062461443245411, -0.0435003936290741, 0.06565280258655548, 0.020623022690415382, -0.07921663671731949, -0.06339188665151596, -0.03526683896780014, -0.04966495931148529, 0.16436737775802612, -0.01690255105495453, -0.22018156945705414, -0.005977424327284098, -0.046260230243206024, 0.015177332796156406, 0.03386931121349335, 0.10343430936336517, -0.05383550375699997, 0.00813344493508339, -0.05339104309678078, -0.10138889402151108, -0.004307810682803392, 0.06708445399999619, 0.046341195702552795, -0.10796856135129929, 0.015666907653212547, -0.12020774185657501, 0.04611058905720711, -0.043598830699920654, 0.008336273021996021, 0.08646449446678162, -0.124296635389328, 0.09632717072963715, 0.06710207462310791, -0.08196061104536057, 0.01358597632497549, 0.008632545359432697, 0.06940023601055145, -0.06927626579999924, 0.042106349021196365, 0.08129789680242538, 0.036166757345199585, 0.034984175115823746, 0.06632590293884277, -0.05949145182967186, -0.0866141989827156, -0.027955802157521248, 0.05149197578430176, -0.02705470100045204, 
-0.08038048446178436, -0.017199115827679634, -0.02554471231997013, -0.06672857701778412, -0.005053372122347355, 0.05768536403775215, -0.026832226663827896, -0.03751871734857559, -0.04825993999838829, 0.034787073731422424, 0.00003350711631355807, 0.03583665192127228, 0.013717607595026493, -0.02705715224146843, 0.09579417109489441, -0.09070135653018951, -0.022642534226179123, 0.012025238014757633, 0.06714625656604767, 0.16433921456336975, -0.04230216518044472, 0.13511592149734497, 0.03581712767481804, 0.05899733304977417, 0.09356052428483963, 0.08904637396335602, -0.09594669193029404, -0.034165892750024796, -0.03335358574986458, -0.011309542693197727, -0.07937031984329224, 0.011959616094827652, 0.04114681854844093, -0.08642314374446869, 0.0352417528629303, -0.11897743493318558, 0.07219848036766052, 0.1831393539905548, 0.015945250168442726, -0.16685403883457184, -0.04221015051007271, -0.026754578575491905, -0.14087888598442078, -0.09555613994598389, 0.031589310616254807, -0.038493409752845764, -0.01700659841299057, -0.027789682149887085, -0.010396459139883518, 0.11979825794696808, -0.04347722977399826, 0.0560576468706131, -0.18738940358161926, -0.18371237814426422, -0.05371064692735672, 0.10687828809022903, -0.15851598978042603, 0.1713438332080841, 0.0384037010371685, 0.08623525500297546, -0.04022696614265442, -0.04034063220024109, 0.0834471583366394, 0.04843928664922714, 0.07966125011444092, 0.026229579001665115, -0.051711343228816986, -0.13054877519607544, -0.07428950071334839, 0.023190714418888092, 0.05761687830090523, 0.061876434832811356, 0.0493648536503315, 0.01484582107514143, 0.0455566830933094, -0.0762631967663765, 0.09856455028057098, 0.04283585026860237, -0.01625262387096882, 0.07566283643245697, -0.0021700977813452482, 0.1341455727815628, -0.0036112868692725897, 0.05219726264476776, -0.06134754791855812, 0.11605144292116165, -0.10702744871377945, 0.0007624598802067339, -0.05986356362700462, -0.12200597673654556, 0.05058750510215759, -0.054488155990839005, -0.0680331140756607, 0.002167587634176016, -0.07224636524915695, 0.017431534826755524, -0.06662141531705856, 0.004033591598272324, -0.05994285270571709, 0.024814274162054062, -0.014964129775762558, 0.0657653734087944, -0.04195353016257286, 0.029019564390182495, 0.05072895064949989, -0.0035815800074487925, -0.10085658729076385, -0.04683700576424599, 0.00895871501415968, -0.06699935346841812, -0.014534419402480125, -0.09089579433202744, 0.07143285870552063, 0.09843356162309647, -0.03244500979781151, -0.016357669606804848, 0.014634508639574051, 0.13320055603981018, 0.020573953166604042, 0.15846574306488037, 0.31926438212394714, 0.026273582130670547, -0.2690460681915283, -0.25050756335258484, 0.031089525669813156, 0.03786250576376915, -0.06160476803779602, -0.06144719198346138, 0.030061958357691765, -0.02148800529539585, -0.05665094405412674, -0.11601197719573975, -0.20115646719932556, -0.04493565112352371, 0.19457969069480896, -0.036891501396894455, 0.30351972579956055, -0.16213876008987427, -0.04906921088695526, -0.06039614602923393, -0.015322664752602577, -0.06544600427150726, -0.15964113175868988, 0.18247507512569427, -0.006499532610177994, 0.033522557467222214, -0.005356841720640659, -0.015375331044197083, 0.11955947428941727, 0.029483841732144356, 0.0006491243839263916, -0.12419336289167404, 0.13057935237884521, 0.010752005502581596, -0.013592446222901344, 0.11346814781427383, -0.08773595839738846, 0.0867953822016716, -0.19850073754787445, -0.04229998216032982, -0.02793174982070923, 0.015766190364956856, 0.08287301659584045, 
0.03104153461754322, -0.018132053315639496, -0.018956491723656654, 0.09787366539239883, 0.050579983741045, -0.023515265434980392, 0.030942697077989578, -0.007290084846317768, 0.1256633996963501, 0.0766339898109436, -0.05890601873397827, 0.013863898813724518, -0.05809852108359337, 0.010873891413211823, 0.1701059192419052, -0.11880674958229065, -0.03343161195516586, 0.027821123600006104, 0.005550074391067028, 0.033987049013376236, 0.03627690300345421, -0.07714705169200897, 0.04540560394525528, 0.1003473773598671, -0.04575244337320328, -0.1496783196926117, 0.04667583853006363, 0.003157454775646329, -0.0004331230011302978, 0.04488668218255043, 0.024519408121705055, 0.023615458980202675, 0.015377080999314785, 0.05103975534439087, -0.0007905041566118598, -0.019510231912136078, 0.2657109200954437, -0.045825403183698654, 0.10812319070100784, -0.04321649298071861, 0.07075825333595276, 0.12167742103338242, 0.02097751572728157, 0.10100377351045609, 0.1450299769639969, -0.12510088086128235, -0.11119112372398376, -0.12466324865818024, -0.0510159395635128, 0.030521102249622345, -0.04856692627072334, -0.022717811167240143, 0.016408031806349754, 0.05638120695948601, 0.13272008299827576, 0.026012690737843513, -0.015490776859223843, -0.038337141275405884, 0.012110644951462746, 0.04987470433115959, 0.023729782551527023, 0.033874619752168655, 0.027179555967450142, 0.02432572841644287, 0.0801176205277443, 0.030256301164627075, 0.1298699527978897, -0.057383209466934204, -0.04407937824726105, -0.09322593361139297, 0.08596668392419815, -0.1924397051334381, -0.08978458493947983, -0.09968768060207367, 0.008395658805966377, 0.019389696419239044, 0.03819514438509941, -0.017551667988300323, 0.00591290881857276, -0.031338609755039215, -0.013459156267344952, -0.07078709453344345, 0.05496377870440483, -0.12527364492416382, 0.007382077630609274, 0.03466399013996124, -0.012226331047713757, 0.052838973701000214, 0.12393142282962799, -0.00825108028948307, 0.02772487699985504, -0.046378251165151596, -0.15903405845165253, 0.05862253159284592, 0.05553734675049782, -0.04296122491359711, 0.027642803266644478, -0.0008689876412972808, 0.03896578401327133, 0.012926924042403698, -0.005533718504011631, 0.047333527356386185, -0.05682280287146568, -0.05397289618849754, -0.26675674319267273, 0.03126753121614456, -0.039734847843647, 0.04621444270014763, -0.000651587441097945, 0.15936996042728424, 0.03924969956278801, -0.015543212182819843, -0.06423962861299515, -0.0068032750859856606, 0.003591321175917983, -0.02447141706943512, -0.12834173440933228, 0.04878910630941391, -0.12843693792819977, 0.04919080436229706, -0.020320598036050797, 0.033666543662548065, -0.013361725024878979, -0.011042403057217598, -0.03308187797665596, 0.0031707643065601587, -0.051528483629226685, -0.057549506425857544, 0.11423888802528381, 0.15789195895195007, -0.03558438643813133, 0.012806226499378681, 0.016531193628907204, -0.03283192962408066, 0.13704974949359894, -0.07255173474550247, 0.17084284126758575, 0.0077896504662930965, 0.06622408330440521, -0.0781862884759903, -0.09352003037929535, -0.05341272056102753, -0.03157205879688263, -0.0981445387005806, 0.013273108750581741, -0.010341343469917774, 0.0328526608645916, 0.25677862763404846, 0.014161218889057636, 0.018433719873428345, -0.03807016462087631, -0.010775680653750896, -0.01621275022625923, -0.08441729098558426, 0.010627358220517635, -0.013776366598904133, -0.005531081929802895, -0.0012973390985280275, 0.07394368201494217, 0.05934387817978859, 0.024761896580457687, 0.050121501088142395, 0.2269168347120285, 
0.005795030388981104, -0.10527727752923965, -0.03632725030183792, -0.10481192916631699, -0.014930219389498234, -0.03851546347141266, 0.006638192106038332, 0.20628011226654053, -0.02030225656926632, 0.085246242582798, 0.036002617329359055, -0.0351136177778244, 0.10119561851024628, -0.08849916607141495, -0.11183089017868042, -0.035537704825401306, 0.023769285529851913, 0.010698876343667507, 0.13647127151489258, 0.15639416873455048, -0.054571110755205154, -0.010767069645226002, 0.07154347747564316, -0.03610505163669586, -0.09394574165344238, -0.03719697147607803, 0.12497609853744507, 0.07713788002729416, -0.01616010069847107, -0.07627822458744049, -0.051032550632953644, -0.09045649319887161, 0.24491672217845917, -0.07715609669685364, -0.05239547789096832, -0.004636179190129042, 0.008259115740656853, 0.018077315762639046, 0.030139096081256866, 0.025628307834267616, 0.10028715431690216, 0.2890278995037079, -0.017124315723776817, -0.05507255718111992, -0.0015671094879508018, -0.11456701904535294, -0.16813868284225464, 0.06695497781038284, 0.02371424436569214, 0.0006095845019444823, -0.04063072055578232, 0.011025737039744854, -0.0487714447081089, -0.2057846486568451, -0.14179646968841553, 0.07273697108030319, -0.040453992784023285, 0.04819022864103317, -0.035221509635448456, 0.0991540253162384, -0.05094915255904198, 0.009679307229816914, 0.10784643143415451, -0.001956528052687645, 0.05108852684497833, -0.05744416266679764, 0.001366952434182167, 0.05421050637960434, -0.060813069343566895, -0.004411643836647272, -0.017426740378141403, 0.03407853841781616, 0.03404609113931656, 0.07248461246490479, -0.04392937198281288, 0.194440096616745, -0.006658373400568962, -0.08938293904066086, 0.06625507771968842, 0.1544683426618576, 0.040392130613327026, -0.057656023651361465, 0.046160850673913956, 0.012083779089152813, 0.03388846293091774, 0.007272602524608374, 0.12919089198112488, -0.022832294926047325, 0.00735006807371974, 0.057792242616415024, 0.13173501193523407, 0.11712498962879181, 0.012455307878553867, 0.07991065829992294, -0.05821295082569122, -0.07646292448043823, -0.025691227987408638, -0.2318032830953598, -0.06605840474367142, -0.10206213593482971, -0.016042418777942657, -0.16571827232837677, 0.041554514318704605, -0.10495412349700928, -0.008955815806984901, 0.025575796142220497, -0.004891280550509691, 0.047671809792518616, 0.06364115327596664, 0.08267094939947128, -0.02033579908311367, -0.02953103370964527, -0.2253994643688202, 0.03174375742673874, 0.061035122722387314, -0.20555049180984497, -0.10508459061384201 ]
null
null
transformers
# 707 DialoGPT Model
{"tags": ["conversational"]}
text-generation
Obscurity/DialoGPT-Medium-707
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# 707 DialoGPT Model
[ "# 707 DialoGPT Model" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# 707 DialoGPT Model" ]
[ 51, 8 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# 707 DialoGPT Model" ]
[ -0.03814740851521492, 0.047702495008707047, -0.005425239447504282, 0.005011532921344042, 0.13169705867767334, 0.003489617258310318, 0.1488846093416214, 0.13061212003231049, 0.005398031324148178, -0.04815620556473732, 0.12325767427682877, 0.17506550252437592, 0.012569879181683064, 0.09202606230974197, -0.07788997888565063, -0.3034251630306244, 0.06144910305738449, 0.05817699432373047, 0.04801405221223831, 0.1250361204147339, 0.08725110441446304, -0.03765939176082611, 0.0715523213148117, -0.0060875252820551395, -0.1390467882156372, 0.022341221570968628, 0.02643280103802681, -0.12091650813817978, 0.11626829206943512, 0.06473761796951294, 0.03160535544157028, 0.025061078369617462, -0.03410205990076065, -0.1513414978981018, 0.025810744613409042, -0.013062338344752789, -0.04486648738384247, 0.04834067448973656, 0.019878320395946503, -0.054455049335956573, 0.14087626338005066, 0.10981094837188721, -0.008664632216095924, 0.04923328384757042, -0.15837179124355316, -0.013412704691290855, -0.013180687092244625, 0.06462129205465317, 0.08546609431505203, 0.11209794133901596, -0.04827703535556793, 0.1137392446398735, -0.06984450668096542, 0.11710605770349503, 0.07795781642198563, -0.33485716581344604, -0.0049672117456793785, 0.13097263872623444, 0.029173851013183594, 0.049042437225580215, -0.03221341595053673, 0.08334516733884811, 0.019424866884946823, -0.006518014706671238, -0.05017773061990738, -0.07041703909635544, -0.052237313240766525, 0.003921364899724722, -0.09638561308383942, -0.03195395693182945, 0.24064911901950836, -0.008918354287743568, 0.057528991252183914, -0.09215083718299866, -0.10365142673254013, -0.041563134640455246, -0.05568842589855194, -0.032840460538864136, -0.07571757584810257, 0.0801122784614563, 0.005693536717444658, -0.08806464821100235, -0.12624500691890717, -0.04340454190969467, -0.18564468622207642, 0.12793703377246857, 0.028430737555027008, 0.042358800768852234, -0.20636124908924103, 0.09429289400577545, -0.04729009047150612, -0.0957656279206276, 0.010294853709638119, -0.09400492161512375, 0.0013887900859117508, 0.017199968919157982, -0.024938587099313736, -0.034811798483133316, 0.06782816350460052, 0.10633233189582825, -0.006384590640664101, 0.042572036385536194, -0.030905118212103844, 0.054790958762168884, 0.031630516052246094, 0.08518126606941223, -0.014195557683706284, -0.08138741552829742, 0.024953998625278473, -0.09111501276493073, 0.00655670091509819, -0.05343622714281082, -0.18039627373218536, -0.04177691787481308, 0.07382088154554367, 0.054175253957509995, 0.02455764263868332, 0.12123385071754456, 0.00014122610446065664, -0.037756357342004776, 0.06628470867872238, -0.02903106063604355, -0.015092344023287296, 0.009051046334207058, -0.000887855130713433, 0.12956060469150543, -0.0017196497647091746, 0.0565900057554245, -0.13677799701690674, -0.014949111267924309, -0.05534999445080757, -0.0028484619688242674, -0.0305305365473032, -0.04788506403565407, -0.00497537711635232, -0.006039528641849756, 0.001365226460620761, -0.1363246887922287, -0.1914009004831314, -0.002338531892746687, -0.010634020902216434, -0.05551453307271004, -0.08695131540298462, -0.08917472511529922, -0.008421022444963455, 0.04466259852051735, -0.06466483324766159, -0.01846252754330635, -0.05715267360210419, 0.0815618634223938, -0.0065156277269124985, 0.08011393994092941, -0.12368813902139664, 0.07227775454521179, -0.08756445348262787, -0.0298701710999012, -0.09305662661790848, 0.12357814610004425, 0.007412245962768793, 0.05980556830763817, -0.026554297655820847, -0.02942974492907524, 
-0.09838088601827621, 0.053278036415576935, -0.02049335651099682, 0.2373940348625183, -0.08859286457300186, -0.09248322993516922, 0.2875031530857086, -0.05378968268632889, -0.11597070097923279, 0.12743060290813446, 0.004042006563395262, 0.05424236133694649, 0.1252031922340393, 0.21040594577789307, 0.0012869356432929635, -0.005392749328166246, 0.08762586861848831, 0.10875996947288513, -0.09004867076873779, -0.03123306855559349, 0.01043562963604927, -0.035366158932447433, -0.09340333938598633, 0.0383024737238884, 0.1019144207239151, 0.06416534632444382, -0.060597434639930725, -0.023176128044724464, -0.007394767832010984, -0.003436911152675748, 0.0708475112915039, -0.012706268578767776, 0.1292622834444046, -0.03748587891459465, -0.05508001148700714, -0.018939156085252762, 0.019289463758468628, -0.05695752799510956, 0.028315823525190353, -0.06804588437080383, 0.0822295993566513, -0.021168602630496025, 0.05855930596590042, -0.1397143453359604, -0.04653678461909294, -0.027500275522470474, 0.18590600788593292, 0.03782182186841965, 0.1041734367609024, 0.06263420730829239, -0.0456230565905571, -0.02230233885347843, 0.019399218261241913, 0.1566445231437683, -0.013742412440478802, -0.08645506203174591, -0.07220175862312317, 0.07964664697647095, -0.05174609646201134, 0.10874078422784805, -0.08107508718967438, 0.010472740046679974, -0.014989896677434444, 0.11241283267736435, -0.023094497621059418, 0.02047601155936718, 0.016168322414159775, -0.015369515866041183, -0.040620651096105576, 0.008487372659146786, 0.10274770110845566, -0.007463999092578888, -0.07630512118339539, 0.20020079612731934, -0.18376164138317108, 0.16540087759494781, 0.1792450249195099, -0.20397037267684937, -0.0013108088169246912, -0.14260978996753693, -0.033188119530677795, 0.008581594564020634, 0.06207462027668953, -0.04679495841264725, 0.19465501606464386, -0.020058009773492813, 0.18071234226226807, -0.051969096064567566, -0.025479396805167198, -0.015429477207362652, -0.056495629251003265, 0.013069234788417816, 0.08019649237394333, 0.09133318066596985, -0.19192957878112793, 0.16328026354312897, 0.09689699113368988, 0.058085814118385315, 0.20343472063541412, 0.028677767142653465, -0.013555338606238365, 0.059273503720760345, 0.003736877581104636, -0.05949626490473747, -0.06337324529886246, -0.29961276054382324, -0.027568325400352478, 0.0766916498541832, 0.054277319461107254, 0.12237471342086792, -0.0975876972079277, -0.026727575808763504, -0.010473230853676796, -0.026780759915709496, 0.02685079164803028, 0.11036317050457001, 0.03430292755365372, 0.12910203635692596, -0.010791992768645287, -0.06759759038686752, 0.07091113179922104, 0.008779973722994328, -0.09327147156000137, 0.19353324174880981, -0.1198665052652359, -0.36111077666282654, -0.11660443991422653, -0.17558468878269196, -0.06110508367419243, 0.0502951480448246, 0.0912264734506607, -0.10801374167203903, -0.026481308043003082, -0.013964330777525902, 0.08739464730024338, -0.09728474169969559, 0.009580956771969795, -0.02038842998445034, 0.0007905493839643896, -0.11106420308351517, -0.09731122851371765, -0.06247077137231827, -0.05684738606214523, -0.06912357360124588, 0.11099637299776077, -0.15249384939670563, 0.010237379930913448, 0.231130450963974, 0.04951314255595207, 0.06424705684185028, -0.03569592535495758, 0.20955641567707062, -0.1114339530467987, 0.0035544957499951124, 0.18245014548301697, -0.017204513773322105, 0.05587226152420044, 0.14681991934776306, -0.005018712021410465, -0.09007003903388977, 0.0363738089799881, -0.027823207899928093, -0.06678411364555359, 
-0.19835463166236877, -0.1410980522632599, -0.12260543555021286, 0.07731389999389648, 0.014894218184053898, 0.04516807571053505, 0.17494644224643707, 0.062436655163764954, -0.03794039040803909, -0.002503181342035532, 0.053082916885614395, 0.08156009018421173, 0.27569660544395447, -0.07220512628555298, 0.14478498697280884, -0.018193155527114868, -0.14618994295597076, 0.06779585033655167, 0.051215510815382004, 0.0830565020442009, 0.07920318096876144, 0.03693799301981926, -0.00380401941947639, 0.05617306008934975, 0.13516272604465485, 0.08319909125566483, 0.025353020057082176, -0.05513288453221321, -0.031313706189394, -0.03481147065758705, -0.04274556413292885, 0.03337058424949646, 0.08072949945926666, -0.1572171151638031, -0.028572866693139076, -0.04258505254983902, 0.051187556236982346, 0.033659398555755615, 0.11231008917093277, -0.18149036169052124, -0.014015658758580685, 0.05604655295610428, -0.06155093386769295, -0.1344318389892578, 0.08714459836483002, -0.01639288105070591, -0.1325443983078003, 0.052106961607933044, -0.005566684994846582, 0.1162874773144722, -0.08319166302680969, 0.07765673100948334, -0.1310533583164215, -0.06624377518892288, 0.0009540609898976982, 0.11463567614555359, -0.2752595841884613, 0.21610340476036072, -0.004262178670614958, -0.043032556772232056, -0.11538456380367279, -0.00534633407369256, -0.001526739913970232, 0.0966174528002739, 0.10699179768562317, -0.009633141569793224, 0.07578357309103012, 0.003273531561717391, -0.06115957349538803, 0.019907288253307343, 0.10310720652341843, -0.003199709812179208, -0.018685726448893547, -0.031525276601314545, -0.0073226396925747395, -0.011910300701856613, -0.05131585896015167, -0.029334358870983124, -0.1779872626066208, 0.0751258060336113, 0.07116835564374924, 0.08638615906238556, 0.0288879182189703, -0.025042660534381866, -0.09523476660251617, 0.2704463005065918, -0.0036576909478753805, -0.09307774156332016, -0.096050925552845, -0.012193229049444199, 0.07487604022026062, -0.0525234080851078, 0.034957144409418106, -0.06658442318439484, 0.015363309532403946, -0.05351056903600693, -0.17905758321285248, 0.12352399528026581, -0.09181366860866547, -0.052483655512332916, -0.01892678625881672, 0.2326752245426178, -0.036798007786273956, 0.01676112972199917, 0.05436434596776962, 0.0017534304643049836, -0.09561426937580109, -0.10097780078649521, 0.002199915237724781, 0.031189631670713425, 0.005334150977432728, 0.035265062004327774, -0.00833794753998518, -0.06615617126226425, -0.04221213981509209, -0.021146532148122787, 0.3233482837677002, 0.14635787904262543, -0.03234918415546417, 0.16057302057743073, 0.13555952906608582, -0.06238820031285286, -0.26021167635917664, -0.11563245207071304, -0.0555974543094635, -0.029319163411855698, -0.10047910362482071, -0.16618961095809937, 0.07106761634349823, -0.00272634020075202, -0.022362712770700455, 0.09183617681264877, -0.27305206656455994, -0.10016749054193497, 0.18741963803768158, -0.026538481935858727, 0.40481480956077576, -0.08937487751245499, -0.09599673748016357, -0.04726765677332878, -0.19243139028549194, 0.15958577394485474, 0.0003070519014727324, 0.12249857932329178, -0.02053743228316307, 0.149367555975914, 0.05764087289571762, -0.021468963474035263, 0.07878170907497406, 0.01897687278687954, -0.043524134904146194, -0.10214164853096008, -0.0576070211827755, 0.02209348976612091, 0.027643714100122452, 0.06043785810470581, -0.034488312900066376, 0.03645317256450653, -0.13903039693832397, -0.0697457492351532, -0.09509347379207611, 0.029181979596614838, 0.02550293505191803, 
-0.0738123431801796, 0.005104499869048595, -0.0606207400560379, -0.01796574704349041, 0.004064314998686314, 0.1774221509695053, -0.09656526893377304, 0.1397891342639923, 0.12213077396154404, 0.14162534475326538, -0.15240514278411865, 0.009560088627040386, -0.06451260298490524, -0.062333766371011734, 0.06749572604894638, -0.06435444205999374, 0.023527121171355247, 0.11320745199918747, -0.029078762978315353, 0.08856822550296783, 0.08834152668714523, -0.0034355632960796356, 0.0011360073694959283, 0.1049066036939621, -0.22959063947200775, -0.0893966406583786, -0.07554332911968231, 0.01748102717101574, 0.06950350105762482, 0.11829019337892532, 0.19839921593666077, -0.000406519538955763, -0.028623992577195168, 0.006503017619252205, 0.016791198402643204, -0.04634048044681549, 0.09311505407094955, -0.0043632108718156815, 0.02147444151341915, -0.15826217830181122, 0.05571194365620613, -0.013220294378697872, -0.09529589116573334, 0.02994541823863983, 0.1646208018064499, -0.11790794134140015, -0.11658614873886108, -0.08284015953540802, 0.10400213301181793, -0.1324712485074997, -0.014417956583201885, -0.025342490524053574, -0.12135744839906693, 0.07650114595890045, 0.0955805629491806, 0.04664371535181999, 0.06475501507520676, -0.09212148934602737, -0.03661809861660004, -0.022185761481523514, -0.023298081010580063, 0.024984201416373253, -0.02134113572537899, -0.05568326637148857, 0.04387207329273224, -0.036354176700115204, 0.10255562514066696, -0.09711185097694397, -0.10833828151226044, -0.15289464592933655, 0.038131922483444214, -0.10924025624990463, -0.09897933155298233, -0.11387841403484344, -0.048719413578510284, -0.012049013748764992, -0.03843139484524727, -0.04480484500527382, -0.03709615394473076, -0.11245188862085342, 0.03178693726658821, -0.04676428437232971, 0.0022704785224050283, -0.06384537369012833, 0.036044247448444366, 0.04213804379105568, -0.015309112146496773, 0.16809821128845215, 0.1473298817873001, -0.09969011694192886, 0.10324081778526306, -0.13196627795696259, -0.06530427187681198, 0.1098352000117302, 0.020581014454364777, 0.048132214695215225, 0.07232891768217087, 0.006792531814426184, 0.07512528449296951, 0.05512958765029907, 0.05167871713638306, 0.03853964805603027, -0.1054830551147461, 0.02316674403846264, -0.06113574653863907, -0.1284959316253662, -0.049622852355241776, -0.01735001616179943, 0.01746993139386177, 0.044898588210344315, 0.07494200021028519, -0.05875665321946144, 0.07342462986707687, -0.07064445316791534, 0.03461592271924019, 0.022076265886425972, -0.1461409032344818, 0.02952142432332039, -0.07602337747812271, 0.05857711657881737, 0.02589796856045723, 0.22701966762542725, 0.018253615126013756, -0.00248549273237586, 0.018468214198946953, 0.0628683865070343, 0.058805111795663834, -0.01592187210917473, 0.22780926525592804, 0.12132342159748077, -0.05409441888332367, -0.08606542646884918, 0.09136879444122314, 0.028436733409762383, 0.0460711345076561, 0.11100279539823532, -0.03676643222570419, -0.03492417559027672, 0.08506567776203156, -0.0017783561488613486, 0.009420055896043777, -0.11481012403964996, -0.14121240377426147, -0.05166170746088028, 0.056352805346250534, -0.0626474916934967, 0.11396212130784988, 0.15944510698318481, -0.01275903545320034, 0.0279601588845253, 0.0006888288189657032, -0.07702223211526871, -0.1784513294696808, -0.19113625586032867, -0.07759910076856613, -0.1603432148694992, 0.017829790711402893, -0.13142891228199005, 0.05592099577188492, 0.018951809033751488, 0.09840720146894455, -0.05579336732625961, 0.0841171145439148, 0.05359915271401405, 
-0.11978309601545334, 0.08286809176206589, -0.03101973608136177, 0.08918871730566025, -0.0352775901556015, 0.00048288467223756015, -0.06764425337314606, 0.014573280699551105, 0.018842335790395737, 0.04681006446480751, -0.059856366366147995, 0.0021269216667860746, -0.08995641022920609, -0.07035264372825623, -0.05849387124180794, 0.06369979679584503, 0.0029656263068318367, 0.15791194140911102, 0.011977764777839184, -0.03091414086520672, 0.014563674107193947, 0.24131028354167938, -0.08691434562206268, -0.1146550178527832, -0.0629463791847229, 0.24460047483444214, 0.01336564403027296, 0.10994826257228851, -0.02555253356695175, -0.007432985119521618, -0.07487167418003082, 0.34070324897766113, 0.29574495553970337, -0.07461562007665634, 0.014410379342734814, 0.021890781819820404, 0.03466389328241348, 0.11069171875715256, 0.10375283658504486, 0.10215792804956436, 0.3161631226539612, -0.059571754187345505, -0.019444061443209648, -0.012909216806292534, -0.0494721457362175, -0.06427055597305298, 0.06936855614185333, 0.05907958745956421, -0.07265600562095642, -0.03321296349167824, 0.11784818768501282, -0.243057519197464, 0.1027870625257492, -0.16349677741527557, -0.16020065546035767, -0.07871545851230621, 0.003470993135124445, 0.07967576384544373, 0.03709622845053673, 0.08322107046842575, 0.0003441794542595744, -0.06317680329084396, 0.05537538602948189, 0.02050078473985195, -0.21113096177577972, -0.013888887129724026, 0.06811532378196716, -0.05822209268808365, -0.05578049272298813, -0.017296038568019867, 0.09136279672384262, 0.07856861501932144, 0.06055128946900368, -0.009126373566687107, 0.06802251189947128, -0.0076951636001467705, -0.047920577228069305, 0.040169086307287216, 0.05282316356897354, 0.03365377336740494, -0.09201771765947342, 0.05801888555288315, -0.16558320820331573, 0.0377938486635685, -0.003650203114375472, -0.036686573177576065, -0.03056660108268261, 0.034995634108781815, -0.06085174158215523, 0.0869644433259964, 0.08349878340959549, -0.014915251173079014, -0.0051364474929869175, -0.03414280340075493, -0.012135724537074566, -0.013125698082149029, -0.09318865090608597, -0.09719635546207428, -0.18809497356414795, -0.10529110580682755, 0.08358070254325867, -0.01960349641740322, -0.1533443182706833, 0.019799841567873955, -0.11795920878648758, 0.0517815463244915, -0.1334993988275528, 0.09468302130699158, 0.0806272104382515, 0.02254018373787403, 0.008175699040293694, -0.01807740516960621, 0.043181102722883224, 0.09622419625520706, -0.14448589086532593, -0.0800958052277565 ]
null
null
transformers
# GPT2-Mongolia

## Model description

GPT-2 is a transformers model pretrained on a very small corpus of Mongolian news data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, it was trained to guess the next word in sentences.

## How to use

```python
from transformers import TFGPT2LMHeadModel, GPT2Tokenizer

# Load the tokenizer and the TensorFlow model from the Hugging Face Hub.
tokenizer = GPT2Tokenizer.from_pretrained('Ochiroo/tiny_mn_gpt')
model = TFGPT2LMHeadModel.from_pretrained('Ochiroo/tiny_mn_gpt')

text = "Намайг Эрдэнэ-Очир гэдэг. Би"
input_ids = tokenizer.encode(text, return_tensors='tf')

# Generate continuations with beam search.
beam_outputs = model.generate(
    input_ids,
    max_length=25,
    num_beams=5,
    temperature=0.7,
    no_repeat_ngram_size=2,
    num_return_sequences=5
)

print(tokenizer.decode(beam_outputs[0]))
```

## Training data and biases

Trained on 500MB of a Mongolian news dataset (IKON) on an RTX 2060.
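The card above describes the pretraining objective only in prose; the short sketch below illustrates how inputs and labels are derived for next-word prediction. It reuses the tokenizer from the card's own example, but it is an illustration, not the author's actual training code, and the sample sentence is an arbitrary placeholder.

```python
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained('Ochiroo/tiny_mn_gpt')

# For causal language modelling, the labels are simply the input ids shifted
# by one position: at every step the model must guess the token that comes next.
text = "Намайг Эрдэнэ-Очир гэдэг."
ids = tokenizer.encode(text)

inputs, labels = ids[:-1], ids[1:]
for inp, lab in zip(inputs, labels):
    print(f"{tokenizer.decode([inp])!r} -> {tokenizer.decode([lab])!r}")
```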
{"language": "mn"}
text-generation
Ochiroo/tiny_mn_gpt
[ "transformers", "tf", "gpt2", "text-generation", "mn", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "mn" ]
TAGS #transformers #tf #gpt2 #text-generation #mn #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# GPT2-Mongolia ## Model description GPT-2 is a transformers model pretrained on a very small corpus of Mongolian news data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was trained to guess the next word in sentences. ## How to use ## Training data and biases Trained on 500MB of Mongolian news dataset (IKON) on RTX 2060.
[ "# GPT2-Mongolia", "## Model description\n\nGPT-2 is a transformers model pretrained on a very small corpus of Mongolian news data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was trained to guess the next word in sentences.", "## How to use", "## Training data and biases\n\nTrained on 500MB of Mongolian news dataset (IKON) on RTX 2060." ]
[ "TAGS\n#transformers #tf #gpt2 #text-generation #mn #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# GPT2-Mongolia", "## Model description\n\nGPT-2 is a transformers model pretrained on a very small corpus of Mongolian news data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was trained to guess the next word in sentences.", "## How to use", "## Training data and biases\n\nTrained on 500MB of Mongolian news dataset (IKON) on RTX 2060." ]
[ 48, 7, 103, 4, 28 ]
[ "passage: TAGS\n#transformers #tf #gpt2 #text-generation #mn #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# GPT2-Mongolia## Model description\n\nGPT-2 is a transformers model pretrained on a very small corpus of Mongolian news data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was trained to guess the next word in sentences.## How to use## Training data and biases\n\nTrained on 500MB of Mongolian news dataset (IKON) on RTX 2060." ]
[ 0.06271741539239883, -0.061164915561676025, -0.0011824190150946379, 0.06055455654859543, 0.10993706434965134, 0.043125927448272705, 0.16421657800674438, 0.12473912537097931, 0.0438583642244339, -0.041048310697078705, 0.12250577658414841, 0.012713697738945484, 0.0038408078253269196, 0.16743305325508118, 0.06549090892076492, -0.3168608844280243, 0.0590779110789299, -0.02732204459607601, -0.02049143984913826, 0.1328144520521164, 0.07209770381450653, 0.02095293439924717, 0.07551462203264236, -0.006174974609166384, -0.21019594371318817, 0.01704447716474533, 0.08304993063211441, -0.1256832331418991, 0.09976744651794434, 0.1008484736084938, 0.01841580495238304, 0.06560681760311127, 0.030700266361236572, -0.012704349122941494, 0.03330189734697342, 0.003003947902470827, -0.09923844039440155, 0.02874358929693699, -0.031209168955683708, -0.0478026308119297, 0.23197540640830994, -0.009153762832283974, 0.0014296253211796284, 0.0001774937118170783, -0.144338458776474, -0.10787110030651093, 0.05634042248129845, 0.09376953542232513, -0.005080925766378641, 0.08875909447669983, -0.039073728024959564, 0.092564158141613, -0.11994849890470505, 0.05745074152946472, 0.12791988253593445, -0.3452122211456299, -0.03229045867919922, 0.18594452738761902, -0.07608959823846817, 0.057841330766677856, -0.028347086161375046, 0.07578947395086288, 0.019836707040667534, 0.015803109854459763, 0.04345102980732918, -0.08895338326692581, -0.09598631411790848, -0.03972729668021202, -0.15610450506210327, -0.035051412880420685, 0.2616366744041443, -0.020273482427001, -0.03175400570034981, -0.07030289620161057, -0.030207110568881035, 0.00499931164085865, 0.01327334064990282, -0.06731383502483368, -0.08342806994915009, 0.07235824316740036, -0.04878673329949379, -0.05622800439596176, -0.10976118594408035, -0.13814343512058258, -0.1864304542541504, 0.2568778395652771, 0.006254818290472031, 0.0754133090376854, -0.13749761879444122, 0.13398326933383942, -0.1410500407218933, -0.02632598578929901, -0.020895540714263916, -0.07627278566360474, -0.06942135840654373, 0.0003855257236864418, -0.051865629851818085, -0.06938527524471283, 0.011665906757116318, 0.10175266116857529, 0.04480225592851639, 0.02935859002172947, 0.021426748484373093, 0.06057560071349144, 0.003818303579464555, 0.1226549968123436, -0.11417347937822342, 0.0695154219865799, 0.008516852743923664, -0.13437367975711823, -0.023374086245894432, -0.04904060438275337, -0.1557217538356781, -0.08762103319168091, 0.053959570825099945, -0.003240576246753335, -0.023700324818491936, 0.21650339663028717, 0.02500554919242859, -0.03996274992823601, -0.07888138294219971, -0.04847889021039009, -0.0768032819032669, -0.06372878700494766, -0.0821991041302681, -0.015179043635725975, 0.017727425321936607, -0.015201264061033726, -0.006245071068406105, -0.05621572583913803, -0.04612177982926369, -0.020998971536755562, -0.01786048710346222, -0.12094229459762573, -0.026502417400479317, 0.094312883913517, -0.0037765484303236008, -0.22014600038528442, -0.1667054295539856, -0.0074512651190161705, 0.04289957880973816, -0.04544267803430557, 0.006544081494212151, -0.02732822485268116, -0.023204198107123375, 0.07384180277585983, -0.035814620554447174, 0.012914487160742283, -0.05338938906788826, 0.03887021169066429, -0.06920149177312851, 0.16337023675441742, -0.0820484533905983, 0.020366663113236427, -0.1190541461110115, 0.006351522170007229, -0.14725661277770996, 0.17634177207946777, -0.0015408287290483713, 0.02934313751757145, -0.05748840793967247, -0.000529301178175956, -0.08609407395124435, 
0.03135526552796364, -0.045311205089092255, 0.160325288772583, -0.17158907651901245, -0.0627019926905632, 0.2930871546268463, -0.13094867765903473, -0.013487997464835644, 0.07841067761182785, -0.03281125798821449, 0.2477850317955017, 0.17871429026126862, 0.1418394148349762, 0.016729405149817467, 0.0680263489484787, 0.04389743134379387, 0.1002175360918045, -0.21406294405460358, 0.14302672445774078, 0.06995715200901031, 0.12045256048440933, -0.21900567412376404, 0.01821577176451683, -0.004961285274475813, 0.0674026757478714, -0.0643513947725296, -0.029009830206632614, 0.04510762542486191, 0.03713027387857437, 0.14726117253303528, -0.03142489492893219, 0.10728858411312103, -0.014151141978800297, -0.09571737796068192, 0.07142381370067596, 0.05193572863936424, -0.013147714547812939, 0.016246138140559196, -0.15003253519535065, -0.03728962689638138, -0.002516956767067313, 0.06921962648630142, -0.1286996603012085, -0.07368946820497513, 0.0014253745321184397, 0.07483698427677155, 0.10590901970863342, 0.09338255226612091, 0.06148876994848251, -0.0728919580578804, -0.08848810195922852, 0.038685962557792664, 0.021639565005898476, 0.005997729022055864, -0.030747896060347557, -0.1381467580795288, 0.08953000605106354, -0.01909761317074299, 0.08384803682565689, -0.08055003732442856, 0.012897321954369545, 0.06665214151144028, -0.050413165241479874, 0.00835606548935175, 0.05769810453057289, 0.03855564817786217, -0.027424093335866928, -0.027119513601064682, -0.04982947185635567, 0.03671518340706825, 0.03668934106826782, -0.20868785679340363, 0.18072395026683807, -0.11574425548315048, 0.04659181088209152, 0.12305759638547897, -0.08471298217773438, -0.1628114879131317, 0.047705039381980896, -0.03338118642568588, 0.0022986396215856075, 0.031645093113183975, -0.01382753811776638, 0.10112392902374268, -0.10611824691295624, 0.044396933168172836, -0.05433499813079834, -0.00730863306671381, 0.038207657635211945, -0.07763266563415527, 0.011828321032226086, 0.08603137731552124, 0.014761306345462799, -0.19359466433525085, 0.10825134068727493, -0.023384099826216698, 0.02401578053832054, 0.24055007100105286, 0.08145222812891006, -0.041050106287002563, -0.0619020015001297, 0.02133331447839737, 0.013473701663315296, -0.028767455369234085, -0.1360902339220047, -0.062403012067079544, 0.047165002673864365, 0.06264953315258026, 0.013027958571910858, -0.10062405467033386, -0.0645764172077179, -0.025668425485491753, -0.0860433354973793, -0.043954670429229736, 0.07571552693843842, -0.09574971348047256, 0.15110528469085693, 0.08562080562114716, 0.013710821978747845, 0.11282913386821747, 0.015365191735327244, -0.13985522091388702, 0.1586311310529709, -0.04011574015021324, -0.34133264422416687, -0.05874595046043396, -0.0681411474943161, -0.0009212733712047338, 0.09214062988758087, 0.027289627119898796, -0.12094206362962723, -0.005467158276587725, -0.022831693291664124, 0.11137109249830246, -0.054367780685424805, 0.14594578742980957, -0.05150133743882179, 0.01454215869307518, -0.017302485182881355, 0.002526279538869858, -0.02921421080827713, -0.0647089034318924, -0.1530996561050415, 0.16957423090934753, -0.20343387126922607, 0.010082256980240345, 0.16083799302577972, 0.006221258547157049, 0.03578123450279236, -0.08918018639087677, 0.19168367981910706, -0.10547176003456116, 0.024822356179356575, 0.05456690117716789, -0.036733318120241165, -0.04571190103888512, 0.14490275084972382, -0.001513628289103508, -0.09219981729984283, 0.13894101977348328, 0.010291455313563347, -0.16204217076301575, -0.1537298858165741, -0.0338512659072876, 
-0.032866861671209335, 0.05594148859381676, 0.038669403642416, 0.06760372221469879, 0.1305731236934662, 0.0803215280175209, -0.020611170679330826, 0.15044339001178741, 0.09611552208662033, 0.053455986082553864, 0.03505958616733551, 0.03123772330582142, 0.08479449152946472, -0.046179935336112976, -0.032129209488630295, 0.10554762929677963, -0.08584682643413544, 0.19558478891849518, 0.03958102688193321, 0.07067079842090607, 0.07607455551624298, -0.02626357600092888, 0.15982787311077118, 0.028318390250205994, 0.019284885376691818, -0.05664627626538277, -0.011146928183734417, -0.040888167917728424, -0.012943684123456478, 0.09682806581258774, -0.08795832097530365, -0.030738458037376404, -0.01552669145166874, 0.059876639395952225, 0.07745197415351868, 0.08364858478307724, 0.05286761000752449, -0.18418125808238983, -0.1516801118850708, -0.007708818651735783, -0.042981039732694626, -0.03750425577163696, 0.07177085429430008, 0.0924263745546341, -0.1769040822982788, 0.04475576803088188, 0.03365129232406616, 0.03856121003627777, -0.059945546090602875, 0.009911947883665562, -0.004558209795504808, -0.07884977012872696, -0.05063250660896301, 0.1257314682006836, -0.39469584822654724, 0.20074251294136047, 0.010800361633300781, -0.01832490973174572, -0.07021412998437881, -0.07418366521596909, 0.02727343700826168, 0.0787358283996582, 0.15885703265666962, 0.04629994556307793, -0.02332211658358574, -0.1033792719244957, -0.07069609314203262, 0.0035281595773994923, 0.061453890055418015, -0.10754775255918503, 0.04583306983113289, -0.023734301328659058, 0.033540286123752594, 0.05288130044937134, -0.01840023323893547, -0.07645168155431747, -0.030654430389404297, 0.03903914615511894, 0.04418998584151268, 0.09711311012506485, -0.009574923664331436, -0.05405472591519356, 0.022548504173755646, 0.11394397169351578, -0.08769740909337997, -0.13010960817337036, -0.11622648686170578, 0.1685735583305359, -0.04217050224542618, -0.10178527981042862, 0.025188816711306572, 0.020086703822016716, 0.06383316218852997, 0.09746502339839935, -0.18300312757492065, 0.12254785746335983, -0.06064600870013237, -0.14320151507854462, 0.033950358629226685, 0.06093377619981766, 0.16292473673820496, 0.10201382637023926, 0.033759672194719315, -0.01740504428744316, 0.04418867826461792, -0.1360645443201065, 0.034291140735149384, 0.02971307374536991, -0.103552907705307, 0.024447733536362648, 0.026261836290359497, -0.036512140184640884, -0.05371787026524544, -0.06200528144836426, 0.19703368842601776, 0.034967612475156784, -0.10084959864616394, 0.16132298111915588, 0.08226670324802399, -0.06565166264772415, -0.28093910217285156, 0.003076582681387663, 0.06150219589471817, 0.026815738528966904, -0.07807344198226929, -0.21479880809783936, 0.05533037707209587, 0.02415570616722107, 0.015307742170989513, 0.00921810045838356, -0.2608943581581116, -0.11021099239587784, 0.09343986958265305, -0.01759222149848938, 0.3126421570777893, -0.09935624897480011, -0.05320935696363449, -0.030398095026612282, -0.02694704756140709, 0.2084142416715622, -0.13392631709575653, 0.09361229091882706, -0.012722078710794449, 0.10165266692638397, -0.004337418358772993, 0.04193679615855217, 0.08022082597017288, 0.07144594192504883, 0.04519566148519516, -0.13932925462722778, -0.07813769578933716, 0.18765199184417725, 0.039809174835681915, 0.0785502940416336, 0.10292518883943558, -0.0008185886545106769, -0.045204609632492065, -0.11947286128997803, -0.09200923889875412, -0.014895455911755562, 0.027071887627243996, -0.14234104752540588, -0.06279736757278442, 0.0673619732260704, 
0.04750201851129532, -0.034912265837192535, 0.014374813996255398, -0.08242540061473846, 0.05753496661782265, -0.0329519659280777, 0.10657279193401337, -0.003888658480718732, 0.054171960800886154, 0.02747751772403717, -0.03282725811004639, 0.10698837786912918, -0.15154504776000977, 0.04541472718119621, 0.05132327601313591, 0.019809704273939133, 0.15294338762760162, 0.09475772082805634, -0.05338308960199356, 0.09434276819229126, 0.0914982408285141, -0.16188254952430725, -0.1133616715669632, -0.037009283900260925, -0.10236009210348129, 0.06734338402748108, -0.013496258296072483, 0.10990772396326065, -0.08040530979633331, -0.0069551048800349236, 0.04861539229750633, -0.04621974751353264, -0.05837381258606911, 0.05767066404223442, 0.006229270715266466, 0.003165614092722535, -0.09527354687452316, 0.039528053253889084, 0.009768208488821983, 0.06519235670566559, 0.07626958191394806, -0.005123028066009283, -0.21181818842887878, -0.036437131464481354, -0.03401776775717735, 0.03246187046170235, -0.13353627920150757, -0.08210702985525131, 0.002222864655777812, -0.08628664165735245, 0.0466693677008152, 0.1181131899356842, 0.08483193814754486, 0.06338664144277573, -0.01555517129600048, 0.06657884269952774, -0.011456305161118507, -0.008445335552096367, -0.007995844818651676, -0.03572533279657364, -0.12344593554735184, 0.15809398889541626, 0.03936067968606949, 0.1031937450170517, -0.14190886914730072, -0.11493947356939316, -0.15877774357795715, 0.06751105934381485, -0.09005022048950195, 0.059873633086681366, -0.09353024512529373, -0.030882366001605988, 0.004737270530313253, -0.0867881029844284, -0.08683697134256363, -0.020292894914746284, -0.08569879829883575, 0.044261734932661057, -0.03223571926355362, 0.008028054609894753, -0.012811881490051746, -0.048678699880838394, 0.022035792469978333, 0.00951451901346445, 0.1210218220949173, 0.1835401952266693, -0.02736002206802368, 0.13357210159301758, -0.11173097044229507, 0.05463699996471405, 0.05699443817138672, -0.015295812860131264, 0.002442754339426756, -0.02197588048875332, 0.033879805356264114, -0.02847164496779442, -0.0410296767950058, 0.11175678670406342, 0.04745125025510788, -0.07086972892284393, 0.03829808533191681, -0.07224433124065399, 0.021525857970118523, -0.09263171255588531, 0.07560060173273087, 0.02522795833647251, 0.10895593464374542, 0.08530296385288239, -0.09933428466320038, -0.03503899276256561, -0.07407934963703156, 0.006072324700653553, -0.02798195369541645, -0.010130729526281357, 0.013091690838336945, -0.0696958675980568, 0.028981223702430725, -0.006845083553344011, 0.22939752042293549, 0.059986233711242676, -0.036134082823991776, 0.0038166178856045008, 0.022280901670455933, 0.0694470927119255, -0.04552863538265228, 0.14525075256824493, 0.10968353599309921, -0.012707206420600414, -0.020162969827651978, 0.07324439287185669, 0.03953591734170914, -0.00460088811814785, 0.15071596205234528, -0.09287191182374954, -0.0008995108073577285, 0.053469058126211166, -0.04999810457229614, 0.041088297963142395, -0.10646208375692368, -0.0616096556186676, -0.17242540419101715, 0.005475795827805996, -0.00012636921019293368, -0.01349708903580904, 0.20694608986377716, -0.017776180058717728, 0.032943230122327805, 0.01892055571079254, -0.05171908810734749, -0.14585687220096588, -0.31588807702064514, -0.0972806066274643, -0.16982513666152954, -0.022112052887678146, -0.06818466633558273, -0.044621095061302185, 0.13393636047840118, 0.0999438688158989, 0.013772032223641872, 0.14132091403007507, -0.04718039184808731, -0.14309373497962952, 0.07965051382780075, 
-0.07648971676826477, -0.04008682444691658, -0.05815572664141655, -0.013074362650513649, 0.02208051085472107, 0.030670035630464554, -0.015102269127964973, 0.02853350155055523, 0.032384175807237625, -0.010899851098656654, -0.11649132519960403, -0.041810646653175354, -0.06060904636979103, 0.05346023291349411, -0.00693012960255146, -0.05472351610660553, 0.052105892449617386, -0.09367696195840836, 0.006023991387337446, 0.25797396898269653, 0.021135522052645683, -0.061316054314374924, -0.15356510877609253, 0.21607370674610138, -0.08487063646316528, 0.06766726821660995, -0.028072550892829895, -0.03595547005534172, -0.05080657824873924, 0.23760946094989777, 0.18574091792106628, -0.0012085873167961836, 0.01641019433736801, -0.008004074916243553, 0.02100972644984722, 0.09302183985710144, 0.20373089611530304, -0.008402499370276928, 0.20733468234539032, -0.04987584054470062, -0.04867083579301834, -0.08035708963871002, -0.001616907655261457, 0.056587379425764084, 0.02896188572049141, 0.055773377418518066, 0.01404047105461359, -0.12628011405467987, 0.10417685657739639, -0.19689694046974182, -0.03546972572803497, -0.06939232349395752, -0.007123221177607775, -0.07407541573047638, 0.003460691776126623, -0.13527145981788635, 0.07512422651052475, 0.06175805628299713, -0.012745502404868603, 0.045591045171022415, -0.08773612976074219, 0.08937481045722961, -0.1430835872888565, -0.06258963793516159, 0.1390026956796646, 0.15904541313648224, 0.09582909941673279, -0.07613229006528854, 0.11351267993450165, 0.046196237206459045, -0.001995176775380969, -0.06557474285364151, 0.10219256579875946, -0.01238651666790247, 0.04419887810945511, 0.008108849637210369, 0.09311623126268387, 0.0057560657151043415, 0.003858008421957493, 0.1046171560883522, -0.04082714021205902, 0.006387822329998016, -0.029048679396510124, -0.032677747309207916, -0.1310407519340515, 0.036272723227739334, -0.06462781876325607, 0.09204847365617752, 0.19018185138702393, -0.05515965074300766, 0.03954978287220001, -0.04980141669511795, 0.009178836829960346, 0.014362064190208912, -0.06536374986171722, -0.044620443135499954, -0.09893468022346497, -0.035285331308841705, 0.010096533223986626, 0.00689890468493104, -0.11690571904182434, -0.017160724848508835, -0.051020488142967224, -0.02150026522576809, -0.0008509613107889891, 0.10286564379930496, -0.010196750052273273, 0.08024141937494278, -0.050429776310920715, -0.05701269209384918, 0.015889231115579605, 0.0990578904747963, -0.1641697734594345, -0.1350066065788269 ]
null
null
transformers
# HEL-ACH-EN

## Model description

Machine translation model that translates Acholi to English, initialized with weights from [opus-mt-luo-en](https://huggingface.co/Helsinki-NLP/opus-mt-luo-en) on HuggingFace.

## Intended uses & limitations

Machine translation experiments. Do not use for sensitive tasks.

#### How to use

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the tokenizer and model from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("Ogayo/Hel-ach-en")
model = AutoModelForSeq2SeqLM.from_pretrained("Ogayo/Hel-ach-en")
```

#### Limitations and bias

Trained on Jehovah's Witnesses data, so it reflects their views and Christian perspectives.

## Training data

Trained on OPUS JW300 data. Initialized with weights from [opus-mt-luo-en](https://huggingface.co/Helsinki-NLP/opus-mt-luo-en?text=Bed+gi+nyasi+mar+chieng%27+nyuol+mopong%27+gi+mor%21#model_card)

## Training procedure

Removed duplicates and rows with no alphabetic characters. Trained on a GPU.

## Eval results

testset | BLEU
--- | ---
JW300.luo.en | 46.1
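The snippet in the card loads the tokenizer and model but stops before an actual translation call; the sketch below shows how such a Marian-based seq2seq checkpoint is typically used for inference. The Acholi input sentence and the generation parameters are illustrative placeholders, not values taken from the card.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Ogayo/Hel-ach-en")
model = AutoModelForSeq2SeqLM.from_pretrained("Ogayo/Hel-ach-en")

# Placeholder Acholi source sentence; replace with your own input.
src_text = "Lubanga mari."

# Tokenize, translate with beam search, and decode the English output.
batch = tokenizer(src_text, return_tensors="pt")
generated = model.generate(**batch, max_length=64, num_beams=4)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```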
{"language": ["ach", "en"], "license": "cc-by-4.0", "tags": ["translation"], "datasets": ["JW300"], "metrics": ["bleu"]}
translation
Ogayo/Hel-ach-en
[ "transformers", "pytorch", "marian", "text2text-generation", "translation", "ach", "en", "dataset:JW300", "license:cc-by-4.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "ach", "en" ]
TAGS #transformers #pytorch #marian #text2text-generation #translation #ach #en #dataset-JW300 #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us
HEL-ACH-EN ========== Model description ----------------- MT model translating Acholi to English initialized with weights from opus-mt-luo-en on HuggingFace. Intended uses & limitations --------------------------- Machine Translation experiments. Do not use for sensitive tasks. #### How to use #### Limitations and bias Trained on Jehovah Witnesses data so contains theirs and Christian views. Training data ------------- Trained on OPUS JW300 data. Initialized with weights from opus-mt-luo-en Training procedure ------------------ Remove duplicates and rows with no alphabetic characters. Used GPU Eval results ------------
[ "#### How to use", "#### Limitations and bias\n\n\nTrained on Jehovah Witnesses data so contains theirs and Christian views.\n\n\nTraining data\n-------------\n\n\nTrained on OPUS JW300 data.\nInitialized with weights from opus-mt-luo-en\n\n\nTraining procedure\n------------------\n\n\nRemove duplicates and rows with no alphabetic characters. Used GPU\n\n\nEval results\n------------" ]
[ "TAGS\n#transformers #pytorch #marian #text2text-generation #translation #ach #en #dataset-JW300 #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us \n", "#### How to use", "#### Limitations and bias\n\n\nTrained on Jehovah Witnesses data so contains theirs and Christian views.\n\n\nTraining data\n-------------\n\n\nTrained on OPUS JW300 data.\nInitialized with weights from opus-mt-luo-en\n\n\nTraining procedure\n------------------\n\n\nRemove duplicates and rows with no alphabetic characters. Used GPU\n\n\nEval results\n------------" ]
[ 62, 5, 80 ]
[ "passage: TAGS\n#transformers #pytorch #marian #text2text-generation #translation #ach #en #dataset-JW300 #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #region-us \n#### How to use#### Limitations and bias\n\n\nTrained on Jehovah Witnesses data so contains theirs and Christian views.\n\n\nTraining data\n-------------\n\n\nTrained on OPUS JW300 data.\nInitialized with weights from opus-mt-luo-en\n\n\nTraining procedure\n------------------\n\n\nRemove duplicates and rows with no alphabetic characters. Used GPU\n\n\nEval results\n------------" ]
[ -0.0779348835349083, 0.20699597895145416, -0.0027637998573482037, 0.15489359200000763, 0.14479616284370422, 0.06094696372747421, 0.10544323176145554, 0.08896705508232117, 0.01954226940870285, -0.020296309143304825, 0.11216849833726883, 0.09816273301839828, 0.09155095368623734, 0.07978396862745285, -0.10799712687730789, -0.16640865802764893, 0.020084913820028305, 0.09248855710029602, -0.01601301319897175, 0.07581649720668793, 0.03604535013437271, -0.07127495110034943, 0.11488449573516846, -0.0733998492360115, -0.1131255030632019, -0.04002375900745392, -0.043341707438230515, -0.03580797091126442, 0.08123735338449478, 0.0915142223238945, 0.008401949889957905, 0.024950945749878883, -0.005915566813200712, -0.15748387575149536, 0.04139724373817444, 0.011550376191735268, -0.06569588929414749, 0.009390636347234249, 0.029254309833049774, -0.047764141112565994, 0.1122017428278923, -0.1126975268125534, -0.027896786108613014, 0.04534181207418442, -0.18339158594608307, -0.11539889872074127, -0.09889248013496399, -0.060632143169641495, 0.08387862890958786, 0.08548206090927124, -0.0345364473760128, 0.1708994209766388, -0.07218825072050095, 0.09654231369495392, 0.18952511250972748, -0.12130408734083176, -0.008928983472287655, 0.02192256972193718, -0.07751832157373428, 0.0036018043756484985, -0.06563594937324524, 0.015699302777647972, 0.07253848016262054, 0.052431050688028336, -0.02319338545203209, -0.10622609406709671, -0.01181432232260704, 0.00844887550920248, -0.11914625763893127, -0.0708201453089714, 0.22432000935077667, -0.048171158879995346, 0.009939548559486866, -0.0031482900958508253, -0.04196223244071007, -0.020473113283514977, 0.03361128643155098, 0.0539078563451767, -0.039709918200969696, -0.016221024096012115, 0.012976478785276413, -0.008599141612648964, -0.06671329587697983, -0.06346443295478821, -0.1255316287279129, 0.07775789499282837, 0.10741889476776123, 0.013430406339466572, -0.015842042863368988, 0.12569116055965424, 0.022628720849752426, -0.12258122861385345, -0.034675467759370804, -0.050605833530426025, 0.03457612544298172, -0.001153634861111641, -0.06201467290520668, 0.11694835871458054, 0.019785193726420403, 0.1203218549489975, 0.02739807218313217, 0.01711234264075756, -0.05115533247590065, 0.030575329437851906, 0.017333298921585083, 0.06805106997489929, 0.06085679307579994, -0.03618578240275383, 0.116917185485363, 0.07423027604818344, -0.00000884763812791789, 0.008636889979243279, -0.06921892613172531, -0.11711262166500092, 0.041953228414058685, 0.14975526928901672, -0.009687818586826324, 0.12433511763811111, -0.05110544711351395, -0.0022025646176189184, -0.032261479645967484, -0.12239003926515579, 0.04685521498322487, -0.028199631720781326, 0.03725266084074974, -0.01814771629869938, 0.0801236629486084, -0.01945308968424797, -0.1363176852464676, -0.07462730258703232, -0.01866745762526989, 0.06267082691192627, -0.10022611171007156, -0.06578920036554337, 0.027250170707702637, -0.053417008370161057, 0.03685351088643074, -0.11602742969989777, -0.22721832990646362, -0.01438049878925085, -0.016526905819773674, 0.001707961200736463, -0.0209353007376194, -0.05306946858763695, -0.003978584427386522, -0.03384359925985336, -0.03244706243276596, 0.03195758908987045, -0.07503305375576019, 0.12763731181621552, 0.05849281698465347, 0.050373971462249756, -0.08275899291038513, 0.057815633714199066, -0.14146019518375397, -0.02860408090054989, 0.0052331662736833096, 0.06864830106496811, -0.06815795600414276, 0.04900653287768364, -0.02131953462958336, -0.07005015760660172, 0.003582870354875922, 
0.050254229456186295, 0.030267873778939247, 0.16060784459114075, -0.24675969779491425, -0.031401943415403366, 0.10162077844142914, -0.1348215788602829, -0.14359106123447418, 0.18021348118782043, -0.046608634293079376, -0.0015200396301224828, 0.10268957167863846, 0.09778790175914764, -0.03893979638814926, 0.017406178638339043, 0.033964622765779495, -0.07247991859912872, 0.03158086910843849, 0.02564220316708088, 0.09897702932357788, 0.016138680279254913, -0.05364217981696129, 0.007832754403352737, -0.02558056451380253, 0.040532324463129044, -0.03237733244895935, -0.06998267024755478, 0.021269502118229866, -0.04834078624844551, -0.08172489702701569, 0.04902338609099388, -0.010266878642141819, -0.04566897079348564, 0.03177723288536072, -0.07609084993600845, 0.08481381088495255, -0.0634194016456604, 0.048686433583498, -0.034528207033872604, 0.013711974956095219, -0.092342808842659, 0.04924624040722847, -0.05304378271102905, 0.03607639670372009, 0.007798776030540466, 0.008103943429887295, 0.1008971631526947, 0.030059855431318283, 0.017860818654298782, 0.0631796196103096, -0.062482696026563644, 0.002918363083153963, 0.07325369119644165, -0.04277009889483452, -0.10556041449308395, -0.10647699236869812, 0.0028117301408201456, 0.0441238209605217, 0.027098514139652252, -0.1450510174036026, 0.06267300248146057, 0.107272207736969, 0.08196038752794266, -0.032410942018032074, 0.020785225555300713, 0.03067892976105213, 0.07956229150295258, -0.015845194458961487, -0.09416642785072327, 0.045249417424201965, -0.008525162003934383, -0.062262918800115585, 0.12213613837957382, -0.23000063002109528, 0.14675526320934296, 0.19496789574623108, -0.19907894730567932, 0.03420679643750191, 0.03191259503364563, -0.0316801443696022, -0.015025004744529724, -0.03375644236803055, 0.04139390215277672, 0.13697977364063263, -0.05230093002319336, 0.12526272237300873, -0.04502004384994507, -0.0709809884428978, -0.017756881192326546, -0.03744424879550934, -0.019907448440790176, 0.11831206828355789, 0.17338848114013672, -0.25502321124076843, 0.1592680811882019, 0.05603849142789841, -0.038833629339933395, 0.1457308828830719, 0.02328946441411972, -0.022532444447278976, -0.010148970410227776, -0.08771662414073944, 0.029111744835972786, 0.0674901157617569, -0.12747420370578766, -0.02197510376572609, 0.09179943799972534, 0.039473678916692734, 0.07966084033250809, -0.17359820008277893, -0.0056976936757564545, 0.014688696712255478, -0.011610793881118298, -0.10565365850925446, 0.04454132914543152, -0.009850261732935905, 0.07360924780368805, -0.09486079216003418, 0.09257663041353226, 0.06580295413732529, -0.01706312596797943, -0.11792667955160141, 0.14089985191822052, -0.091336190700531, -0.21345239877700806, -0.17740483582019806, -0.10132650285959244, -0.07590951770544052, 0.09611323475837708, 0.1381624937057495, 0.005604876670986414, -0.0901113823056221, -0.02142316848039627, 0.09100209921598434, -0.037609897553920746, 0.04169788584113121, -0.08509833365678787, -0.04421094059944153, -0.015700221061706543, -0.10801466554403305, -0.05879434943199158, 0.045954085886478424, -0.16748228669166565, 0.13293133676052094, -0.10562489181756973, 0.1203267052769661, -0.03210582211613655, 0.02648775279521942, 0.05870926007628441, -0.057238850742578506, 0.15951403975486755, -0.07573872804641724, -0.0921332910656929, 0.17167998850345612, -0.07451730966567993, 0.010600019246339798, -0.012472270056605339, -0.03643887862563133, -0.0974920392036438, -0.008699373342096806, 0.0568108968436718, -0.06987166404724121, -0.19402168691158295, 0.010234186425805092, 
-0.10008316487073898, 0.04918397590517998, 0.13952983915805817, 0.028523504734039307, 0.035740017890930176, 0.1000165194272995, -0.1353490650653839, 0.2397020310163498, 0.03674480691552162, 0.0949258804321289, 0.08287544548511505, 0.07237395644187927, 0.08379270881414413, -0.07775901257991791, -0.04380425810813904, 0.05599438026547432, 0.0007101171067915857, 0.22922006249427795, -0.08227155357599258, 0.047114625573158264, 0.11703765392303467, 0.08728622645139694, 0.07595258206129074, 0.07123295217752457, 0.015515717677772045, 0.06449490040540695, -0.03808516636490822, -0.03819224238395691, -0.1299600750207901, 0.04145185276865959, -0.038333822041749954, -0.0038197217509150505, -0.1450415402650833, 0.0340513214468956, 0.09301823377609253, 0.09150544553995132, -0.05451052263379097, -0.2941737174987793, -0.014198328368365765, 0.05266161262989044, -0.0334450788795948, -0.09596941620111465, 0.07001136243343353, 0.017564205452799797, -0.08703793585300446, 0.1400027871131897, -0.005393941421061754, 0.11445529013872147, -0.15549224615097046, 0.042703256011009216, -0.01956135779619217, -0.05372398719191551, -0.041888847947120667, 0.14460301399230957, -0.5131739974021912, 0.19967354834079742, 0.04615270718932152, 0.06773243099451065, -0.09818524867296219, -0.04284686595201492, 0.1314777135848999, 0.06301691383123398, 0.009951788932085037, -0.005400317255407572, 0.04990417882800102, -0.13632096350193024, 0.011165144853293896, 0.07926641404628754, -0.0407741479575634, -0.031801313161849976, 0.08245106041431427, -0.029375899583101273, 0.05855609104037285, -0.04182128235697746, 0.10472963005304337, -0.07277525961399078, -0.1339210867881775, 0.0614406056702137, 0.061790864914655685, 0.010720287449657917, -0.03430144488811493, -0.08560871332883835, -0.11484213173389435, 0.02690514177083969, -0.08124687522649765, -0.04995809495449066, -0.10161533951759338, 0.09524674713611603, 0.05413162708282471, -0.10341986268758774, -0.0336984358727932, 0.029529789462685585, 0.029721582308411598, -0.031457219272851944, -0.04568297043442726, 0.07031388580799103, -0.06577194482088089, -0.11956776678562164, 0.01341711450368166, 0.11763390898704529, -0.03381616994738579, 0.07850318402051926, 0.00009258960199076682, -0.001959267072379589, -0.04159170016646385, -0.09363283216953278, 0.0645555928349495, 0.06911938637495041, 0.14463260769844055, 0.11840911209583282, 0.040580637753009796, -0.05921215936541557, -0.06983047723770142, -0.08210692554712296, 0.1052478551864624, 0.2804182767868042, 0.06559091061353683, -0.005412701517343521, 0.03475802019238472, -0.09532120823860168, -0.23329682648181915, -0.012271045707166195, -0.08364767581224442, 0.04676049202680588, -0.03527986258268356, -0.11316709965467453, -0.037115927785634995, 0.0005531467031687498, 0.034486256539821625, -0.008013296872377396, -0.03226273134350777, -0.12397978454828262, 0.1372642070055008, 0.11080291867256165, 0.2713148891925812, -0.10528510063886642, -0.017673959955573082, -0.08698874711990356, -0.31707102060317993, 0.13251923024654388, -0.1809709221124649, 0.11120414733886719, -0.016639485955238342, 0.06890546530485153, -0.0014665639027953148, -0.049555256962776184, 0.18711191415786743, 0.0740898996591568, 0.11141759157180786, -0.09965460002422333, -0.1358194500207901, 0.012329311110079288, -0.01931835152208805, -0.0011132884537801147, -0.10059766471385956, -0.0060278684832155704, -0.09665250778198242, -0.07737774401903152, -0.0006842472357675433, 0.009661231189966202, -0.0030376401264220476, -0.0475694015622139, 0.04897389933466911, -0.0002877576043829322, 
0.12345632165670395, 0.029900509864091873, 0.06532537937164307, -0.04075983166694641, -0.061744507402181625, -0.005090242717415094, 0.1555292308330536, -0.08187545090913773, 0.16017641127109528, -0.1247488334774971, -0.008986752480268478, 0.07819435000419617, -0.02297571487724781, 0.11759772151708603, 0.1064835861325264, -0.04625409469008446, 0.18885493278503418, 0.09538549184799194, 0.031045585870742798, 0.0319770909845829, 0.021891215816140175, -0.11195442080497742, -0.043164853006601334, -0.03944661095738411, -0.0814984142780304, -0.0012245794059708714, 0.07805345207452774, 0.09616369009017944, 0.011021403595805168, -0.040701236575841904, -0.0031443561892956495, 0.02643830142915249, -0.05254936218261719, 0.1281338483095169, -0.043447285890579224, 0.039054591208696365, -0.16054803133010864, 0.058478765189647675, 0.025540867820382118, -0.00809581857174635, 0.04402662068605423, 0.07605085521936417, -0.11522120237350464, -0.10839362442493439, -0.009159864857792854, 0.23788785934448242, -0.09699472039937973, -0.06872336566448212, -0.1417645364999771, -0.08924628794193268, 0.04239519685506821, 0.029734428972005844, 0.0887826681137085, 0.047485776245594025, -0.030720161274075508, 0.005032478366047144, -0.08194511383771896, 0.03218214213848114, 0.04132359102368355, 0.03485649824142456, -0.07679617404937744, 0.23696161806583405, -0.04401489719748497, 0.048859309405088425, -0.02830263413488865, -0.0000972967900452204, -0.056682758033275604, 0.0639871209859848, -0.1254742592573166, 0.038200777024030685, -0.059746552258729935, 0.03685687854886055, 0.020590495318174362, -0.11887114495038986, -0.05781930312514305, 0.06455191969871521, -0.1097727119922638, -0.003255451563745737, -0.015683554112911224, 0.08499041944742203, 0.017078829929232597, -0.05677513778209686, 0.06072704493999481, -0.02728944458067417, 0.060468629002571106, 0.05116424709558487, -0.07262943685054779, 0.03507192060351372, -0.08866623044013977, -0.046391941606998444, 0.03731023520231247, 0.050206296145915985, 0.002787090605124831, -0.01696903258562088, 0.017477137967944145, 0.07014285773038864, -0.021115053445100784, 0.02597757987678051, 0.09624096006155014, -0.06328403204679489, -0.019968999549746513, -0.02509438991546631, -0.07031938433647156, 0.003129194024950266, 0.06709945201873779, -0.008800934068858624, 0.10056883841753006, 0.10687024891376495, -0.03788076341152191, 0.0025918488390743732, -0.13013499975204468, 0.02100381813943386, 0.006606848910450935, -0.1279563158750534, -0.053889837116003036, -0.0979541763663292, 0.09418788552284241, 0.04999277740716934, 0.2543680667877197, 0.050685372203588486, -0.03322602063417435, -0.031224124133586884, 0.0514608770608902, 0.10552416741847992, -0.05951845273375511, 0.21012172102928162, -0.03319384902715683, -0.02309923619031906, -0.00734927412122488, 0.04833780229091644, 0.08718040585517883, 0.055064599961042404, 0.13221274316310883, 0.07418610900640488, 0.16264991462230682, 0.026191245764493942, -0.04212024062871933, -0.02816443145275116, -0.011283074505627155, -0.10894718766212463, 0.08813782036304474, 0.006710485089570284, -0.02734309248626232, -0.015736421570181847, 0.12635965645313263, -0.10641316324472427, 0.0076824151910841465, -0.062234386801719666, -0.0033541640732437372, -0.10913527756929398, -0.19401615858078003, -0.055659838020801544, -0.057960085570812225, 0.024385422468185425, -0.07917387783527374, -0.00801894161850214, 0.0854746550321579, 0.02666153572499752, -0.0761735737323761, 0.08823488652706146, -0.009019852615892887, -0.026225371286273003, -0.004074533004313707, 
-0.05223255977034569, -0.019122205674648285, -0.025517813861370087, -0.0642365887761116, 0.05426869913935661, -0.07877712696790695, 0.07356531172990799, -0.059120457619428635, 0.03497667238116264, 0.04843462258577347, -0.09423276782035828, -0.09180490672588348, -0.055536214262247086, 0.006177040748298168, 0.08763416856527328, 0.1909179836511612, 0.04216992110013962, 0.02452683262526989, 0.06834346055984497, 0.1370553970336914, 0.023595863953232765, -0.18182134628295898, -0.09876983612775803, 0.062414996325969696, 0.12429530173540115, -0.025328727439045906, 0.05647850036621094, -0.06944091618061066, 0.15610942244529724, 0.15889300405979156, 0.17788265645503998, -0.07076873630285263, -0.0057071177288889885, 0.006270087324082851, 0.02324673905968666, 0.042173683643341064, 0.019533641636371613, 0.06049420312047005, 0.04146493971347809, -0.06865012645721436, -0.06956285983324051, -0.0256075207144022, -0.008874921128153801, 0.01610255427658558, 0.06347202509641647, 0.023763079196214676, -0.03440653160214424, 0.01852973736822605, -0.032847702503204346, -0.02266809344291687, -0.04668697342276573, -0.021821781992912292, -0.1456144005060196, -0.06707483530044556, 0.0012677800841629505, -0.02270643785595894, 0.07956625521183014, 0.02011367864906788, -0.04359147325158119, 0.050479039549827576, 0.09202805906534195, 0.03425902873277664, -0.13513050973415375, -0.018615486100316048, 0.09034200757741928, 0.020938459783792496, 0.10538589209318161, -0.03547408804297447, 0.11417055130004883, 0.06608177721500397, 0.04967153072357178, -0.03533685952425003, 0.0694626048207283, -0.03576267138123512, 0.029808729887008667, 0.015130194835364819, 0.07471881806850433, -0.04199913889169693, -0.04269882291555405, 0.053494397550821304, -0.029572932049632072, 0.03024066612124443, -0.006127695553004742, -0.10507550835609436, -0.02335481531918049, 0.011609808541834354, -0.059656981378793716, 0.07627787441015244, 0.13456884026527405, -0.03552122786641121, 0.009819868952035904, -0.08973878622055054, 0.048607125878334045, 0.043090324848890305, -0.006674008443951607, -0.030213864520192146, -0.08974447846412659, -0.014765995554625988, -0.04670620709657669, 0.006174820009618998, -0.1152820959687233, -0.043758757412433624, -0.09820765256881714, -0.11517474055290222, -0.16914424300193787, 0.08670036494731903, 0.03025018237531185, 0.014518175274133682, -0.0417693667113781, -0.2914189398288727, 0.020176637917757034, 0.0797136127948761, -0.08457382768392563, -0.11104867607355118 ]
null
null
transformers
# Rick and Morty DialoGPT Model
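The card is only a title, but the tags below mark this as a conversational GPT-2 (DialoGPT-style) checkpoint. The sketch below follows the generic DialoGPT single-turn chat pattern and is not taken from this model card; the user message and generation settings are placeholders.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Oji/DialoGPT-small-Rick")
model = AutoModelForCausalLM.from_pretrained("Oji/DialoGPT-small-Rick")

# Encode the user message followed by the end-of-sequence token; the model
# then generates the bot's reply after it.
user_message = "Hi Rick, what are you working on?"
input_ids = tokenizer.encode(user_message + tokenizer.eos_token, return_tensors="pt")

reply_ids = model.generate(
    input_ids,
    max_length=100,
    pad_token_id=tokenizer.eos_token_id,
)
reply = tokenizer.decode(reply_ids[:, input_ids.shape[-1]:][0], skip_special_tokens=True)
print(reply)
```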
{"tags": ["conversational"]}
text-generation
Oji/DialoGPT-small-Rick
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Rick and Morty DialoGPT Model
[ "# Rick and Morty DialoGPT Model" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Rick and Morty DialoGPT Model" ]
[ 51, 10 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Rick and Morty DialoGPT Model" ]
[ -0.01990443281829357, 0.10367733240127563, -0.006012056488543749, 0.013662099838256836, 0.1287931650876999, 0.004103946499526501, 0.13405320048332214, 0.13470496237277985, -0.029608309268951416, -0.0377325713634491, 0.1409052610397339, 0.2081032246351242, -0.009616929106414318, 0.025026321411132812, -0.08027864247560501, -0.33285143971443176, 0.04419311136007309, 0.04611847549676895, -0.04805411398410797, 0.11171722412109375, 0.09962809830904007, -0.03511058911681175, 0.07650627940893173, 0.012189619243144989, -0.11959464848041534, 0.014523470774292946, 0.01571112684905529, -0.09889741986989975, 0.11399844288825989, 0.07783890515565872, 0.031239205971360207, 0.033389654010534286, -0.042143791913986206, -0.13308840990066528, 0.04855761677026749, -0.0014628645731136203, -0.03996938467025757, 0.06519230455160141, 0.0068825362250208855, -0.09896008670330048, 0.13105708360671997, 0.11774895340204239, -0.001342291128821671, 0.030811335891485214, -0.1546017825603485, -0.03095608949661255, -0.013916928321123123, 0.04583658277988434, 0.05571185424923897, 0.1092928797006607, -0.03970988467335701, 0.11546611040830612, -0.046847838908433914, 0.11656361073255539, 0.13404695689678192, -0.27711591124534607, -0.013774634338915348, 0.14150507748126984, 0.03755388408899307, 0.031246060505509377, -0.03764049708843231, 0.09234841167926788, 0.010574371553957462, -0.009135077707469463, -0.054559025913476944, -0.07839421927928925, -0.06956472247838974, 0.03881034255027771, -0.08538595587015152, -0.0028573249001055956, 0.22309143841266632, -0.029777048155665398, 0.0931403860449791, -0.061110686510801315, -0.083645299077034, 0.0022445949725806713, -0.04396601766347885, -0.031562261283397675, -0.0995510146021843, 0.08443354815244675, -0.04024428874254227, -0.08693728595972061, -0.10731299221515656, -0.022938303649425507, -0.15873323380947113, 0.16214832663536072, 0.03501884266734123, 0.03956814110279083, -0.21219894289970398, 0.07603893429040909, -0.04213596507906914, -0.10128775984048843, 0.025763655081391335, -0.0809730738401413, 0.0031352867372334003, 0.01420458871871233, -0.034850042313337326, -0.01257789321243763, 0.09354974329471588, 0.11913833022117615, -0.002085368847474456, 0.028482265770435333, -0.03459439426660538, 0.04555915296077728, 0.04445279389619827, 0.04635937884449959, -0.030874032527208328, -0.005519113503396511, 0.024999095126986504, -0.0903957337141037, -0.010871811769902706, -0.060442280024290085, -0.1946737915277481, 0.013364237733185291, 0.05735969915986061, 0.055262304842472076, 0.030765585601329803, 0.13551434874534607, 0.0010974886827170849, -0.0475224107503891, 0.03023342229425907, -0.020769428461790085, -0.016528211534023285, 0.029149476438760757, -0.0072809201665222645, 0.1526104062795639, 0.022983204573392868, 0.05690442770719528, -0.11451500654220581, 0.012773441150784492, -0.03330712020397186, -0.006917042192071676, -0.03216493874788284, -0.061537809669971466, 0.003289242973551154, 0.0014469954185187817, 0.013694697991013527, -0.12761977314949036, -0.15719962120056152, -0.003717299085110426, 0.00613630935549736, -0.05369097366929054, -0.10004933178424835, -0.10542158782482147, -0.03153182193636894, 0.046352777630090714, -0.053748197853565216, 0.03198752924799919, -0.039340607821941376, 0.09383489936590195, -0.03441528603434563, 0.0691300630569458, -0.0863635316491127, 0.0905333161354065, -0.06098577380180359, -0.04111234471201897, -0.0643690675497055, 0.12356391549110413, 0.011561519466340542, 0.04442533850669861, -0.03781363368034363, -0.01636880449950695, -0.11087207496166229, 
0.06495212018489838, -0.03516015037894249, 0.22487092018127441, -0.08996163308620453, -0.09683383256196976, 0.22284504771232605, -0.04562665522098541, -0.12769415974617004, 0.12243670970201492, -0.03600937873125076, 0.09682484716176987, 0.11536505818367004, 0.16257616877555847, 0.03866875544190407, -0.0002237519365735352, 0.10846788436174393, 0.10610917955636978, -0.07603283226490021, 0.006744202226400375, 0.0250004380941391, -0.02382737584412098, -0.09139634668827057, 0.015165179036557674, 0.07776524871587753, 0.04803644120693207, -0.05478836968541145, -0.015317765064537525, 0.015090391971170902, -0.003627530997619033, 0.06564177572727203, -0.017049036920070648, 0.11691898107528687, -0.03955721855163574, -0.07620245963335037, -0.014626736752688885, 0.028113901615142822, -0.06986767798662186, 0.026787258684635162, -0.07962338626384735, 0.02948051132261753, -0.01967560686171055, 0.06687499582767487, -0.16950036585330963, -0.09430424869060516, -0.06010226905345917, 0.23349159955978394, 0.07496993243694305, 0.11698364466428757, 0.06350064277648926, -0.056928664445877075, 0.0006459777359850705, 0.037900060415267944, 0.19767099618911743, -0.006904584355652332, -0.07503941655158997, -0.11777795851230621, 0.10312607139348984, -0.07375676929950714, 0.06138577312231064, -0.0416308231651783, 0.007855354808270931, 0.019795136526226997, 0.11127804219722748, -0.04220014438033104, 0.039965033531188965, 0.012499134056270123, -0.03696384280920029, -0.05908297002315521, 0.0004571304307319224, 0.09440597146749496, -0.0005542659782804549, -0.10514124482870102, 0.2379530370235443, -0.21215155720710754, 0.12180843949317932, 0.1799643337726593, -0.2256188690662384, 0.008836638182401657, -0.10462760180234909, -0.016665222123265266, 0.01030759233981371, 0.03996801748871803, -0.040312353521585464, 0.24249082803726196, -0.014560520648956299, 0.17035135626792908, -0.04880015179514885, -0.05010494217276573, -0.0440804697573185, -0.05291803553700447, 0.0003277618088759482, 0.12486644089221954, 0.09157522767782211, -0.18372175097465515, 0.17465431988239288, 0.06325390189886093, 0.03004654310643673, 0.1566917598247528, 0.022896459326148033, 0.020663797855377197, 0.05599488690495491, -0.0012882096925750375, -0.03033529780805111, -0.07880529016256332, -0.20945574343204498, -0.012111871503293514, 0.07547834515571594, 0.04618273675441742, 0.10363037884235382, -0.1018955409526825, -0.030724551528692245, -0.006948297843337059, -0.030821966007351875, 0.03848150745034218, 0.13554143905639648, 0.015318007208406925, 0.12024796009063721, -0.019162237644195557, -0.06668011844158173, 0.0741129145026207, 0.01461794413626194, -0.09263674914836884, 0.18050695955753326, -0.1221487745642662, -0.3382752537727356, -0.10329627990722656, -0.20327065885066986, -0.04040617123246193, 0.0422586165368557, 0.11002974957227707, -0.1460546851158142, -0.029720865190029144, 0.0010455691954120994, 0.08435780555009842, -0.1366978883743286, 0.006720550823956728, -0.017843635752797127, -0.01294276025146246, -0.1374056041240692, -0.09384968876838684, -0.04747654125094414, -0.060003772377967834, -0.03218422830104828, 0.10381519794464111, -0.1596987098455429, 0.007801016326993704, 0.230968177318573, 0.04797196388244629, 0.07053504139184952, -0.036995481699705124, 0.17910921573638916, -0.08220451325178146, 0.016473548486828804, 0.24478016793727875, -0.05610832944512367, 0.0740312784910202, 0.10560029745101929, -0.005553957540541887, -0.052998270839452744, 0.03756273165345192, 0.00788428820669651, -0.0785532221198082, -0.21784749627113342, -0.1030275970697403, 
-0.11046822369098663, 0.04284128174185753, 0.05120398849248886, 0.04543844982981682, 0.1585974246263504, 0.06446543335914612, -0.05187172442674637, -0.011306295171380043, 0.08315242826938629, 0.08576013147830963, 0.24794787168502808, -0.06311704963445663, 0.1473274976015091, -0.020790869370102882, -0.16434483230113983, 0.07334780693054199, 0.06416254490613937, 0.07227631658315659, 0.06913222372531891, 0.11215730756521225, 0.0020037174690514803, 0.017364054918289185, 0.12614323198795319, 0.05889604985713959, -0.011050567030906677, -0.031410302966833115, -0.04586650803685188, -0.04347039759159088, -0.020151739940047264, 0.041160233318805695, 0.05188119783997536, -0.1600257307291031, -0.02415069006383419, 0.022831739857792854, 0.046689603477716446, -0.003216250566765666, 0.08608495444059372, -0.19217506051063538, -0.018159521743655205, 0.06477150321006775, -0.0016290671192109585, -0.09313707798719406, 0.08108778297901154, -0.009849769994616508, -0.09697907418012619, 0.03780587762594223, -0.03585495799779892, 0.1301390826702118, -0.0750122219324112, 0.07286842167377472, -0.1119815781712532, -0.02080838568508625, -0.0087605444714427, 0.11860883235931396, -0.3024371266365051, 0.1707288920879364, -0.0030656929593533278, -0.04842326417565346, -0.11293680220842361, -0.015061003156006336, 0.03821004554629326, 0.08916047215461731, 0.10371578484773636, -0.030773809179663658, -0.06436607241630554, 0.0791664570569992, -0.050910793244838715, 0.03525971621274948, 0.10187692940235138, -0.04662879928946495, -0.014911266043782234, -0.05685164034366608, 0.0027524156030267477, 0.02270045317709446, -0.10804066807031631, 0.014929873868823051, -0.19113284349441528, 0.07794220000505447, 0.0811065286397934, 0.0722472071647644, 0.04095001146197319, -0.029467018321156502, -0.1261810064315796, 0.2744207978248596, 0.007417048793286085, -0.09985779225826263, -0.11269644647836685, 0.04465123638510704, 0.05646880716085434, -0.07145541161298752, -0.028514720499515533, -0.07924950867891312, 0.052012015134096146, -0.07113154232501984, -0.1981293261051178, 0.11338871717453003, -0.09873685240745544, -0.04736494645476341, -0.03962721675634384, 0.2276533544063568, -0.027753405272960663, 0.02130931057035923, 0.0393831804394722, -0.001616212772205472, -0.12734149396419525, -0.09492160379886627, 0.004517016001045704, -0.0013660878175869584, 0.02586340345442295, 0.022777099162340164, -0.04388801380991936, 0.0049570053815841675, -0.06949588656425476, -0.0037953434512019157, 0.3158918023109436, 0.10998717695474625, -0.04474896565079689, 0.1561327874660492, 0.10242960602045059, -0.06360200047492981, -0.28859275579452515, -0.11298105865716934, -0.07240703701972961, -0.05466444417834282, -0.0838940367102623, -0.18133240938186646, 0.08497140556573868, -0.042584747076034546, -0.00881777424365282, 0.042027126997709274, -0.2644155025482178, -0.09412363916635513, 0.18815293908119202, -0.01533579919487238, 0.4300551414489746, -0.11307147145271301, -0.07450833916664124, -0.05387028306722641, -0.13561248779296875, 0.18766070902347565, -0.018648525699973106, 0.0966244488954544, 0.00443116994574666, 0.20654869079589844, 0.05815155804157257, -0.0008219819865189493, 0.0747876986861229, 0.011587066575884819, -0.0452013723552227, -0.09014920890331268, -0.09217863529920578, -0.020688166841864586, 0.005974666681140661, 0.034957773983478546, -0.0941787138581276, 0.05258546397089958, -0.11336535215377808, -0.05589618906378746, -0.07209338247776031, 0.026715638116002083, 0.02418643794953823, -0.06410122662782669, -0.006407043896615505, -0.048794936388731, 
-0.0010418962920084596, 0.00979152973741293, 0.21295785903930664, -0.11305148899555206, 0.12096642702817917, 0.04414689913392067, 0.1508360654115677, -0.08366664499044418, -0.03614836558699608, -0.04910365119576454, -0.05565084517002106, 0.0676501989364624, -0.1319035291671753, 0.04462771117687225, 0.10053624957799911, -0.030742639675736427, 0.0898696631193161, 0.11227817088365555, -0.02972952462732792, 0.0016581144882366061, 0.07279330492019653, -0.23832836747169495, -0.08509121090173721, -0.07718803733587265, 0.05435929819941521, 0.057659514248371124, 0.09007556736469269, 0.21964938938617706, 0.011087107472121716, -0.023847850039601326, 0.027587326243519783, 0.029717741534113884, -0.01658647321164608, 0.05797221511602402, 0.008770608343183994, 0.031205764040350914, -0.14632299542427063, 0.04562913626432419, -0.010501107200980186, -0.07197817414999008, 0.03429242596030235, 0.16717956960201263, -0.10209374874830246, -0.12234743684530258, -0.04288604483008385, 0.17517046630382538, -0.13247300684452057, -0.017495078966021538, -0.05478521063923836, -0.1241658553481102, 0.07977617532014847, 0.11423204839229584, 0.05072414129972458, 0.042339734733104706, -0.09691346436738968, -0.03881148621439934, -0.05552472919225693, 0.01957569271326065, 0.018891409039497375, -0.030404040589928627, -0.037885911762714386, 0.025801094248890877, -0.04172535613179207, 0.11203933507204056, -0.087384894490242, -0.09792038798332214, -0.16838693618774414, 0.03925701230764389, -0.049022991210222244, -0.07899222522974014, -0.09344983100891113, -0.03523614630103111, 0.014231358654797077, -0.03348008170723915, -0.018664700910449028, -0.02225758694112301, -0.0958842933177948, 0.03419994190335274, -0.048781368881464005, -0.005008503329008818, -0.08496184647083282, 0.017331385985016823, 0.04781922325491905, -0.023604100570082664, 0.1431105136871338, 0.12453559041023254, -0.11789791285991669, 0.10031480342149734, -0.16611437499523163, -0.06820093840360641, 0.09455996751785278, 0.02471991442143917, 0.043245621025562286, 0.028927266597747803, 0.005174829158931971, 0.04808570072054863, 0.05950818210840225, 0.03694291412830353, 0.041101954877376556, -0.07111897319555283, 0.061451081186532974, -0.06278520077466965, -0.11226452142000198, -0.04257739707827568, -0.005422866903245449, 0.00011432790051912889, 0.07346735894680023, 0.11052975058555603, -0.05098198726773262, 0.09580544382333755, -0.050767768174409866, 0.046003878116607666, 0.0289035402238369, -0.16526201367378235, 0.008764104917645454, -0.08482556790113449, 0.05248309671878815, 0.0030253108125180006, 0.15688744187355042, 0.028536081314086914, -0.03175791725516319, 0.02630779519677162, 0.05105529725551605, 0.06318540126085281, -0.00840448122471571, 0.19050461053848267, 0.09726009517908096, -0.04487645998597145, -0.09418396651744843, 0.08849480748176575, 0.05022666975855827, 0.05143674090504646, 0.1403687596321106, -0.020687401294708252, 0.012512898072600365, 0.07724163681268692, 0.014415515586733818, 0.017872430384159088, -0.07756411284208298, -0.09487451612949371, -0.011494439095258713, 0.025514457374811172, -0.02882363088428974, 0.1138797178864479, 0.16729387640953064, -0.0008394720498472452, 0.013234704732894897, -0.01801590994000435, -0.05735309422016144, -0.20129387080669403, -0.1959676295518875, -0.09400797635316849, -0.13690303266048431, -0.0009418319095857441, -0.13835963606834412, 0.03616710752248764, 0.042394787073135376, 0.09917435795068741, -0.039446551352739334, 0.019261397421360016, 0.026794444769620895, -0.10323353111743927, 0.039175424724817276, 
-0.04838612675666809, 0.09421038627624512, -0.007761404849588871, 0.005773975048214197, -0.046786144375801086, 0.02436385303735733, 0.02127891033887863, 0.038409680128097534, -0.012736459262669086, 0.024856114760041237, -0.11602245271205902, -0.09478921443223953, -0.058010075241327286, 0.0558818019926548, 0.0046934462152421474, 0.18179026246070862, 0.02449701726436615, -0.03384847193956375, 0.0275272186845541, 0.19317778944969177, -0.06196035072207451, -0.09709009528160095, -0.08241496980190277, 0.2182236760854721, -0.018931716680526733, 0.09253086894750595, -0.035876765847206116, 0.012440751306712627, -0.07121489197015762, 0.33243879675865173, 0.29320472478866577, -0.10524016618728638, 0.010426074266433716, -0.0019151283195242286, 0.0405552051961422, 0.1290767937898636, 0.07575080543756485, 0.11663594841957092, 0.256552129983902, -0.06501701474189758, -0.057690393179655075, -0.014668738469481468, -0.027142031118273735, -0.06502988189458847, 0.04214107245206833, 0.04939494654536247, -0.07117093354463577, -0.00912293791770935, 0.12242040783166885, -0.24606983363628387, 0.04577518254518509, -0.13518153131008148, -0.14807558059692383, -0.0726354643702507, 0.002261551097035408, 0.09914402663707733, 0.010166509076952934, 0.08546656370162964, -0.014570544473826885, -0.0710548534989357, 0.03896206244826317, 0.021210450679063797, -0.2144380509853363, 0.021960165351629257, 0.07259857654571533, -0.028754761442542076, -0.07154250144958496, -0.013138728216290474, 0.08338925242424011, 0.09720319509506226, 0.03173141926527023, -0.009079075418412685, 0.04570826143026352, -0.0000614441087236628, -0.06747788935899734, 0.035688117146492004, 0.022403022274374962, 0.01331246830523014, -0.05491582676768303, 0.07895619422197342, -0.17176033556461334, 0.020258452743291855, -0.03599786013364792, -0.06506339460611343, -0.006352625321596861, 0.02872123196721077, -0.06236473098397255, 0.0810769721865654, 0.08681372553110123, -0.010693355463445187, -0.015406738966703415, -0.019259916618466377, -0.012411676347255707, -0.028850549831986427, -0.07069326192140579, -0.09390060603618622, -0.15529757738113403, -0.12466321885585785, 0.08110006153583527, -0.008061634376645088, -0.2096063792705536, 0.012769150547683239, -0.13104628026485443, 0.04622570425271988, -0.10809949785470963, 0.09371429681777954, 0.08394473046064377, 0.020185640081763268, -0.007141938898712397, 0.003890183288604021, 0.036074474453926086, 0.07894916087388992, -0.13067346811294556, -0.08049263805150986 ]
null
null
null
AutoTokenizer
{}
null
Omar2027/AutoTokenizer
[ "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #region-us
AutoTokenizer
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
[ 0.024608636274933815, -0.026205500587821007, -0.009666500613093376, -0.10395516455173492, 0.08638657629489899, 0.059816278517246246, 0.01882290467619896, 0.020661840215325356, 0.23975107073783875, -0.005599027033895254, 0.1219947561621666, 0.0015615287702530622, -0.037353623658418655, 0.03733762726187706, -0.0035912662278860807, -0.17583473026752472, 0.03876631706953049, -0.018274923786520958, 0.01843859627842903, 0.026470553129911423, -0.07776834815740585, -0.07564429938793182, 0.015296397730708122, -0.10247814655303955, -0.083692267537117, 0.11002834886312485, 0.031466204673051834, -0.019670886918902397, 0.10779199749231339, -0.04243955761194229, 0.18699054419994354, -0.011512263678014278, -0.11213519424200058, -0.2536850869655609, 0.021806683391332626, -0.01765260472893715, -0.08747660368680954, 0.01506110467016697, 0.0665089413523674, -0.09014441072940826, -0.0588928684592247, 0.0795099288225174, -0.01132340170443058, 0.04246443510055542, -0.27593839168548584, -0.12684126198291779, -0.05297930911183357, -0.1421966552734375, 0.08651168644428253, 0.04035491496324539, 0.008764253929257393, 0.15506891906261444, -0.20897391438484192, 0.004104613792151213, 0.08255259692668915, -0.2538507878780365, 0.05591634660959244, 0.17671173810958862, 0.03623908758163452, 0.18037272989749908, 0.0060391901060938835, 0.11029672622680664, 0.0716743916273117, -0.024263937026262283, -0.17590197920799255, -0.08127854019403458, -0.04696211963891983, 0.16642488539218903, -0.06727185100317001, -0.14248386025428772, 0.34701237082481384, 0.00015008423360995948, 0.009657775051891804, 0.16921205818653107, -0.059524230659008026, -0.09972117841243744, 0.07259953022003174, 0.016484731808304787, 0.018492350354790688, 0.1471305936574936, 0.16307872533798218, -0.0458691343665123, -0.13837823271751404, -0.018630273640155792, -0.22798998653888702, 0.17510560154914856, -0.03248048573732376, 0.13137903809547424, -0.27447956800460815, 0.01684025302529335, -0.2570667266845703, 0.0032130838371813297, 0.04178816080093384, -0.06004921346902847, -0.0226522795855999, -0.013265985064208508, -0.08018817007541656, 0.004899587947875261, 0.06192673370242119, 0.1266920566558838, -0.06128726154565811, 0.06128238886594772, -0.09319206327199936, 0.141696035861969, 0.07166698575019836, 0.07868369668722153, 0.13037432730197906, 0.041205424815416336, -0.07187089323997498, -0.21872246265411377, -0.0026476888451725245, -0.06275863200426102, -0.09502086788415909, -0.0020165652967989445, -0.11606067419052124, 0.17244569957256317, -0.030802514404058456, -0.09825427830219269, -0.11208184063434601, 0.09148659557104111, -0.032992321997880936, -0.03437839448451996, -0.03552987426519394, -0.020977836102247238, 0.019381176680326462, 0.04704452306032181, -0.1548958420753479, -0.005131472367793322, 0.07039852440357208, 0.11502562463283539, -0.1346137970685959, -0.003783059772104025, -0.07908964157104492, 0.03039063885807991, 0.07654735445976257, -0.16510222852230072, 0.03158547356724739, -0.1124754324555397, -0.07531405985355377, 0.002912673633545637, -0.015710093080997467, -0.016202643513679504, 0.166526660323143, -0.0020451415330171585, 0.0714716836810112, -0.026345307007431984, -0.05890209600329399, -0.11243434250354767, -0.08489254862070084, 0.05390460044145584, 0.03670717030763626, 0.03266148269176483, -0.2193479984998703, 0.014805203303694725, -0.12762966752052307, 0.1360815018415451, -0.10566820204257965, -0.04705966264009476, -0.022842247039079666, 0.20562705397605896, 0.037286072969436646, 0.08762791007757187, -0.22171171009540558, 
0.039756543934345245, -0.05404696613550186, 0.18480908870697021, -0.1502426266670227, -0.0799463614821434, 0.20813211798667908, -0.07964949309825897, -0.10115210711956024, 0.021235812455415726, 0.020391687750816345, 0.026287272572517395, 0.0766737088561058, 0.4564172327518463, -0.09766800701618195, -0.09146861732006073, 0.10178250074386597, 0.17055274546146393, -0.12427149713039398, -0.1827561855316162, 0.06446871906518936, -0.16666454076766968, -0.1973118633031845, 0.0018917324487119913, 0.09222044050693512, 0.038269978016614914, -0.07875611633062363, -0.020746968686580658, 0.06325206160545349, -0.0007678253459744155, 0.09095914661884308, 0.03755716234445572, 0.09034032374620438, -0.08716782182455063, 0.11115926504135132, -0.05017651244997978, 0.004037132486701012, 0.1343354731798172, 0.027325427159667015, -0.03223329409956932, 0.08694463223218918, -0.0485352948307991, 0.05295134335756302, -0.1662379503250122, -0.15068690478801727, 0.03398871049284935, 0.06283251196146011, 0.03186952322721481, 0.1280253529548645, 0.08141885697841644, -0.10732853412628174, 0.022690722718834877, -0.004228927195072174, 0.058398615568876266, 0.03891623765230179, 0.006107209715992212, 0.008764320984482765, 0.0961301177740097, -0.10607069730758667, -0.13589619100093842, -0.07336436957120895, -0.014715781435370445, 0.14371353387832642, -0.0302802175283432, 0.07690227776765823, -0.004240254405885935, 0.00013200697139836848, 0.06930823624134064, 0.08137880265712738, 0.016412746161222458, 0.08971183747053146, -0.05237193778157234, -0.05160155147314072, 0.10863113403320312, -0.13533565402030945, 0.17837053537368774, 0.14053137600421906, -0.20532016456127167, 0.029453208670020103, -0.06838275492191315, 0.03670361638069153, -0.008162540383636951, 0.0975119024515152, -0.08272241055965424, -0.02106042578816414, 0.013134466484189034, 0.0052274600602686405, -0.013007243163883686, 0.017682146281003952, -0.07295988500118256, -0.07787393033504486, -0.10233919322490692, 0.08436838537454605, 0.11562882363796234, -0.10282530635595322, 0.14214380085468292, 0.4384984076023102, 0.11495281755924225, 0.21582984924316406, -0.09581480920314789, -0.0412987545132637, 0.007486371789127588, 0.0001535322517156601, -0.04476691037416458, 0.08031861484050751, -0.15973517298698425, -0.038901735097169876, 0.027348900213837624, 0.07128690183162689, 0.11475157737731934, -0.14959022402763367, -0.09639324247837067, -0.00793045200407505, 0.0022841424215584993, -0.1249532699584961, 0.023905446752905846, -0.03974650055170059, 0.04015624523162842, 0.07232289016246796, -0.021535737439990044, 0.13939237594604492, -0.04166141897439957, -0.0639561116695404, 0.07585346698760986, -0.2017085999250412, -0.23179671168327332, -0.12309670448303223, -0.14680525660514832, 0.04366797208786011, 0.05154111236333847, 0.01726446859538555, -0.17635835707187653, -0.015074856579303741, 0.07706750929355621, 0.07820965349674225, -0.20886357128620148, -0.022814949974417686, -0.004290030337870121, 0.0895976573228836, -0.10227091610431671, -0.0017130117630586028, -0.04419664293527603, -0.10150232166051865, 0.0017003051470965147, 0.07279510796070099, -0.137485533952713, 0.13807645440101624, 0.21589438617229462, 0.07225540280342102, 0.07359948754310608, -0.019093448296189308, 0.09936179965734482, -0.10856141895055771, -0.16549113392829895, 0.08348225057125092, -0.06234746053814888, 0.047262318432331085, 0.17534415423870087, 0.03307317942380905, -0.13904969394207, -0.015682822093367577, -0.0402069091796875, -0.15603256225585938, -0.238995760679245, -0.09178274869918823, 
-0.1182505264878273, 0.16442428529262543, 0.0009358620154671371, 0.06651917099952698, 0.08258313685655594, -0.022042419761419296, 0.16447891294956207, -0.07379321753978729, -0.07578866183757782, -0.006978808436542749, 0.12375060468912125, -0.056660156697034836, -0.03080669604241848, -0.10566964000463486, -0.008295975625514984, 0.1151021271944046, 0.15304014086723328, 0.12214863300323486, 0.2957419455051422, 0.08268889784812927, 0.026645636186003685, 0.08958091586828232, 0.17622539401054382, 0.09495089203119278, 0.07838419824838638, -0.045413073152303696, -0.014814783819019794, 0.014317171648144722, -0.04022889584302902, 0.010141594335436821, 0.14683100581169128, -0.2679629921913147, -0.006678564939647913, -0.2710230350494385, 0.0965198427438736, -0.10913380235433578, 0.11837165057659149, -0.01015760749578476, 0.10194015502929688, 0.11082887649536133, 0.03233652561903, -0.03858073800802231, 0.16613617539405823, 0.08450309932231903, -0.11277695000171661, 0.001758623169735074, 0.03737903758883476, 0.09715615212917328, -0.02818971499800682, 0.12721189856529236, -0.11048974841833115, -0.1464834064245224, 0.013753619976341724, 0.07152791321277618, -0.15373679995536804, 0.3138748109340668, 0.012069208547472954, -0.13481520116329193, -0.01481647603213787, -0.09957809001207352, -0.006440147757530212, 0.1254177987575531, 0.09333524852991104, 0.07935678958892822, -0.2185502052307129, -0.13339371979236603, 0.05872276425361633, -0.00575496768578887, 0.22408108413219452, -0.034034017473459244, -0.11356475204229355, -0.027013886719942093, 0.04241163283586502, -0.06043251231312752, 0.08524788916110992, 0.023536119610071182, -0.08113526552915573, -0.032957352697849274, 0.05323701351881027, 0.012368366122245789, 0.00524376705288887, 0.09360801428556442, 0.020107939839363098, -0.0009265501867048442, 0.01785753294825554, 0.047885000705718994, -0.0675911232829094, -0.1984109878540039, 0.09357594698667526, -0.05215044692158699, 0.0015536568826064467, -0.08013670891523361, -0.15122665464878082, -0.08837161958217621, -0.16009655594825745, 0.12540200352668762, -0.034406669437885284, 0.12700119614601135, -0.06619787961244583, 0.17341409623622894, -0.07871770113706589, 0.04481020197272301, -0.047349292784929276, 0.050332702696323395, -0.007268077693879604, -0.07756082713603973, 0.16585899889469147, -0.15564003586769104, 0.01809087023139, 0.19572502374649048, -0.018915493041276932, 0.07177707552909851, 0.021322092041373253, -0.0636206790804863, 0.23147478699684143, 0.3014698624610901, 0.008138049393892288, 0.1665448248386383, 0.3018903136253357, -0.07466315478086472, -0.2642788887023926, -0.05505012720823288, -0.2841376066207886, -0.05371501296758652, 0.10716094076633453, -0.22523896396160126, 0.06986407935619354, 0.14383509755134583, -0.06471995264291763, 0.30228954553604126, -0.21825523674488068, 0.012589273042976856, 0.15434536337852478, -0.08868814259767532, 0.5515313148498535, -0.1133413165807724, -0.17677772045135498, -0.008122089318931103, -0.08741296827793121, 0.10602109134197235, -0.0340677872300148, 0.06877441704273224, 0.013465235009789467, 0.04797380417585373, 0.048932258039712906, -0.03111894056200981, 0.22701001167297363, 0.008710170164704323, 0.09015397727489471, -0.07378865778446198, -0.18624304234981537, 0.11639340221881866, -0.04359482601284981, -0.08891059458255768, 0.0849778801202774, -0.05942516401410103, -0.11078983545303345, 0.04663389176130295, -0.07950539886951447, -0.024862350896000862, 0.08423490077257156, -0.04678233340382576, -0.042606171220541, -0.008054176345467567, -0.1618063747882843, 
-0.0002289071271661669, 0.31360217928886414, -0.07096036523580551, 0.16695955395698547, 0.03677211329340935, 0.00038613268407061696, -0.11027684062719345, 0.030288029462099075, -0.05203165486454964, -0.021576624363660812, 0.09578979015350342, -0.11096979677677155, 0.03204701095819473, 0.14160704612731934, -0.04864364117383957, 0.05846960097551346, 0.09256096184253693, -0.0849417969584465, 0.007583672646433115, 0.17753590643405914, -0.17537221312522888, -0.1273445188999176, -0.006135711446404457, -0.09862716495990753, 0.14055661857128143, 0.04394126310944557, 0.05191568285226822, 0.16669964790344238, 0.03967129811644554, -0.029474308714270592, -0.02817419543862343, -0.1153380498290062, -0.0201893113553524, 0.040153320878744125, 0.00045633706031367183, -0.08791285753250122, 0.2262638509273529, 0.06409153342247009, -0.1328488290309906, -0.051157206296920776, 0.2161225974559784, -0.06805316358804703, -0.04911920800805092, -0.223562553524971, 0.10752306133508682, -0.07112517952919006, -0.0965060144662857, 0.05453834682703018, -0.02270081453025341, 0.005106312222778797, 0.181985542178154, 0.03941008821129799, 0.11070270836353302, 0.03738937899470329, -0.02448922023177147, 0.15798696875572205, -0.142850860953331, -0.14191335439682007, -0.025354057550430298, -0.08757315576076508, -0.13844476640224457, -0.026804137974977493, 0.1617041826248169, -0.09177309274673462, -0.14772607386112213, -0.2621181011199951, 0.10968475043773651, -0.16432365775108337, -0.10192688554525375, -0.03469514101743698, -0.08968492597341537, 0.0696166530251503, 0.030301768332719803, -0.03093348816037178, -0.06706760823726654, -0.18593791127204895, 0.0816768929362297, 0.06349513679742813, 0.045533183962106705, -0.017847947776317596, 0.0067379772663116455, 0.1720137596130371, 0.025955144315958023, 0.10040043294429779, 0.16762186586856842, 0.011397695168852806, 0.2246655523777008, -0.1671202927827835, -0.11496317386627197, 0.1336962729692459, -0.026543032377958298, 0.06762003898620605, 0.16792191565036774, -0.0772583931684494, 0.015526676550507545, -0.028136352077126503, 0.07066910713911057, -0.11003983020782471, -0.105624258518219, 0.007937257178127766, 0.02567129209637642, -0.2755882740020752, -0.005599735304713249, -0.19717298448085785, 0.14788752794265747, 0.02579621411859989, 0.03297143429517746, 0.10257530212402344, 0.10404334217309952, 0.08312062919139862, -0.0017710148822516203, 0.03226327523589134, -0.1176818460226059, 0.02753005363047123, -0.059239376336336136, -0.020663779228925705, 0.017624232918024063, 0.36952024698257446, -0.03603357449173927, -0.046802736818790436, 0.003710439894348383, 0.1307835876941681, -0.02139742486178875, 0.017395347356796265, 0.13209912180900574, 0.12607666850090027, -0.08595693111419678, -0.1504845917224884, 0.04888554662466049, -0.04565655067563057, -0.02836887165904045, 0.1464131623506546, 0.05905961990356445, 0.1050296202301979, 0.0908031314611435, -0.014463032595813274, -0.00318976235575974, 0.012856799177825451, -0.15486004948616028, 0.06223496049642563, -0.010558074340224266, 0.012565906159579754, 0.017934376373887062, 0.15238402783870697, -0.005540105979889631, 0.07739730179309845, -0.09889880567789078, 0.004208535887300968, -0.13498884439468384, -0.07913459837436676, 0.03617347031831741, -0.13393273949623108, 0.04141177982091904, -0.01871878281235695, 0.029611799865961075, 0.30386561155319214, 0.02558239921927452, -0.020639164373278618, 0.12512871623039246, -0.1214587539434433, -0.12050267308950424, -0.001594188273884356, -0.029960084706544876, 0.0791488066315651, 
-0.02633434161543846, -0.0997740775346756, -0.1001306027173996, -0.15166029334068298, -0.09759195148944855, 0.05182836204767227, -0.04993441700935364, -0.059362251311540604, -0.17634081840515137, -0.05707859992980957, -0.05147340148687363, 0.14025864005088806, -0.12263951450586319, 0.15159130096435547, -0.014490418136119843, 0.004084470681846142, 0.04405883327126503, 0.1950942426919937, -0.03644494712352753, 0.08714226633310318, 0.0154351145029068, 0.1522706001996994, -0.05119588226079941, 0.14720745384693146, -0.10931728035211563, -0.04014137014746666, -0.06710435450077057, 0.21513493359088898, 0.25630924105644226, -0.06136954948306084, -0.008937356993556023, -0.012760217301547527, 0.058654606342315674, 0.1073930487036705, 0.16049085557460785, 0.002326392102986574, 0.2802925705909729, -0.03133585304021835, 0.04815128445625305, 0.02901598811149597, 0.013607407920062542, -0.06336209923028946, 0.03397751972079277, 0.07539387792348862, -0.035039983689785004, -0.1412304788827896, 0.15837742388248444, -0.21980468928813934, 0.18157227337360382, 0.11640069633722305, -0.19996967911720276, -0.013728445395827293, -0.04882071167230606, 0.1689416468143463, -0.0856364443898201, 0.1637246012687683, -0.0903693437576294, -0.2108195722103119, -0.2056000679731369, 0.03867346793413162, -0.34623071551322937, -0.254462867975235, 0.10422009229660034, 0.1488201916217804, 0.04015883058309555, -0.018507536500692368, -0.019967829808592796, -0.018367022275924683, 0.04877542704343796, -0.0067357709631323814, 0.06014643982052803, 0.031397558748722076, -0.02988368645310402, -0.24127542972564697, -0.029804671183228493, 0.023964406922459602, -0.07093082368373871, 0.07464958727359772, -0.06874357163906097, -0.022495782002806664, 0.08059766888618469, -0.03066304884850979, 0.03298592567443848, -0.035373736172914505, -0.16326889395713806, 0.027529051527380943, 0.03900543600320816, 0.036012712866067886, 0.00634160777553916, 0.0008072225609794259, -0.03455270454287529, 0.0644603744149208, -0.16716794669628143, -0.16015739738941193, 0.14140215516090393, -0.06745140254497528, 0.2779497504234314, -0.05812826007604599, -0.0809100940823555, 0.04766704887151718, -0.03426874056458473, 0.1807648241519928, -0.07756473124027252, 0.047254521399736404, 0.12766779959201813, 0.011127962730824947, 0.03121316432952881, -0.3092964291572571, 0.11082969605922699, -0.000795336440205574, -0.006093299947679043, -0.07581598311662674 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-distilled-clinc This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the clinc_oos dataset. It achieves the following results on the evaluation set: - Loss: 0.1259 - Accuracy: 0.9332 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 48 - eval_batch_size: 48 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 7 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 318 | 0.5952 | 0.7355 | | 0.7663 | 2.0 | 636 | 0.3130 | 0.8742 | | 0.7663 | 3.0 | 954 | 0.2024 | 0.9206 | | 0.3043 | 4.0 | 1272 | 0.1590 | 0.9235 | | 0.181 | 5.0 | 1590 | 0.1378 | 0.9303 | | 0.181 | 6.0 | 1908 | 0.1287 | 0.9329 | | 0.1468 | 7.0 | 2226 | 0.1259 | 0.9332 | ### Framework versions - Transformers 4.16.2 - Pytorch 1.10.2+cu102 - Datasets 1.18.3 - Tokenizers 0.11.0
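The card above documents an intent classifier distilled onto distilbert-base-uncased and fine-tuned on clinc_oos. As a hedged illustration only (the checkpoint id is taken from this record's `id` field, and the example utterance and printed label are made up), it could be queried with the standard transformers pipeline API:

```python
from transformers import pipeline

# Load the fine-tuned intent classifier; the model id comes from this record's "id" field.
intent_classifier = pipeline(
    "text-classification",
    model="Omar95farag/distilbert-base-uncased-distilled-clinc",
)

# The returned label is one of the clinc_oos intent classes (string names or LABEL_n ids,
# depending on how the checkpoint's config was saved).
result = intent_classifier("What is the exchange rate between dollars and euros?")
print(result)  # e.g. [{'label': 'exchange_rate', 'score': 0.97}] -- illustrative output only
```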
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["clinc_oos"], "metrics": ["accuracy"], "model-index": [{"name": "distilbert-base-uncased-distilled-clinc", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "clinc_oos", "type": "clinc_oos", "args": "plus"}, "metrics": [{"type": "accuracy", "value": 0.9332258064516129, "name": "Accuracy"}]}, {"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "clinc_oos", "type": "clinc_oos", "config": "small", "split": "test"}, "metrics": [{"type": "accuracy", "value": 0.8587272727272727, "name": "Accuracy", "verified": true}, {"type": "precision", "value": 0.8619245385984416, "name": "Precision Macro", "verified": true}, {"type": "precision", "value": 0.8587272727272727, "name": "Precision Micro", "verified": true}, {"type": "precision", "value": 0.8797945801452213, "name": "Precision Weighted", "verified": true}, {"type": "recall", "value": 0.9359690949227375, "name": "Recall Macro", "verified": true}, {"type": "recall", "value": 0.8587272727272727, "name": "Recall Micro", "verified": true}, {"type": "recall", "value": 0.8587272727272727, "name": "Recall Weighted", "verified": true}, {"type": "f1", "value": 0.8922503214655346, "name": "F1 Macro", "verified": true}, {"type": "f1", "value": 0.8587272727272727, "name": "F1 Micro", "verified": true}, {"type": "f1", "value": 0.8506829426037475, "name": "F1 Weighted", "verified": true}, {"type": "loss", "value": 0.9798759818077087, "name": "loss", "verified": true}]}]}]}
text-classification
Omar95farag/distilbert-base-uncased-distilled-clinc
[ "transformers", "pytorch", "distilbert", "text-classification", "generated_from_trainer", "dataset:clinc_oos", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #distilbert #text-classification #generated_from_trainer #dataset-clinc_oos #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
distilbert-base-uncased-distilled-clinc ======================================= This model is a fine-tuned version of distilbert-base-uncased on the clinc\_oos dataset. It achieves the following results on the evaluation set: * Loss: 0.1259 * Accuracy: 0.9332 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 48 * eval\_batch\_size: 48 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 7 ### Training results ### Framework versions * Transformers 4.16.2 * Pytorch 1.10.2+cu102 * Datasets 1.18.3 * Tokenizers 0.11.0
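For readers who want the listed hyperparameters in executable form, here is a minimal sketch of the corresponding `TrainingArguments`; the values come from the card above, while `output_dir` and `evaluation_strategy` are assumptions not stated there (the Adam betas/epsilon and the linear scheduler match the Trainer defaults):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters reported in the card; see the lead-in for which values are assumed.
training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-distilled-clinc",  # assumption, not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=48,
    per_device_eval_batch_size=48,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=7,
    evaluation_strategy="epoch",  # assumption: the card reports per-epoch validation metrics
)
```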
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 48\n* eval\\_batch\\_size: 48\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 7", "### Training results", "### Framework versions\n\n\n* Transformers 4.16.2\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0" ]
[ "TAGS\n#transformers #pytorch #distilbert #text-classification #generated_from_trainer #dataset-clinc_oos #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 48\n* eval\\_batch\\_size: 48\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 7", "### Training results", "### Framework versions\n\n\n* Transformers 4.16.2\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0" ]
[ 66, 98, 4, 35 ]
[ "passage: TAGS\n#transformers #pytorch #distilbert #text-classification #generated_from_trainer #dataset-clinc_oos #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 48\n* eval\\_batch\\_size: 48\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 7### Training results### Framework versions\n\n\n* Transformers 4.16.2\n* Pytorch 1.10.2+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0" ]
[ -0.10624545812606812, 0.08661370724439621, -0.001675302512012422, 0.12667740881443024, 0.16414259374141693, 0.027780307456851006, 0.10833047330379486, 0.12529399991035461, -0.09676161408424377, 0.01764763705432415, 0.11109938472509384, 0.16573821008205414, 0.02727479673922062, 0.12059886008501053, -0.0833330750465393, -0.2401067614555359, 0.0036538911517709494, 0.03820004314184189, -0.06479282677173615, 0.12576577067375183, 0.09774724394083023, -0.1148993968963623, 0.10119415074586868, 0.009369483217597008, -0.17720749974250793, 0.011948203667998314, 0.00437526311725378, -0.06610886752605438, 0.11769925802946091, 0.037738751620054245, 0.10006029158830643, 0.007521272636950016, 0.08987169712781906, -0.19710056483745575, 0.0052537983283400536, 0.042830996215343475, -0.014574709348380566, 0.07258975505828857, 0.033733099699020386, 0.00429279450327158, 0.15523386001586914, -0.1016901507973671, 0.051369454711675644, 0.02079678699374199, -0.11698374152183533, -0.2054462730884552, -0.07629501074552536, 0.03196282684803009, 0.09208063036203384, 0.13466230034828186, 0.0023685914929956198, 0.13163594901561737, -0.11682996898889542, 0.08739764243364334, 0.20839162170886993, -0.26006028056144714, -0.06244510039687157, 0.014851790852844715, 0.008691859431564808, 0.04943062365055084, -0.10961295664310455, -0.057031307369470596, 0.044204991310834885, 0.03823542222380638, 0.09808842092752457, -0.043747156858444214, -0.09082157909870148, 0.021116076037287712, -0.13106630742549896, -0.037052903324365616, 0.18354421854019165, 0.06819751113653183, -0.04085879400372505, -0.0277774129062891, -0.05475873872637749, -0.16221114993095398, -0.031153161078691483, 0.006313842721283436, 0.06877177953720093, -0.02135160006582737, -0.027934862300753593, 0.02066228538751602, -0.11018606275320053, -0.041875340044498444, -0.09638924151659012, 0.1328384131193161, 0.02471908926963806, 0.01347474567592144, -0.02444152534008026, 0.09727754443883896, 0.025043804198503494, -0.12183479964733124, -0.014653751626610756, 0.038588788360357285, 0.019101377576589584, -0.040593527257442474, -0.06183131784200668, -0.014543533325195312, 0.029200507327914238, 0.11040481925010681, -0.0364483967423439, 0.031150054186582565, 0.03503870591521263, 0.03343271464109421, -0.07681206613779068, 0.19677694141864777, -0.01689527928829193, -0.012729267589747906, 0.024718957021832466, 0.03762296587228775, 0.010555448941886425, -0.005993723403662443, -0.11755143851041794, 0.011896833777427673, 0.0736306756734848, -0.004603134468197823, -0.066510409116745, 0.06431866437196732, -0.06725683808326721, -0.02574598416686058, -0.007015057373791933, -0.10293290764093399, 0.048775479197502136, 0.0030792311299592257, -0.08594737946987152, -0.01405243668705225, 0.035174984484910965, 0.03747870773077011, -0.03680947422981262, 0.09333605319261551, -0.0840643048286438, 0.035992298275232315, -0.09255073219537735, -0.07946687936782837, 0.012115021236240864, -0.08966752141714096, 0.047315336763858795, -0.10226605087518692, -0.17974629998207092, -0.04104987904429436, 0.056152161210775375, -0.008434792049229145, -0.08084121346473694, -0.0863407775759697, -0.07220278680324554, 0.013732184655964375, -0.008486711420118809, 0.11462639272212982, -0.07101387530565262, 0.08675852417945862, 0.030769089236855507, 0.044093284755945206, -0.07671461999416351, 0.05738384276628494, -0.1311824470758438, 0.009378565475344658, -0.11698029935359955, 0.033711813390254974, -0.01985064335167408, 0.08308614045381546, -0.06462167203426361, -0.10104264318943024, 0.025394175201654434, 
0.0054634646512568, 0.04978695884346962, 0.09157764911651611, -0.1532628834247589, -0.06351479887962341, 0.12321314960718155, -0.06001051887869835, -0.12868846952915192, 0.10369223356246948, -0.05773185193538666, 0.03567530959844589, 0.05510607361793518, 0.14567899703979492, 0.06105898320674896, -0.06416695564985275, 0.003098964225500822, -0.006001901812851429, 0.06258789449930191, -0.07852640002965927, 0.09415728598833084, 0.010566333308815956, 0.0039642308838665485, 0.03385186940431595, -0.033370088785886765, 0.04006491228938103, -0.07630950957536697, -0.1042398139834404, -0.04934892803430557, -0.08084649592638016, -0.0016543042147532105, 0.0734124481678009, 0.0706235021352768, -0.10589120537042618, -0.07376530021429062, 0.031325772404670715, 0.10027821362018585, -0.05397748574614525, 0.02102045901119709, -0.07092329114675522, 0.06760144233703613, -0.04289942607283592, -0.01955045387148857, -0.1699264496564865, -0.003038486000150442, 0.002304807770997286, 0.022768843919038773, 0.006894058547914028, 0.02451452612876892, 0.05912678316235542, 0.05587853118777275, -0.03254430368542671, -0.028835518285632133, -0.030031055212020874, -0.0018815200310200453, -0.11814401298761368, -0.19009888172149658, -0.021653855219483376, -0.016852395609021187, 0.15822266042232513, -0.22692525386810303, 0.03535640984773636, -0.010086862370371819, 0.072825588285923, 0.007788150571286678, -0.0008785433601588011, -0.05714300647377968, 0.08109021931886673, -0.05168398469686508, -0.05488601699471474, 0.06610358506441116, 0.020073294639587402, -0.08726269751787186, -0.06353074312210083, -0.07652324438095093, 0.19656583666801453, 0.14065159857273102, -0.09869696199893951, -0.04856465011835098, -0.00517510948702693, -0.07814083248376846, -0.02716170810163021, -0.046486709266901016, 0.05983247980475426, 0.21780931949615479, -0.032003145664930344, 0.1297861486673355, -0.06588423997163773, -0.0318605899810791, 0.024485627189278603, -0.042225223034620285, 0.014542308636009693, 0.13199740648269653, 0.13476765155792236, -0.10322287678718567, 0.15792612731456757, 0.14622047543525696, -0.07679162174463272, 0.11325104534626007, -0.049291953444480896, -0.06331951916217804, -0.022104009985923767, -0.03023199923336506, -0.011291947215795517, 0.08796243369579315, -0.16563519835472107, 0.010783323086798191, 0.022067073732614517, 0.01916361227631569, 0.020224913954734802, -0.2131257951259613, -0.03599212318658829, 0.04881007596850395, -0.02724633179605007, -0.0407792329788208, -0.02870332822203636, 0.006082163657993078, 0.09576424211263657, -0.0030934142414480448, -0.10181446373462677, 0.06044106185436249, 0.007986070588231087, -0.08148123323917389, 0.2135593742132187, -0.08600444346666336, -0.15453870594501495, -0.12422402203083038, -0.070499949157238, -0.06579140573740005, 0.015857376158237457, 0.0727567970752716, -0.07187854498624802, -0.040649399161338806, -0.08440690487623215, 0.01487342081964016, 0.010372673161327839, 0.038042839616537094, 0.03278288617730141, 0.017748940736055374, 0.06876927614212036, -0.09140811860561371, -0.03843558952212334, -0.04291360452771187, -0.06842868030071259, 0.03790505975484848, 0.0237526074051857, 0.12410717457532883, 0.11937254667282104, -0.013731243088841438, 0.004653175361454487, -0.007785776164382696, 0.20609793066978455, -0.06433255225419998, -0.049978990107774734, 0.1328423172235489, 0.0021499430295079947, 0.03200327977538109, 0.1153682991862297, 0.04959066957235336, -0.09048789739608765, 0.006136697251349688, 0.03174986317753792, -0.021378276869654655, -0.22624154388904572, 
-0.04562852904200554, -0.06672821193933487, -0.02309809811413288, 0.0934201329946518, 0.036685239523649216, 0.042990341782569885, 0.07133898138999939, 0.0477842278778553, 0.1073324978351593, -0.04294200986623764, 0.05199198052287102, 0.11117658019065857, 0.05917287617921829, 0.10520516335964203, -0.0354352705180645, -0.05575265362858772, 0.05012309551239014, -0.030116738751530647, 0.21013526618480682, 0.010185904800891876, 0.11510082334280014, 0.043794896453619, 0.16108214855194092, -0.01831556111574173, 0.07672398537397385, 0.0167236365377903, -0.030705789104104042, -0.013402105309069157, -0.02734566479921341, -0.04555124044418335, 0.0414036326110363, -0.06098964810371399, 0.0853642076253891, -0.15584582090377808, 0.03114396333694458, 0.05485876277089119, 0.2688453197479248, 0.016223864629864693, -0.34212788939476013, -0.0888296514749527, 0.012130691669881344, -0.04305506497621536, -0.03709181770682335, 0.0427396222949028, 0.07281257957220078, -0.09228929877281189, 0.0186627097427845, -0.03835136443376541, 0.10240116715431213, -0.051816605031490326, 0.044700298458337784, 0.07746624946594238, 0.08140487968921661, 0.015751400962471962, 0.0997595340013504, -0.31339651346206665, 0.26015761494636536, -0.0089661730453372, 0.08360947668552399, -0.08687826246023178, 0.009262009523808956, 0.028384380042552948, 0.08191762119531631, 0.08687793463468552, -0.009808114729821682, -0.06163103133440018, -0.18145106732845306, -0.07123439759016037, 0.042825888842344284, 0.04731786623597145, -0.0637107640504837, 0.0936666876077652, -0.034781020134687424, 0.0061266799457371235, 0.05843605473637581, 0.003931433893740177, -0.034027379006147385, -0.09870649129152298, -0.012150215916335583, 0.047027021646499634, -0.023297326639294624, -0.07002723217010498, -0.09958356618881226, -0.09439932554960251, 0.1691814512014389, -0.011394262313842773, -0.024294551461935043, -0.1177077665925026, 0.08908537030220032, 0.06005393713712692, -0.08767413347959518, 0.017104744911193848, 0.023349657654762268, 0.06504029780626297, 0.046827610582113266, -0.08238987624645233, 0.11820191144943237, -0.06702904403209686, -0.1612655520439148, -0.06228325888514519, 0.10062097758054733, 0.028552599251270294, 0.06633459776639938, -0.005764489062130451, 0.008801233023405075, -0.0498594231903553, -0.08109511435031891, 0.011145233176648617, 0.03170829266309738, 0.10250439494848251, 0.03001207672059536, -0.04883759841322899, 0.004445324186235666, -0.06476007401943207, -0.043352171778678894, 0.18288610875606537, 0.21222206950187683, -0.08037137240171432, 0.021194729954004288, -0.005716520827263594, -0.08523989468812943, -0.16512037813663483, 0.03378669172525406, 0.046648453921079636, 0.03036806918680668, 0.009886867366731167, -0.15465109050273895, 0.14928971230983734, 0.11900141835212708, -0.005335756111890078, 0.10451516509056091, -0.3111400604248047, -0.11550368368625641, 0.14224043488502502, 0.12709300220012665, 0.16305097937583923, -0.13727107644081116, 0.004044313449412584, -0.04171145334839821, -0.1441129595041275, 0.11030212789773941, -0.07537272572517395, 0.10684889554977417, -0.04263949394226074, 0.07826168090105057, 0.009041279554367065, -0.05011666938662529, 0.1284075379371643, 0.03275668993592262, 0.09763318300247192, -0.08655527979135513, -0.03404286876320839, 0.012705272994935513, -0.035325318574905396, 0.019317690283060074, -0.08947952836751938, 0.030018365010619164, -0.1383252888917923, -0.03522719070315361, -0.06014218181371689, 0.03838466480374336, -0.037650927901268005, -0.05012674629688263, -0.023006591945886612, 
0.023655137047171593, 0.07945786416530609, -0.0005160000291652977, 0.16101843118667603, 0.026242852210998535, 0.11536789685487747, 0.07357963919639587, 0.07939513027667999, -0.06947912275791168, -0.05948319658637047, -0.02622530236840248, 0.0008000234374776483, 0.05014382675290108, -0.13452868163585663, 0.025227652862668037, 0.15763047337532043, 0.007529782131314278, 0.1506948620080948, 0.09004689007997513, 0.007861464284360409, 0.0004651809576898813, 0.05320296436548233, -0.15838226675987244, -0.05525226145982742, -0.02450280450284481, -0.0536542609333992, -0.11243969947099686, 0.04080123081803322, 0.09136443585157394, -0.07456167042255402, -0.00839966256171465, -0.015421366319060326, 0.03918054327368736, -0.08475194126367569, 0.16483937203884125, 0.03385642543435097, 0.0430915430188179, -0.09975975006818771, 0.06975029408931732, 0.06645585596561432, -0.08133267611265182, 0.008599556051194668, 0.05780448392033577, -0.07381686568260193, -0.051215000450611115, 0.10503540188074112, 0.19727861881256104, -0.047214001417160034, -0.06890563666820526, -0.149373397231102, -0.1414700597524643, 0.0943099707365036, 0.13196294009685516, 0.11843199282884598, 0.022104019299149513, -0.052315305918455124, -0.023137765005230904, -0.1314932107925415, 0.06254056841135025, 0.037306442856788635, 0.05900677293539047, -0.14460036158561707, 0.10690417140722275, -0.018484750762581825, 0.03945007920265198, -0.00870188232511282, 0.02992800995707512, -0.11472684144973755, 0.0078105442225933075, -0.09752985090017319, -0.00624309154227376, -0.03553354740142822, 0.026057498529553413, 0.014136948622763157, -0.06719724833965302, -0.0630708709359169, 0.02496851235628128, -0.111801378428936, -0.029223687946796417, 0.03424796462059021, 0.07440638542175293, -0.09005357325077057, -0.05312865599989891, 0.017807789146900177, -0.0708303153514862, 0.058172766119241714, 0.08078032732009888, 0.017433179542422295, 0.03032946027815342, -0.14578209817409515, 0.02278359793126583, 0.06946385651826859, 0.030435001477599144, 0.07379169017076492, -0.10157911479473114, -0.0012754268245771527, 0.039258502423763275, 0.020963123068213463, 0.01132298819720745, 0.07531888782978058, -0.1406656950712204, -0.02317158132791519, -0.02010808326303959, -0.09866500645875931, -0.060595475137233734, 0.011903170496225357, 0.11070434004068375, 0.012640939094126225, 0.21487723290920258, -0.0604538768529892, 0.052157994359731674, -0.20187847316265106, 0.0025488717947155237, -0.0060036676004529, -0.09438788145780563, -0.10401283949613571, -0.07848162204027176, 0.06472758203744888, -0.0469835102558136, 0.13775980472564697, 0.04087274521589279, 0.06881444901227951, 0.018975816667079926, -0.035288915038108826, 0.041509099304676056, 0.02712130732834339, 0.20629234611988068, 0.03768259659409523, -0.041152432560920715, 0.09367280453443527, 0.02521691471338272, 0.11829809099435806, 0.12822729349136353, 0.18956659734249115, 0.13584229350090027, 0.011763541027903557, 0.1176275834441185, 0.03303269296884537, -0.05606920272111893, -0.14921078085899353, 0.03549724072217941, -0.04008746147155762, 0.09828417003154755, -0.031331274658441544, 0.18333755433559418, 0.05132355913519859, -0.17042890191078186, 0.02572915144264698, -0.0660528764128685, -0.08140847086906433, -0.10943658649921417, -0.049496423453092575, -0.10059632360935211, -0.1432524025440216, 0.004406220279633999, -0.11890363693237305, 0.009153006598353386, 0.09143856912851334, -0.0016739394050091505, -0.026751795783638954, 0.14665937423706055, 0.003909492865204811, 0.036602895706892014, 0.05738265439867973, 
-0.012590755708515644, -0.038713496178388596, -0.11501245945692062, -0.07498740404844284, -0.0236178170889616, -0.04367200657725334, 0.03350403904914856, -0.06353305280208588, -0.03165990486741066, 0.04408576339483261, -0.028648607432842255, -0.0851980447769165, 0.009468551725149155, 0.000173506501596421, 0.05525674670934677, 0.0536142997443676, 0.029148604720830917, 0.024480387568473816, 0.009207469411194324, 0.20991875231266022, -0.07517315447330475, -0.06621250510215759, -0.11692255735397339, 0.21683713793754578, 0.06020543724298477, -0.03414818271994591, 0.043686266988515854, -0.06646700948476791, 0.001029499457217753, 0.2298547625541687, 0.1771164834499359, -0.06673350930213928, -0.010836801491677761, 0.004666190594434738, -0.010337800718843937, -0.0258333720266819, 0.09770184755325317, 0.14682403206825256, 0.04163895919919014, -0.08909823000431061, -0.06535607576370239, -0.055312883108854294, -0.00045484775910153985, -0.02851088158786297, 0.06234606355428696, 0.028497260063886642, 0.0067201051861047745, -0.016291219741106033, 0.035747792571783066, -0.06454046070575714, -0.08127111941576004, 0.08535357564687729, -0.20631788671016693, -0.1522548645734787, -0.024599147960543633, 0.09891993552446365, 0.014324173331260681, 0.06625299900770187, -0.025346511974930763, -0.02390080690383911, 0.09565301984548569, -0.015720145776867867, -0.10567818582057953, -0.06193286553025246, 0.08726299554109573, -0.10930600762367249, 0.2146630436182022, -0.04492291063070297, 0.0726730078458786, 0.11938219517469406, 0.08248142898082733, -0.08028523623943329, 0.06609353423118591, 0.026878705248236656, -0.05050823837518692, 0.05012219026684761, 0.06251740455627441, -0.04755255579948425, 0.06857821345329285, 0.04519965127110481, -0.12029611319303513, 0.018508493900299072, -0.077506884932518, -0.042303312569856644, -0.030754268169403076, -0.03269191458821297, -0.07689218968153, 0.12946265935897827, 0.21577420830726624, -0.03254956007003784, -0.010651476681232452, -0.06824064999818802, 0.04790510609745979, 0.03427461162209511, 0.007373668719083071, -0.059570539742708206, -0.19542329013347626, 0.009732285514473915, 0.043384235352277756, -0.019896280020475388, -0.21857193112373352, -0.09237757325172424, 0.0007231556810438633, -0.0963556319475174, -0.10558971017599106, 0.051904816180467606, 0.08665534108877182, 0.03936268016695976, -0.07960645109415054, -0.05255027860403061, -0.06807971745729446, 0.1509479433298111, -0.12444648891687393, -0.09482388198375702 ]
null
null
transformers
# keytotext [![pypi Version](https://img.shields.io/pypi/v/keytotext.svg?logo=pypi&logoColor=white)](https://pypi.org/project/keytotext/) [![Downloads](https://static.pepy.tech/personalized-badge/keytotext?period=total&units=none&left_color=grey&right_color=orange&left_text=Pip%20Downloads)](https://pepy.tech/project/keytotext) [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/gagan3012/keytotext/blob/master/notebooks/K2T.ipynb) [![Streamlit App](https://static.streamlit.io/badges/streamlit_badge_black_white.svg)](https://share.streamlit.io/gagan3012/keytotext/UI/app.py) [![API Call](https://img.shields.io/badge/-FastAPI-red?logo=fastapi&labelColor=white)](https://github.com/gagan3012/keytotext#api) [![Docker Call](https://img.shields.io/badge/-Docker%20Image-blue?logo=docker&labelColor=white)](https://hub.docker.com/r/gagan30/keytotext) [![HuggingFace](https://img.shields.io/badge/%F0%9F%A4%97-Models%20on%20Hub-yellow)](https://huggingface.co/models?filter=keytotext) [![Documentation Status](https://readthedocs.org/projects/keytotext/badge/?version=latest)](https://keytotext.readthedocs.io/en/latest/?badge=latest) [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black) ![keytotext](https://socialify.git.ci/gagan3012/keytotext/image?description=1&forks=1&language=1&owner=1&stargazers=1&theme=Light) The idea is to build a model that takes keywords as inputs and generates sentences as outputs. Potential use cases include: - Marketing - Search Engine Optimization - Topic generation, etc. - Fine-tuning of topic modeling models
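The keytotext project ships its own `pipeline` helper (see the badges above), but the checkpoint recorded for this entry can also be driven directly through transformers. The sketch below assumes the record's model id (`OnsElleuch/logisgenerator`) and a plain space-joined keyword string as input; the exact keyword-formatting convention used during fine-tuning is not documented here, so treat it as an assumption:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumptions: the model id is taken from this record; keywords are joined by spaces
# (the true input template used during fine-tuning is not stated in the card).
model_id = "OnsElleuch/logisgenerator"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

keywords = ["India", "wedding", "food"]
inputs = tokenizer(" ".join(keywords), return_tensors="pt")
outputs = model.generate(**inputs, num_beams=4, max_length=64, early_stopping=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```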
{"language": "en", "license": "MIT", "tags": ["keytotext", "k2t", "Keywords to Sentences"], "datasets": ["WebNLG", "Dart"], "metrics": ["NLG"], "thumbnail": "Keywords to Sentences"}
text2text-generation
OnsElleuch/logisgenerator
[ "transformers", "pytorch", "t5", "text2text-generation", "keytotext", "k2t", "Keywords to Sentences", "en", "dataset:WebNLG", "dataset:Dart", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #t5 #text2text-generation #keytotext #k2t #Keywords to Sentences #en #dataset-WebNLG #dataset-Dart #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# keytotext ![pypi Version](URL ![Downloads](URL ![Open In Colab](URL ![Streamlit App](URL ![API Call](URL ![Docker Call](URL ![HuggingFace](URL ![Documentation Status](URL ![Code style: black](URL !keytotext The idea is to build a model that takes keywords as inputs and generates sentences as outputs. Potential use cases include: - Marketing - Search Engine Optimization - Topic generation, etc. - Fine-tuning of topic modeling models
[]
[ "TAGS\n#transformers #pytorch #t5 #text2text-generation #keytotext #k2t #Keywords to Sentences #en #dataset-WebNLG #dataset-Dart #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 78 ]
[ "passage: TAGS\n#transformers #pytorch #t5 #text2text-generation #keytotext #k2t #Keywords to Sentences #en #dataset-WebNLG #dataset-Dart #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.03903217613697052, 0.07688179612159729, -0.0061773881316185, 0.007946956902742386, 0.10117848217487335, 0.02843649871647358, 0.09411219507455826, 0.15868787467479706, 0.006724410690367222, -0.057724714279174805, 0.1404922604560852, 0.2025938332080841, 0.017940107733011246, 0.13757465779781342, -0.06942672282457352, -0.2564336359500885, 0.03815453499555588, 0.037285901606082916, -0.019459573552012444, 0.14391419291496277, 0.08486589789390564, -0.05302097275853157, 0.04979008436203003, -0.019126372411847115, -0.1216764822602272, 0.054594650864601135, 0.030159002169966698, -0.13637526333332062, 0.10469778627157211, 0.018528535962104797, 0.08310289680957794, 0.05006900057196617, -0.05860939249396324, -0.2161225974559784, 0.025030845776200294, 0.008722852915525436, -0.045329101383686066, 0.03354315459728241, 0.03338639438152313, -0.046357449144124985, 0.05640164762735367, -0.06243095174431801, -0.017651746049523354, 0.04220695421099663, -0.09832804650068283, -0.04187982156872749, -0.05007176101207733, 0.043057892471551895, 0.08319853991270065, 0.0737672746181488, -0.040723055601119995, 0.15581513941287994, -0.07629773765802383, 0.14664098620414734, 0.1368003636598587, -0.3269534111022949, 0.012729325331747532, -0.0035922133829444647, 0.004724225029349327, 0.06208911910653114, -0.02152496948838234, 0.06762268394231796, 0.012842950411140919, 0.017713449895381927, 0.04426150768995285, -0.09849993884563446, -0.24517373740673065, 0.04561421275138855, -0.07974745333194733, -0.020331554114818573, 0.289760559797287, -0.02673063613474369, 0.08189614117145538, -0.06428142637014389, -0.08825667947530746, 0.03182201087474823, -0.023756423965096474, -0.030087575316429138, -0.05645490437746048, 0.04441112279891968, -0.010185645893216133, -0.08027484267950058, -0.15373766422271729, 0.005433336831629276, -0.20648396015167236, 0.10115378350019455, 0.024363122880458832, 0.035511452704668045, -0.18737928569316864, 0.05462529510259628, 0.03907693922519684, -0.12055029720067978, 0.04595165699720383, -0.053805362433195114, 0.018455632030963898, -0.0044195144437253475, -0.03764479234814644, -0.08237682282924652, 0.15702591836452484, -0.021671166643500328, 0.003712681820616126, 0.003861523699015379, -0.08378283679485321, 0.05623997002840042, 0.062132399529218674, 0.07123706489801407, -0.02089962735772133, -0.07825490087270737, 0.05290557071566582, -0.1396183967590332, 0.009325655177235603, -0.04026453197002411, -0.09688538312911987, -0.06370905786752701, 0.10109986364841461, 0.08713588118553162, 0.08178628981113434, 0.10759539902210236, -0.025620993226766586, -0.017821846529841423, 0.012486998923122883, -0.07545977830886841, -0.027385571971535683, 0.0017266568029299378, -0.014242880046367645, 0.10150834172964096, 0.0165895763784647, 0.07835089415311813, -0.134853333234787, 0.019631991162896156, -0.07318528741598129, -0.0271466001868248, 0.03464220091700554, -0.08904528617858887, 0.05325685441493988, -0.0927010253071785, 0.0202367901802063, -0.1247885525226593, -0.17206141352653503, 0.01621382124722004, -0.03229447826743126, 0.0012832974316552281, -0.0709434524178505, -0.08504429459571838, -0.04151918739080429, 0.05391307175159454, -0.0692572221159935, 0.008550930768251419, -0.04580128937959671, 0.11258784681558609, -0.07574059069156647, 0.089997299015522, -0.11348620057106018, 0.042653366923332214, -0.15919475257396698, -0.07423004508018494, -0.053964316844940186, 0.10881427675485611, -0.014113583602011204, 0.12761420011520386, -0.08176441490650177, -0.007769408170133829, -0.03169988840818405, 
0.034524861723184586, -0.045046985149383545, 0.24183803796768188, -0.12320122867822647, -0.10078071057796478, 0.2652226984500885, -0.039097171276807785, -0.16684569418430328, 0.07894851267337799, 0.0009547838126309216, 0.14258721470832825, 0.15320105850696564, 0.2657999098300934, 0.07426482439041138, -0.02078670635819435, 0.00763301644474268, 0.11893578618764877, -0.0869615450501442, -0.035517383366823196, 0.01223339419811964, -0.018862366676330566, -0.013180753216147423, 0.054628338664770126, 0.12370645999908447, 0.052965909242630005, -0.027910005301237106, -0.05075078457593918, -0.018453463912010193, -0.018193617463111877, 0.08231326192617416, -0.02094157226383686, 0.07350154966115952, -0.07670266926288605, -0.028671003878116608, 0.03127618506550789, 0.006361823063343763, -0.019136076793074608, 0.04919913783669472, -0.03604045882821083, -0.027698056772351265, -0.0022763845045119524, 0.052324894815683365, -0.1082569807767868, -0.12414075434207916, -0.012394402176141739, 0.18391548097133636, 0.03078749030828476, 0.0533563457429409, 0.030673427507281303, -0.07504592090845108, -0.01635466329753399, -0.0037628004793077707, 0.1724305897951126, 0.03408222645521164, -0.08487121015787125, -0.11941272765398026, 0.1011456772685051, -0.032992489635944366, -0.015726841986179352, 0.04496084898710251, -0.004146311432123184, 0.1094067245721817, 0.10359770059585571, -0.01346167828887701, 0.06498674303293228, 0.020176568999886513, -0.005791624076664448, -0.03126189485192299, 0.02556530199944973, 0.06685677170753479, -0.006117179058492184, -0.12800662219524384, 0.20311257243156433, -0.07922592759132385, 0.1098286360502243, 0.17013971507549286, -0.1630544364452362, 0.052374500781297684, -0.05337982624769211, -0.03546663746237755, -0.02642180025577545, 0.052608367055654526, -0.02748814783990383, 0.1092694103717804, 0.04293446242809296, 0.1276484578847885, -0.055808719247579575, -0.08623116463422775, -0.0014845937257632613, -0.07218868285417557, -0.03463004156947136, 0.07184917479753494, -0.05464067682623863, -0.2767590880393982, 0.151351660490036, 0.14284931123256683, 0.07120442390441895, 0.218695268034935, -0.020985165610909462, -0.021876879036426544, 0.03600084409117699, -0.02205980382859707, -0.0722251608967781, -0.06470290571451187, -0.13558830320835114, -0.0033426089212298393, 0.07773612439632416, 0.021976018324494362, 0.07477480918169022, -0.10188990086317062, -0.04983449727296829, -0.03033437579870224, -0.024645287543535233, -0.0009964666096493602, 0.09942738711833954, 0.06909600645303726, 0.13495886325836182, 0.01318643894046545, 0.04001353681087494, 0.08726658672094345, -0.011574335396289825, -0.12594421207904816, 0.183069109916687, -0.20184381306171417, -0.3897164463996887, -0.06920620054006577, -0.08202673494815826, -0.0433579757809639, -0.023706885054707527, 0.134678915143013, -0.158100888133049, 0.007603765465319157, 0.0012842394644394517, 0.04123978316783905, -0.06566179543733597, 0.0188005268573761, -0.06776919960975647, 0.02648961916565895, -0.09561283141374588, -0.0856005847454071, -0.05406622216105461, -0.013013672083616257, -0.069023497402668, 0.1326999068260193, -0.1190083771944046, 0.0657150074839592, 0.15551768243312836, 0.01996716484427452, 0.04895695298910141, -0.08226631581783295, 0.16687576472759247, -0.06638176739215851, 0.06961362808942795, 0.13507559895515442, -0.028373628854751587, 0.08798438310623169, 0.15129351615905762, -0.013650152832269669, -0.0232254546135664, 0.049437958747148514, 0.018354935571551323, -0.026459643617272377, -0.26186463236808777, -0.12826767563819885, 
-0.07437894493341446, 0.08185781538486481, -0.0017615138785913587, 0.04023335501551628, 0.14391960203647614, 0.06444190442562103, -0.027538392692804337, -0.020490268245339394, 0.022095942869782448, 0.06953368335962296, 0.25473225116729736, -0.02774951606988907, 0.1169879361987114, -0.06025315448641777, -0.11819146573543549, 0.08268919587135315, 0.11264616996049881, 0.050551220774650574, 0.01648484170436859, 0.10656298696994781, 0.05184439942240715, 0.11337745189666748, 0.04947604611515999, 0.07878274470567703, 0.03159871697425842, 0.0007107985438778996, -0.015850944444537163, -0.05846184492111206, -0.01383562758564949, 0.05181414261460304, 0.034767843782901764, -0.10243432223796844, -0.03443082794547081, -0.01150580495595932, 0.12185569107532501, 0.13546621799468994, 0.08351317793130875, -0.12537677586078644, 0.005218727979809046, 0.07765185832977295, -0.044701624661684036, -0.10390593111515045, 0.08798053115606308, -0.005960504990071058, -0.13519296050071716, 0.11843277513980865, -0.032555289566516876, 0.0880533754825592, -0.04732156917452812, 0.06625839322805405, -0.052118733525276184, -0.09939749538898468, 0.002937457524240017, 0.13495902717113495, -0.3500785231590271, 0.18479159474372864, 0.020522842183709145, -0.03739010915160179, -0.11735451966524124, -0.024229316040873528, -0.020654641091823578, 0.018029769882559776, 0.15043169260025024, 0.0040740673430264, 0.022580603137612343, -0.023493951186537743, -0.055756766349077225, 0.054868098348379135, 0.08672568202018738, -0.03158225491642952, 0.02804802730679512, -0.047701746225357056, -0.008736777119338512, 0.001647863071411848, -0.03301027789711952, -0.0661146268248558, -0.13900674879550934, 0.05676543340086937, 0.08871498703956604, -0.018053695559501648, 0.011931195855140686, -0.030276063829660416, -0.023456571623682976, 0.2034592479467392, -0.06985244899988174, -0.13113707304000854, -0.12003286927938461, -0.034163400530815125, 0.059256117790937424, -0.10904631018638611, 0.007828399538993835, -0.06547124683856964, 0.016343815252184868, -0.08073829859495163, -0.19135794043540955, 0.10531875491142273, -0.07745100557804108, -0.033804330974817276, -0.04372392222285271, 0.1793138086795807, -0.01073919702321291, 0.015988832339644432, 0.03128870576620102, 0.010686389170587063, -0.06278391927480698, -0.06007697433233261, 0.041904956102371216, -0.029453665018081665, 0.035968679934740067, 0.02807362750172615, -0.06198997050523758, -0.02893623523414135, -0.10285694152116776, -0.04338344931602478, 0.3439633548259735, 0.13939659297466278, -0.06957757472991943, 0.21206419169902802, 0.1378963142633438, -0.10091788321733475, -0.31464821100234985, -0.09116147458553314, -0.06621747463941574, -0.05524633452296257, -0.014028315432369709, -0.17787262797355652, 0.09584298729896545, -0.042744219303131104, -0.024806752800941467, 0.03977423906326294, -0.2556311786174774, -0.082899309694767, 0.16976392269134521, 0.01596090942621231, 0.2834286689758301, -0.13093051314353943, -0.08922819793224335, -0.03937128558754921, -0.12698093056678772, 0.19979402422904968, -0.08418729156255722, 0.06153452768921852, -0.004666736349463463, 0.1776805818080902, 0.04617675021290779, -0.011990601196885109, 0.012206112034618855, 0.003995159175246954, -0.02480645291507244, -0.06926066428422928, -0.09844497591257095, 0.08762900531291962, -0.008778671734035015, 0.020381171256303787, -0.06595955789089203, 0.05049559473991394, -0.0949656069278717, -0.0006382031133398414, -0.12513716518878937, 0.07758493721485138, -0.007955621927976608, -0.09927115589380264, -0.019221661612391472, 
-0.04650452733039856, 0.06919635832309723, -0.020818738266825676, 0.15716107189655304, -0.03872097283601761, 0.11444004625082016, 0.10615856200456619, 0.09696070849895477, -0.0019096651813015342, 0.06675493717193604, -0.062093157321214676, -0.09167221933603287, 0.027807479724287987, -0.13657724857330322, 0.05464359000325203, 0.11414086073637009, -0.018889576196670532, 0.042606230825185776, 0.06232642009854317, -0.030024895444512367, -0.03841933608055115, 0.10966654866933823, -0.24939408898353577, 0.008710821159183979, -0.09975350648164749, -0.07740998268127441, 0.03193715959787369, 0.056537844240665436, 0.20132313668727875, 0.019023699685931206, -0.06170305982232094, -0.020352233201265335, 0.05099378526210785, -0.04595917835831642, 0.08228398114442825, 0.03132554888725281, 0.029461823403835297, -0.13575321435928345, 0.10601908713579178, 0.015725204721093178, -0.04993676394224167, 0.08034761995077133, 0.17141593992710114, -0.11595481634140015, -0.08235012739896774, 0.018206652253866196, 0.09943213313817978, -0.09247054904699326, -0.02033846080303192, -0.05460567772388458, -0.09290233254432678, 0.046055182814598083, 0.2177998423576355, 0.020063476637005806, 0.1118638664484024, -0.04517297074198723, -0.04891025274991989, -0.047661323100328445, 0.10537919402122498, 0.0014809079002588987, -0.04026184603571892, -0.10279271751642227, 0.05047539249062538, -0.04830300062894821, 0.17378509044647217, -0.06399358063936234, -0.0592978410422802, -0.10721044987440109, 0.023348625749349594, -0.06541787832975388, -0.029305675998330116, -0.0824657455086708, -0.03429308906197548, -0.021074894815683365, -0.031610239297151566, -0.03873627260327339, -0.03024277836084366, -0.0764424055814743, 0.02725931815803051, -0.016917074099183083, 0.07398903369903564, -0.11142293363809586, -0.05420057475566864, 0.07131079584360123, 0.00017522448615636677, 0.13711436092853546, 0.08519066870212555, -0.10575205832719803, 0.1608172357082367, -0.12382415682077408, -0.056644801050424576, 0.11797840893268585, 0.0074418759904801846, 0.050511684268713, 0.03570277988910675, -0.018321089446544647, 0.048318617045879364, 0.04591407999396324, 0.048932358622550964, 0.03741343691945076, -0.0999658852815628, -0.02282877452671528, -0.057729750871658325, -0.1148778423666954, -0.08515831083059311, -0.027044016867876053, 0.10125566273927689, -0.01629621535539627, 0.10339433699846268, -0.08726240694522858, 0.10801132768392563, -0.08006537705659866, 0.013444874435663223, 0.006591364275664091, -0.15751560032367706, -0.08619159460067749, -0.06383692473173141, 0.036279238760471344, -0.029388047754764557, 0.17760130763053894, -0.019755007699131966, -0.005182204302400351, 0.030871329829096794, -0.016395168378949165, -0.01168188638985157, 0.024826066568493843, 0.2648567259311676, 0.04261944815516472, -0.10843217372894287, -0.0639122724533081, 0.06574015319347382, -0.01914919540286064, 0.023890914395451546, 0.1350153535604477, -0.003828831482678652, 0.10972201079130173, 0.08876129984855652, 0.005746243055909872, 0.03159472718834877, -0.03779241442680359, -0.1040029525756836, 0.001482845051214099, 0.04779302701354027, -0.010397776961326599, 0.05456288903951645, 0.1640816479921341, -0.07143199443817139, 0.0039795925840735435, -0.01002996601164341, -0.07117529958486557, -0.17887158691883087, -0.16366270184516907, -0.10530594736337662, -0.060769401490688324, 0.019913695752620697, -0.161081925034523, 0.06848270446062088, -0.021497240290045738, 0.04664139449596405, -0.07486061006784439, 0.08085466921329498, 0.10443606972694397, -0.11811155080795288, 
0.10797092318534851, -0.02451854571700096, 0.03894788771867752, 0.011804025620222092, 0.030835168436169624, -0.050016775727272034, 0.06286969780921936, -0.010726607404649258, 0.05615726858377457, -0.07746545970439911, 0.018269198015332222, -0.18184438347816467, -0.12726320326328278, -0.030422940850257874, 0.03916741535067558, 0.0003359842812642455, 0.07872729003429413, 0.07855517417192459, -0.045163124799728394, 0.04764260724186897, 0.20738162100315094, -0.07398447394371033, -0.1045374646782875, -0.023633794859051704, 0.1414511501789093, 0.0523374080657959, 0.07870625704526901, 0.026478974148631096, -0.031166523694992065, -0.09295572340488434, 0.2916955053806305, 0.2669696509838104, -0.031840093433856964, 0.024861881509423256, -0.01131228357553482, 0.030655555427074432, 0.09602819383144379, 0.13856270909309387, 0.07178739458322525, 0.2547357678413391, -0.06891409307718277, 0.009557278826832771, -0.029259970411658287, -0.056100137531757355, -0.07058566808700562, 0.06937406957149506, 0.06757080554962158, -0.04952109605073929, -0.027773842215538025, 0.13015484809875488, -0.22022801637649536, 0.06527146697044373, -0.11274199187755585, -0.17195047438144684, -0.06769340485334396, -0.003549414686858654, 0.11553569883108139, 0.025867212563753128, 0.06929931044578552, 0.018015000969171524, -0.047917440533638, 0.05909794569015503, 0.03272463381290436, -0.20910649001598358, 0.044403325766325, 0.05030481889843941, -0.18334515392780304, 0.008507284335792065, -0.02192201465368271, 0.12356370687484741, 0.09216096252202988, 0.01715623028576374, -0.04188229516148567, 0.046647462993860245, 0.019858231768012047, 0.004706459119915962, 0.024247458204627037, 0.03574637696146965, 0.04167858511209488, -0.06116395443677902, 0.13154110312461853, -0.12377110868692398, 0.059437453746795654, -0.02207612246274948, -0.01617778278887272, -0.051059331744909286, 0.038034092634916306, -0.06564220786094666, 0.048679251223802567, 0.06022518500685692, -0.0346447192132473, 0.04353766143321991, -0.07530952990055084, -0.07014861702919006, -0.006826259661465883, -0.008663292042911053, -0.0499395914375782, -0.08878384530544281, -0.090488500893116, 0.10712049156427383, 0.025276314467191696, -0.2150489240884781, 0.02117997035384178, -0.11495615541934967, 0.024585455656051636, -0.17837341129779816, 0.13729168474674225, 0.06493893265724182, -0.004544015508145094, -0.0009455821709707379, -0.07390233874320984, 0.03382629528641701, 0.10528970509767532, -0.1486518681049347, -0.06735222786664963 ]
null
null
transformers
# Harry Potter DialoGPT model
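A minimal usage sketch for this conversational checkpoint, assuming it follows the usual DialoGPT-style chat loop and that the `Optimal/Harry` repository (the id given in this record) ships its own tokenizer files; the prompt text is illustrative only:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# model id taken from this record; tokenizer availability is an assumption
tokenizer = AutoTokenizer.from_pretrained('Optimal/Harry')
model = AutoModelForCausalLM.from_pretrained('Optimal/Harry')

# one chat turn: encode the user message, append the eos token, and generate a reply
user_input_ids = tokenizer.encode("Hello, who are you?" + tokenizer.eos_token, return_tensors='pt')
reply_ids = model.generate(user_input_ids, max_length=200, pad_token_id=tokenizer.eos_token_id)

# decode only the newly generated tokens (the bot's reply)
print(tokenizer.decode(reply_ids[:, user_input_ids.shape[-1]:][0], skip_special_tokens=True))
```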
{"tags": ["conversational"]}
text-generation
Optimal/Harry
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Harry Potter DialoGPT model
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.009697278961539268, 0.03208012506365776, -0.007204889785498381, 0.004809224978089333, 0.16726240515708923, 0.014898733235895634, 0.09765533357858658, 0.13672804832458496, -0.007841327227652073, -0.031050153076648712, 0.14490588009357452, 0.20411323010921478, -0.006439372431486845, 0.0661218985915184, -0.07572533935308456, -0.2683109939098358, 0.05759621039032936, 0.046649303287267685, 0.016515716910362244, 0.1200079694390297, 0.08573378622531891, -0.05473608896136284, 0.08714032918214798, -0.014583407901227474, -0.150366872549057, 0.017733458429574966, 0.043394338339567184, -0.12260226160287857, 0.11910516023635864, 0.05462685227394104, 0.07063519209623337, 0.014929565601050854, -0.07541623711585999, -0.1631229966878891, 0.03031250834465027, 0.01425902172923088, -0.0594632662832737, 0.04757995903491974, 0.059961482882499695, -0.10165371745824814, 0.10819483548402786, 0.09530027210712433, -0.013078106567263603, 0.06798283755779266, -0.16849711537361145, -0.020869607105851173, -0.01446688175201416, 0.009899779222905636, 0.05550243332982063, 0.09964893013238907, -0.03413357585668564, 0.10497362166643143, -0.09214533120393753, 0.11017382889986038, 0.10932035744190216, -0.32057443261146545, -0.005767723545432091, 0.09167823940515518, 0.039358653128147125, 0.07352814823389053, -0.04467793554067612, 0.06258884817361832, 0.018015462905168533, 0.017986174672842026, -0.014015024527907372, -0.07283061742782593, -0.11612214148044586, 0.04717336222529411, -0.08668071031570435, -0.059868961572647095, 0.2244078367948532, -0.05464440956711769, 0.06881742179393768, -0.05281897634267807, -0.10522868484258652, -0.04308144748210907, -0.029833965003490448, 0.00475557055324316, -0.07660607248544693, 0.08692064881324768, 0.00869679357856512, -0.09547875821590424, -0.1376667022705078, -0.02496783249080181, -0.1776352822780609, 0.16140350699424744, 0.02465328387916088, 0.05232657864689827, -0.2027255892753601, 0.09623090922832489, 0.017906051129102707, -0.08045592904090881, 0.022091427817940712, -0.10046248883008957, 0.029131146147847176, 0.013760408386588097, -0.04754498973488808, -0.061387211084365845, 0.0843690037727356, 0.11199145019054413, -0.01731434464454651, 0.025486016646027565, -0.039331406354904175, 0.08100687712430954, 0.03553595021367073, 0.09077847748994827, 0.007288969587534666, -0.028338588774204254, 0.025842782109975815, -0.13719046115875244, -0.003647835226729512, -0.07116208970546722, -0.16572439670562744, -0.021088803187012672, 0.02994808368384838, 0.08289173990488052, 0.015449047088623047, 0.11682453751564026, -0.03272046521306038, -0.025152435526251793, 0.03602350503206253, -0.047656361013650894, -0.012649794109165668, 0.016648368909955025, 0.013163427822291851, 0.12399329990148544, -0.0022096503525972366, 0.03235051408410072, -0.13653022050857544, 0.031423524022102356, -0.06793295592069626, -0.003740974934771657, -0.03486552834510803, -0.040637075901031494, 0.009043924510478973, -0.06862333416938782, 0.003486064961180091, -0.15030112862586975, -0.15063877403736115, 0.007587034720927477, -0.007836631499230862, -0.04107699543237686, -0.06370922178030014, -0.06952770054340363, -0.013550350442528725, 0.04251532256603241, -0.07093454152345657, -0.011352915316820145, -0.06403283774852753, 0.11004766076803207, -0.03197755664587021, 0.07921615242958069, -0.11953279376029968, 0.08390819281339645, -0.11260783672332764, -0.02386913076043129, -0.060801517218351364, 0.09317506104707718, -0.0006014376995153725, 0.09549830108880997, -0.006563255097717047, -0.017931854352355003, -0.07981178909540176, 
0.06445012241601944, -0.042872510850429535, 0.21701598167419434, -0.0615808479487896, -0.11181682348251343, 0.28781595826148987, -0.052628401666879654, -0.1370542049407959, 0.11647392809391022, 0.008682746440172195, 0.05777018144726753, 0.10703510791063309, 0.19733482599258423, -0.015276194550096989, 0.004040541127324104, 0.09471915662288666, 0.11263324320316315, -0.11276852339506149, -0.033160366117954254, 0.013019153848290443, -0.04081077128648758, -0.10867965966463089, 0.04689536616206169, 0.09810488671064377, 0.07090286910533905, -0.04786505550146103, -0.03377414867281914, -0.01366397924721241, 0.0052589005790650845, 0.08885077387094498, -0.007157256826758385, 0.10962837189435959, -0.05819983780384064, -0.03796621412038803, -0.029282379895448685, -0.012126247398555279, -0.03951939567923546, 0.03137664496898651, -0.043376367539167404, 0.10821941494941711, -0.011204327456653118, 0.06364280730485916, -0.16185984015464783, -0.07691477984189987, -0.017002692446112633, 0.1581239402294159, 0.024538565427064896, 0.09859629720449448, 0.0552486926317215, -0.040398042649030685, -0.0012767292791977525, 0.012792680412530899, 0.15581141412258148, -0.022091681137681007, -0.065607450902462, -0.052166227251291275, 0.08642971515655518, -0.05641226842999458, 0.04504093527793884, -0.05937713757157326, 0.012367865070700645, 0.05064384639263153, 0.10342344641685486, -0.00018274025933351368, 0.03323284164071083, -0.008164864964783192, 0.002145637758076191, -0.058205123990774155, 0.007405933458358049, 0.10799351334571838, 0.00036868182360194623, -0.07365862280130386, 0.22074243426322937, -0.17796069383621216, 0.1765957772731781, 0.1893044263124466, -0.299345999956131, 0.017949223518371582, -0.10759581625461578, -0.04561871662735939, 0.014407722279429436, 0.05567655712366104, -0.0454222597181797, 0.1703362911939621, -0.009871348738670349, 0.18874616920948029, -0.04946064203977585, -0.04464937001466751, -0.0200483538210392, -0.05118836089968681, -0.0024189651012420654, 0.07781197130680084, 0.10685696452856064, -0.13992026448249817, 0.1964332014322281, 0.1621224284172058, 0.048237916082143784, 0.19945049285888672, 0.015346456319093704, -0.011589210480451584, 0.0909530371427536, 0.005220826715230942, -0.058739423751831055, -0.07409929484128952, -0.2594851851463318, -0.030033592134714127, 0.07992640137672424, 0.0422382652759552, 0.1212305948138237, -0.11349532753229141, -0.038956157863140106, -0.01763172075152397, -0.023146281018853188, 0.021672505885362625, 0.0914369598031044, 0.06075398623943329, 0.13201528787612915, -0.001710098935291171, -0.007300339173525572, 0.10524573177099228, 0.01783694699406624, -0.09354141354560852, 0.18308524787425995, -0.13652534782886505, -0.37097251415252686, -0.13911493122577667, -0.18057456612586975, -0.05449081212282181, 0.05712554603815079, 0.11679314076900482, -0.12011238187551498, -0.018752124160528183, 0.01578843593597412, 0.10931742936372757, -0.08449502289295197, 0.0021454424131661654, -0.06880278885364532, 0.0321490578353405, -0.10310184955596924, -0.09194442629814148, -0.055416494607925415, -0.031392451375722885, -0.08001253753900528, 0.1423761546611786, -0.10777941346168518, 0.04476889222860336, 0.20262959599494934, 0.04653622955083847, 0.05625178664922714, -0.044105201959609985, 0.19377262890338898, -0.11264272034168243, -0.01661740615963936, 0.19215328991413116, -0.048360925167798996, 0.07476246356964111, 0.1232115849852562, -0.006348740309476852, -0.08765771239995956, 0.03011748194694519, -0.02085109055042267, -0.07988511025905609, -0.23219464719295502, 
-0.13938382267951965, -0.12429051846265793, 0.09477275609970093, 0.028005298227071762, 0.056365787982940674, 0.17219258844852448, 0.06577219814062119, -0.038416244089603424, 0.006410336587578058, 0.02959546446800232, 0.08237514644861221, 0.23417828977108002, -0.06035616248846054, 0.1364797055721283, -0.03420931473374367, -0.14982740581035614, 0.08169995993375778, 0.0713929831981659, 0.10213395953178406, 0.06678459793329239, 0.0804823637008667, 0.0149586396291852, 0.06188136339187622, 0.1311223804950714, 0.08191446959972382, 0.019586285576224327, -0.02480296604335308, -0.03388110175728798, -0.025523077696561813, -0.05937909707427025, 0.040128443390131, 0.06589099019765854, -0.16763372719287872, -0.039227183908224106, -0.09338314831256866, 0.09657008945941925, 0.0873042419552803, 0.06609832495450974, -0.1842060089111328, -0.008006223477423191, 0.08488986641168594, -0.03854905813932419, -0.13727426528930664, 0.09535189718008041, 0.01523482333868742, -0.15144726634025574, 0.03139317408204079, -0.04061909019947052, 0.12188644707202911, -0.07804752141237259, 0.09809603542089462, -0.08108244836330414, -0.07448557764291763, 0.02123199962079525, 0.1261177361011505, -0.30527687072753906, 0.20240111649036407, -0.0024993624538183212, -0.06486981362104416, -0.1243603527545929, -0.0032166161108762026, 0.002410882618278265, 0.07357452809810638, 0.10519039630889893, -0.007196315098553896, 0.001897757756523788, -0.06300821900367737, -0.01829923689365387, 0.032471053302288055, 0.13080233335494995, -0.0401318334043026, -0.021158374845981598, -0.050194524228572845, -0.001653497340157628, -0.03173094615340233, -0.06934895366430283, 0.02002747356891632, -0.19509181380271912, 0.08751901984214783, 0.04166261479258537, 0.09648149460554123, 0.029994789510965347, 0.004265148192644119, -0.09651939570903778, 0.24698667228221893, -0.07148019969463348, -0.10072879493236542, -0.10919588059186935, -0.046813901513814926, 0.03569883480668068, -0.05628936365246773, 0.04309194162487984, -0.0788632407784462, 0.028997479006648064, -0.06352769583463669, -0.19235502183437347, 0.12410202622413635, -0.09027006477117538, -0.04412810131907463, -0.02371402643620968, 0.2110891044139862, -0.05598580464720726, 0.010335659608244896, 0.02930437959730625, 0.01208863127976656, -0.11645778268575668, -0.09678568691015244, 0.031018631532788277, -0.007351789623498917, 0.050603240728378296, 0.041841957718133926, -0.05915454775094986, -0.017138581722974777, -0.052199993282556534, -0.022926922887563705, 0.3496883809566498, 0.14231905341148376, -0.043836336582899094, 0.19347235560417175, 0.12347975373268127, -0.07452994585037231, -0.3159443140029907, -0.1066238060593605, -0.10937739163637161, -0.04680149629712105, -0.07012093812227249, -0.2002030611038208, 0.06474938243627548, 0.00662544509395957, -0.013415241613984108, 0.12749312818050385, -0.2561831772327423, -0.07571036368608475, 0.15906259417533875, -0.017980827018618584, 0.3745945692062378, -0.1168576180934906, -0.10926306992769241, -0.03950892388820648, -0.14175476133823395, 0.16968177258968353, -0.01989765651524067, 0.11221715062856674, -0.009765521623194218, 0.14388824999332428, 0.05548359826207161, -0.023479344323277473, 0.08544106781482697, 0.004999885335564613, -0.03290518373250961, -0.10304180532693863, -0.05676887184381485, 0.007092386484146118, 0.02477436140179634, 0.018026655539870262, -0.041834570467472076, 0.02227151393890381, -0.11731979995965958, -0.04657655209302902, -0.08982590585947037, 0.04431166127324104, 0.03899754583835602, -0.07325074821710587, -0.002380647463724017, 
-0.07165111601352692, -0.012272949330508709, 0.022334342822432518, 0.20356793701648712, -0.08029330521821976, 0.16448934376239777, 0.09239562600851059, 0.12419285625219345, -0.14376309514045715, -0.00019283240544609725, -0.0762530043721199, -0.05611240118741989, 0.07737895101308823, -0.09433035552501678, 0.058893077075481415, 0.10901971161365509, -0.04567738622426987, 0.08828683942556381, 0.10377411544322968, 0.008936077356338501, 0.003213887568563223, 0.10916902124881744, -0.2667325437068939, -0.0296600554138422, -0.07532413303852081, 0.000883326749317348, 0.09092561900615692, 0.08562852442264557, 0.18840822577476501, 0.025361526757478714, -0.04293036088347435, -0.002770674182102084, 0.028597986325621605, -0.039021048694849014, 0.051667019724845886, 0.001123449532315135, 0.01947369985282421, -0.1530752182006836, 0.072522833943367, 0.01490565575659275, -0.15215420722961426, 0.021316176280379295, 0.16572684049606323, -0.11656328290700912, -0.1283872276544571, -0.06520111113786697, 0.08313824236392975, -0.11755692958831787, -0.01578943058848381, -0.03279297426342964, -0.13145680725574493, 0.07992171496152878, 0.12629036605358124, 0.05557859688997269, 0.0972496047616005, -0.06061713397502899, -0.020469192415475845, -0.018721895292401314, -0.014099318534135818, -0.012384648434817791, -0.007667020428925753, -0.055978111922740936, 0.0590752474963665, -0.026677248999476433, 0.1425808072090149, -0.09221141785383224, -0.1037059873342514, -0.16142144799232483, 0.0374140702188015, -0.11013076454401016, -0.08825794607400894, -0.08821134269237518, -0.050188567489385605, 0.002360827289521694, -0.019856395199894905, -0.04037635400891304, -0.05829505994915962, -0.12300454825162888, 0.0338277705013752, -0.040771447122097015, 0.024727050215005875, -0.07512269169092178, 0.015856385231018066, 0.08507686108350754, -0.03285100311040878, 0.15655414760112762, 0.1450488418340683, -0.1006515845656395, 0.10741901397705078, -0.14806775748729706, -0.09138492494821548, 0.11116421222686768, 0.015329592861235142, 0.0449691042304039, 0.09723787009716034, 0.013362943194806576, 0.0635865181684494, 0.032776717096567154, 0.05308786407113075, 0.027619892731308937, -0.11959987878799438, 0.06483134627342224, -0.03626115620136261, -0.14700546860694885, -0.049338050186634064, -0.05282869189977646, 0.01647452637553215, 0.013054544106125832, 0.09622690081596375, -0.05301849544048309, 0.10698331147432327, -0.04055701196193695, 0.0346808135509491, 0.017554637044668198, -0.1730053424835205, -0.03816922754049301, -0.08538098633289337, 0.03681723028421402, 0.014741539023816586, 0.25266793370246887, 0.030072299763560295, 0.012416383251547813, 0.032671261578798294, 0.08285367488861084, 0.03899408504366875, 0.010228337720036507, 0.17482228577136993, 0.1162426546216011, -0.06621865928173065, -0.10445023328065872, 0.0729617029428482, 0.016332454979419708, 0.01286179106682539, 0.13617953658103943, 0.008365051820874214, 0.005795429926365614, 0.08649782836437225, -0.016865963116288185, 0.009968153201043606, -0.10052056610584259, -0.13426925241947174, -0.022176474332809448, 0.05151832848787308, -0.04655967652797699, 0.11727844923734665, 0.1406494379043579, -0.01806013658642769, 0.03222079202532768, -0.021771740168333054, -0.05699979141354561, -0.1683429479598999, -0.1429590880870819, -0.06883849948644638, -0.13416796922683716, 0.00897989235818386, -0.11180389672517776, 0.05395037308335304, 0.06001098081469536, 0.06750501692295074, -0.06899319589138031, 0.10220931470394135, 0.04626858979463577, -0.11440542340278625, 0.06264589726924896, 
-0.0296088308095932, 0.09430401772260666, -0.02759445086121559, -0.019505485892295837, -0.09039592742919922, 0.014574515633285046, 0.011419114656746387, 0.06245238706469536, -0.04707273095846176, 0.007463190704584122, -0.14696238934993744, -0.08972041308879852, -0.0523175448179245, 0.0718572810292244, -0.050409089773893356, 0.14282815158367157, 0.00775480642914772, -0.0170906875282526, 0.039554283022880554, 0.22787313163280487, -0.07476283609867096, -0.04778539761900902, -0.05269690603017807, 0.20717895030975342, 0.02975541539490223, 0.1171872541308403, -0.022938819602131844, -0.006106364540755749, -0.0919521227478981, 0.3764844834804535, 0.30030161142349243, -0.09031439572572708, 0.011794124729931355, 0.02137952297925949, 0.04502861574292183, 0.1316293478012085, 0.1216534823179245, 0.10318691283464432, 0.3006802201271057, -0.07452366501092911, -0.04653361067175865, -0.012629742734134197, -0.023858042433857918, -0.09059546142816544, 0.1021224707365036, 0.04839762672781944, -0.06382183730602264, -0.03313443064689636, 0.0954432487487793, -0.25862133502960205, 0.1277991235256195, -0.12311873584985733, -0.17578600347042084, -0.06654827296733856, 0.009760108776390553, 0.10465722531080246, 0.015642458572983742, 0.0946015790104866, 0.007128213066607714, -0.11252258718013763, 0.06305865943431854, 0.03397420793771744, -0.22762253880500793, 0.0006893770187161863, 0.06642123311758041, -0.07006710022687912, -0.0024247700348496437, -0.026499588042497635, 0.05657242611050606, 0.0656052976846695, 0.054629553109407425, -0.00971333310008049, 0.03816632181406021, 0.0034184439573436975, -0.0585215799510479, 0.016623929142951965, 0.05121519789099693, 0.02472509816288948, -0.09763528406620026, 0.06927435845136642, -0.1574270874261856, 0.04766253009438515, -0.0030655991286039352, -0.04124255105853081, 0.006064958870410919, 0.008823691867291927, -0.06491616368293762, 0.05165379121899605, 0.07916834205389023, -0.0016257909592241049, -0.0062433634884655476, -0.057178743183612823, -0.02632102556526661, -0.027755750343203545, -0.09291748702526093, -0.10495562851428986, -0.14682936668395996, -0.11640441417694092, 0.09368976950645447, -0.01011267676949501, -0.1848134547472, 0.022154374048113823, -0.08606051653623581, 0.08319322764873505, -0.1670055389404297, 0.08040720224380493, 0.07041648775339127, 0.013038921169936657, -0.0031511052511632442, -0.02002427540719509, 0.054132770746946335, 0.086809903383255, -0.10407156497240067, -0.07400695979595184 ]
null
null
transformers
# Finetuned DialoGPT model for Eng-Spa translation

The DialoGPT-small model was finetuned on English-to-Spanish translation pairs, extracted from
http://storage.googleapis.com/download.tensorflow.org/data/spa-eng.zip

Some example translations:

| Role | Response |
| :---: |------------------------|
| User | please, sing me a song |
| Bot | Por favor, canta una canción. |
| User | I really want to go to China |
| Bot | Realmente quiero ir a China. |
| User | Can you do me a favor? |
| Bot | ¿Me puedes hacer un favor? |
| User | I don't know what you are talking about |
| Bot | No sé de qué estás hablando. |
| User | I don't want to go to China |
| Bot | No quiero ir a China. |

# Using the model

Example code for trying out the model:

```python
from transformers import AutoModelWithLMHead, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('microsoft/DialoGPT-small')
model = AutoModelWithLMHead.from_pretrained('OscarNav/dialoGPT_translate')

# Let's translate 5 sentences
for step in range(5):
    # encode the new user input, add the eos_token and return a PyTorch tensor
    new_user_input_ids = tokenizer.encode(input(">> User:") + tokenizer.eos_token, return_tensors='pt')

    # generate a response while limiting the total output to 1000 tokens
    chat_history_ids = model.generate(
        new_user_input_ids, max_length=1000,
        pad_token_id=tokenizer.eos_token_id,
        top_p=0.92, top_k=50
    )

    # pretty print the last output tokens from the bot
    print("DialoGPT: {}".format(tokenizer.decode(chat_history_ids[:, new_user_input_ids.shape[-1]:][0], skip_special_tokens=True)))
```
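Note that recent versions of transformers deprecate `AutoModelWithLMHead` in favour of `AutoModelForCausalLM`, and `top_p`/`top_k` only take effect when sampling is enabled. A minimal sketch of an equivalent single translation under those assumptions (same checkpoint, sampling switched on explicitly; the example sentence is taken from the table above):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('microsoft/DialoGPT-small')
model = AutoModelForCausalLM.from_pretrained('OscarNav/dialoGPT_translate')

# encode one English sentence and let the model produce its Spanish translation
input_ids = tokenizer.encode("I really want to go to China" + tokenizer.eos_token, return_tensors='pt')
output_ids = model.generate(
    input_ids,
    max_length=1000,
    pad_token_id=tokenizer.eos_token_id,
    do_sample=True,   # required for top_p / top_k to have an effect
    top_p=0.92, top_k=50
)

# decode only the newly generated tokens
print(tokenizer.decode(output_ids[:, input_ids.shape[-1]:][0], skip_special_tokens=True))
```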
{}
text-generation
OscarNav/dialoGPT_translate
[ "transformers", "pytorch", "gpt2", "text-generation", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Finetuned DialoGPT model for Eng-Spa translation
================================================

DialoGPT-small model was used and finetuned on English to Spanish translations, extracted from URL

some examples of translations

Using the model
===============

example code for trying out the model
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 47 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.027653997763991356, 0.02414041943848133, -0.0068230400793254375, 0.010564634576439857, 0.18164798617362976, 0.033704131841659546, 0.08821956068277359, 0.13570955395698547, -0.0068973456509411335, -0.013526750728487968, 0.1547490805387497, 0.20799952745437622, -0.0026462990790605545, 0.0791444480419159, -0.0664469450712204, -0.2753458023071289, 0.05913490429520607, 0.0680282786488533, -0.007687992881983519, 0.12075648456811905, 0.07187031954526901, -0.0549883171916008, 0.0886516347527504, -0.02030559629201889, -0.17324471473693848, 0.01953965798020363, 0.04816993698477745, -0.12518654763698578, 0.1176358312368393, 0.05111858248710632, 0.09795232862234116, 0.008365745656192303, -0.06405694782733917, -0.13635118305683136, 0.022147029638290405, 0.03033585101366043, -0.058860234916210175, 0.0636059120297432, 0.1087222546339035, -0.09939044713973999, 0.09311723709106445, 0.08541663736104965, -0.0255570225417614, 0.05364618077874184, -0.15825888514518738, -0.06378549337387085, -0.02499648556113243, 0.007804732769727707, 0.06256697326898575, 0.10073644667863846, -0.017566369846463203, 0.10258800536394119, -0.0975269079208374, 0.10333853214979172, 0.1500675231218338, -0.3112771809101105, 0.009987793862819672, 0.09499151259660721, 0.04119991883635521, 0.03931105509400368, -0.02533094584941864, 0.05045793950557709, 0.025268254801630974, 0.027277586981654167, 0.007437177933752537, -0.0750175341963768, -0.1137726753950119, 0.049895867705345154, -0.09199702739715576, -0.07458660751581192, 0.22324641048908234, -0.07399588078260422, 0.060080595314502716, -0.025852523744106293, -0.11121725291013718, -0.05274823680520058, -0.013890148140490055, 0.018784796819090843, -0.06587869673967361, 0.08765926212072372, 0.024050135165452957, -0.06755640357732773, -0.1323474794626236, -0.04128742218017578, -0.18628640472888947, 0.17943057417869568, 0.015332846902310848, 0.05883103236556053, -0.1924149990081787, 0.11635245382785797, -0.004000017885118723, -0.08559784293174744, 0.024640021845698357, -0.09488005936145782, 0.03717249631881714, -0.005796557758003473, -0.06343648582696915, -0.07624655961990356, 0.078512042760849, 0.13449318706989288, -0.0038929670117795467, 0.031459223479032516, -0.03913462534546852, 0.08946967869997025, 0.023094916716217995, 0.11019261926412582, -0.01329297386109829, -0.00601809611544013, 0.043852973729372025, -0.14449132978916168, -0.008341594599187374, -0.06913956254720688, -0.1527271568775177, -0.05108632892370224, 0.05306483805179596, 0.08953460305929184, 0.008545879274606705, 0.09067165106534958, -0.04840036481618881, -0.026439275592565536, 0.06191498041152954, -0.07166212797164917, -0.0057375445030629635, 0.0005479406099766493, 0.020326290279626846, 0.12346802651882172, -0.006863993126899004, 0.01816580630838871, -0.1344953328371048, 0.07597071677446365, -0.0810447409749031, 0.0016609809827059507, -0.037295255810022354, -0.051307324320077896, 0.016753138974308968, -0.09774310886859894, 0.014272624626755714, -0.15190516412258148, -0.18175770342350006, 0.015764877200126648, 0.0044948384165763855, -0.03198384866118431, -0.035312067717313766, -0.03263629972934723, -0.023609675467014313, 0.04306609928607941, -0.06790579855442047, 0.009302832186222076, -0.05678845942020416, 0.10395034402608871, -0.032171644270420074, 0.06649759411811829, -0.10738259553909302, 0.0829162523150444, -0.12368609756231308, -0.004673504736274481, -0.09571383893489838, 0.07571588456630707, -0.0049130916595458984, 0.11728651076555252, -0.028541911393404007, -0.03454771637916565, -0.07556727528572083, 
0.04999465495347977, -0.02550712786614895, 0.18951213359832764, -0.060080599039793015, -0.12557648122310638, 0.2583121061325073, -0.07503679394721985, -0.1294521689414978, 0.09354755282402039, 0.013357079587876797, 0.03000263124704361, 0.08708256483078003, 0.17770351469516754, 0.03385210409760475, 0.011724604293704033, 0.08526027947664261, 0.1101398766040802, -0.11245359480381012, -0.0934135690331459, 0.01582467369735241, -0.04410967230796814, -0.14348545670509338, 0.0551721565425396, 0.06396481394767761, 0.08126390725374222, -0.04889657348394394, -0.02648499235510826, -0.04211905598640442, 0.005280596204102039, 0.08378548920154572, 0.011136471293866634, 0.12981148064136505, -0.04937934875488281, -0.03142275661230087, -0.018193937838077545, -0.012411710806190968, -0.03191297501325607, 0.03591127321124077, -0.019667068496346474, 0.13700194656848907, -0.048340748995542526, 0.053371917456388474, -0.18971459567546844, -0.07922437787055969, 0.0010099048959091306, 0.123023621737957, -0.014106693677604198, 0.08013445883989334, 0.05753817409276962, -0.018720267340540886, -0.004700321704149246, -0.01032867468893528, 0.1544346958398819, -0.021616755053400993, -0.06661882251501083, -0.04162381589412689, 0.0662311464548111, -0.05831345543265343, -0.0033040468115359545, -0.05776660889387131, 0.013589667156338692, 0.05048443749547005, 0.10443682968616486, -0.0023575187660753727, 0.03253777325153351, -0.02123248018324375, 0.018250472843647003, -0.07885172218084335, -0.0028943256475031376, 0.09839999675750732, -0.003195167751982808, -0.06114937365055084, 0.191707044839859, -0.16508106887340546, 0.2123199850320816, 0.18989497423171997, -0.2840019166469574, 0.008855658583343029, -0.07930868119001389, -0.03107025846838951, 0.019292673096060753, 0.04051336646080017, -0.035391807556152344, 0.12321244925260544, 0.0030509934294968843, 0.1893225461244583, -0.05120055004954338, -0.054668959230184555, -0.0003608512051869184, -0.05736381933093071, 0.0013126746052876115, 0.06707432866096497, 0.11558198183774948, -0.12564630806446075, 0.1973772495985031, 0.17830142378807068, 0.02446782775223255, 0.16028088331222534, 0.003589105326682329, -0.02908729389309883, 0.07800903916358948, 0.001039333757944405, -0.03403163328766823, -0.08341804146766663, -0.19453173875808716, -0.01920945756137371, 0.08615871518850327, 0.05208343267440796, 0.11178864538669586, -0.1340440809726715, -0.039688125252723694, -0.016580121591687202, -0.013963420875370502, 0.004052120726555586, 0.08927994221448898, 0.05621529743075371, 0.11766386777162552, -0.008479462936520576, 0.004914911463856697, 0.11690844595432281, 0.024292193353176117, -0.0974007099866867, 0.20369629561901093, -0.12859489023685455, -0.35919657349586487, -0.17192909121513367, -0.16941924393177032, -0.046767693012952805, 0.06603047996759415, 0.10566895455121994, -0.11921820044517517, -0.03283723443746567, 0.01984371617436409, 0.10511579364538193, -0.0874844342470169, 0.025252653285861015, -0.07854585349559784, 0.039858005940914154, -0.08228866755962372, -0.07852846384048462, -0.058627899736166, -0.02397638000547886, -0.06844961643218994, 0.15293799340724945, -0.10580270737409592, 0.04606963321566582, 0.19703397154808044, 0.035209350287914276, 0.05708123743534088, -0.03352535888552666, 0.19375872611999512, -0.09711813181638718, -0.014181635342538357, 0.20692157745361328, -0.04432303458452225, 0.08276087045669556, 0.10658510029315948, -0.0009211950236931443, -0.0905555859208107, 0.023672347888350487, -0.03327333554625511, -0.09995128959417343, -0.2413795441389084, 
-0.12423769384622574, -0.12672755122184753, 0.07157120853662491, 0.06113129481673241, 0.06719478219747543, 0.1604551076889038, 0.09354656934738159, -0.019843624904751778, 0.04505275562405586, -0.0036725422833114862, 0.07906411588191986, 0.20365294814109802, -0.0204415675252676, 0.13615357875823975, -0.050657231360673904, -0.13334059715270996, 0.09257177263498306, 0.06900633871555328, 0.15225820243358612, 0.054498545825481415, 0.05270633473992348, 0.006767008453607559, 0.06716175377368927, 0.1454283893108368, 0.13071000576019287, 0.014545821584761143, -0.016409022733569145, -0.021825823932886124, -0.011036834679543972, -0.05876464396715164, 0.04085689038038254, 0.02777833305299282, -0.1610528975725174, -0.05520197004079819, -0.12001585215330124, 0.08774644136428833, 0.09219257533550262, 0.06569026410579681, -0.2342914491891861, 0.007060535252094269, 0.08197256177663803, -0.028898365795612335, -0.1258426308631897, 0.08190665394067764, -0.021697908639907837, -0.14926569163799286, 0.0494246669113636, -0.061497997492551804, 0.12161173671483994, -0.07084709405899048, 0.08109014481306076, -0.03937468305230141, -0.062106676399707794, 0.020281726494431496, 0.1271398812532425, -0.29730626940727234, 0.20356124639511108, -0.001819691271521151, -0.05869410187005997, -0.11437822878360748, 0.01959572173655033, 0.01367559190839529, 0.11016108095645905, 0.10386832803487778, 0.005328167695552111, -0.0475030355155468, -0.12364684045314789, -0.022924374788999557, 0.024910306558012962, 0.12441114336252213, -0.05739542469382286, -0.008891535922884941, -0.044362228363752365, -0.0058176638558506966, -0.028876133263111115, -0.053936153650283813, 0.025268638506531715, -0.16888569295406342, 0.08389513194561005, 0.017658868804574013, 0.09978678822517395, 0.01261826977133751, -0.013697084039449692, -0.09944134950637817, 0.23519866168498993, -0.07718266546726227, -0.11035529524087906, -0.1205357164144516, -0.04611735790967941, 0.0686027929186821, -0.0741099938750267, 0.0634869635105133, -0.08208895474672318, 0.024847982451319695, -0.047674816101789474, -0.21411024034023285, 0.1248590424656868, -0.09078147262334824, -0.047217957675457, -0.038028888404369354, 0.1873915195465088, -0.07860055565834045, 0.003835690440610051, 0.01727161929011345, 0.03052649088203907, -0.11501652747392654, -0.10535892844200134, 0.02131424844264984, -0.005508285015821457, 0.06073078140616417, 0.04357268661260605, -0.06716573983430862, 0.01641303487122059, -0.022389056161046028, -0.006917606573551893, 0.32454678416252136, 0.14079391956329346, -0.04770330339670181, 0.17363035678863525, 0.11376409232616425, -0.08209476619958878, -0.31482723355293274, -0.08535979688167572, -0.09984239190816879, -0.03735451400279999, -0.06232178583741188, -0.21656104922294617, 0.09480288624763489, 0.04200942441821098, -0.015409117564558983, 0.1568077802658081, -0.24411429464817047, -0.0795927420258522, 0.15950311720371246, -0.007333407178521156, 0.3560895025730133, -0.12491796165704727, -0.11301901936531067, -0.05532994866371155, -0.1397564709186554, 0.15002089738845825, -0.009417316876351833, 0.11106741428375244, -0.03287123143672943, 0.10856477171182632, 0.048215944319963455, -0.05544896051287651, 0.09160676598548889, 0.026295991614460945, -0.003711326979100704, -0.10597866773605347, -0.01747799478471279, 0.043585844337940216, 0.006319248117506504, 0.031217962503433228, -0.03127649053931236, 0.033463045954704285, -0.12691029906272888, -0.04727448150515556, -0.08006873726844788, 0.05846472829580307, 0.052333541214466095, -0.0737200528383255, 
-0.0010956452460959554, -0.06611854583024979, -0.016030769795179367, 0.003143493551760912, 0.19045160710811615, -0.03460016846656799, 0.14779594540596008, 0.0818052664399147, 0.09073434770107269, -0.1361592561006546, -0.0061243316158652306, -0.06888517737388611, -0.057741593569517136, 0.08706554025411606, -0.10988334566354752, 0.06429524719715118, 0.11854783445596695, -0.04650293290615082, 0.07134203612804413, 0.11840200424194336, 0.015247469767928123, -0.0033181030303239822, 0.13015136122703552, -0.2568117082118988, 0.019211336970329285, -0.0754370167851448, -0.03775216266512871, 0.08088402450084686, 0.07995659112930298, 0.16486960649490356, 0.036187540739774704, -0.042049095034599304, -0.003924929536879063, 0.009187355637550354, -0.039663419127464294, 0.08243577927350998, 0.012240500189363956, 0.023174172267317772, -0.15248477458953857, 0.071900375187397, 0.015580810606479645, -0.12336304783821106, 0.011253113858401775, 0.1477922946214676, -0.13801799714565277, -0.11707340180873871, -0.03374985232949257, 0.08742405474185944, -0.14541642367839813, -0.0241269338876009, -0.04783749580383301, -0.12825986742973328, 0.09339214116334915, 0.11613135039806366, 0.07497538626194, 0.10595441609621048, -0.0529337078332901, -0.02668607421219349, -0.03682107478380203, -0.022537073120474815, -0.0017330512637272477, 0.032638516277074814, -0.08304216712713242, 0.0579586885869503, -0.020800847560167313, 0.14298540353775024, -0.08964299410581589, -0.07169508188962936, -0.1581236720085144, 0.03564200550317764, -0.12593989074230194, -0.07035141438245773, -0.08840593695640564, -0.05227470397949219, -0.007837125100195408, -0.01494099572300911, -0.0388214997947216, -0.04472146928310394, -0.12364204227924347, 0.01879296824336052, -0.05806630104780197, 0.02100815810263157, -0.07383234053850174, 0.00039667764212936163, 0.08932872861623764, -0.0410015694797039, 0.13851116597652435, 0.13557660579681396, -0.08107975125312805, 0.11907198280096054, -0.13537484407424927, -0.0908876284956932, 0.1157127171754837, 0.013428857550024986, 0.03907458856701851, 0.06849293410778046, 0.037317484617233276, 0.06514574587345123, 0.016511039808392525, 0.05237346887588501, 0.006972990930080414, -0.1299850195646286, 0.03433857858181, -0.042786743491888046, -0.1481933295726776, -0.05744143947958946, -0.05092177540063858, 0.039562974125146866, 0.02438235841691494, 0.10801149904727936, -0.03665049374103546, 0.11085481196641922, -0.058541763573884964, 0.01499281544238329, 0.004919432103633881, -0.18287403881549835, -0.044654008001089096, -0.07792776077985764, 0.02775009535253048, 0.022204352542757988, 0.2720205783843994, 0.0410233810544014, 0.020275471732020378, 0.017097288742661476, 0.11327627301216125, 0.057128578424453735, 0.015525308437645435, 0.214890718460083, 0.11996994912624359, -0.06049320101737976, -0.10806480050086975, 0.0858595222234726, 0.02164783701300621, 0.007426374591886997, 0.14070266485214233, 0.008503482677042484, -0.015597577206790447, 0.0887407436966896, -0.03357330709695816, 0.0031263602431863546, -0.11658911406993866, -0.13779941201210022, -0.028487415984272957, 0.0629650130867958, -0.0040870243683457375, 0.0956285297870636, 0.13609373569488525, -0.026881180703639984, 0.03953414782881737, -0.007877747528254986, -0.054916199296712875, -0.1785028725862503, -0.15742821991443634, -0.0790708139538765, -0.13561099767684937, 0.014744875021278858, -0.10368648171424866, 0.04369770362973213, 0.09560346603393555, 0.055915698409080505, -0.05440305173397064, 0.10839936882257462, 0.060064028948545456, -0.1045473963022232, 
0.056569941341876984, -0.032912541180849075, 0.06427399069070816, -0.001812951872125268, -0.02503552846610546, -0.09098561853170395, 0.0020124134607613087, 0.0017788249533623457, 0.0514003150165081, -0.05152478814125061, 0.024474015459418297, -0.15132632851600647, -0.09570280462503433, -0.04949872940778732, 0.07316448539495468, -0.06007300689816475, 0.1162300780415535, -0.001420395914465189, -0.017011309042572975, 0.03990921378135681, 0.2064858227968216, -0.07188161462545395, -0.04990030825138092, -0.047407180070877075, 0.22449158132076263, 0.04847963526844978, 0.10619479417800903, -0.013415440917015076, -0.00436578830704093, -0.07670432329177856, 0.36612021923065186, 0.2802904546260834, -0.06149837002158165, 0.012722660787403584, 0.03524370491504669, 0.030115660279989243, 0.13885097205638885, 0.1454230099916458, 0.09396251291036606, 0.27579233050346375, -0.08266803622245789, -0.052018675953149796, -0.015770163387060165, -0.020211221650242805, -0.09714096784591675, 0.11003416776657104, 0.04697350785136223, -0.06982195377349854, -0.044631510972976685, 0.09750646352767944, -0.24107815325260162, 0.1615772694349289, -0.07760030031204224, -0.15214353799819946, -0.06177033111453056, 0.012448563240468502, 0.10150322318077087, 0.00011545186134753749, 0.08784360438585281, -0.009687529876828194, -0.10291683673858643, 0.05749227851629257, 0.02730483002960682, -0.23568211495876312, -0.007146455347537994, 0.053680915385484695, -0.04540037736296654, 0.013332240283489227, -0.01917567476630211, 0.04910791665315628, 0.06717875599861145, 0.055140718817710876, -0.0426395982503891, 0.03817736729979515, -0.010196289978921413, -0.05020907521247864, 0.029649224132299423, 0.044778332114219666, 0.017814766615629196, -0.13065220415592194, 0.05277646332979202, -0.13968263566493988, 0.041911475360393524, -0.029653942212462425, -0.027413733303546906, -0.004670299123972654, -0.019546283408999443, -0.06313455104827881, 0.057941507548093796, 0.08424945920705795, 0.001472705160267651, -0.007915833964943886, -0.08050897717475891, -0.011023934930562973, -0.012819311581552029, -0.08308050036430359, -0.10086389631032944, -0.1384236365556717, -0.10634621232748032, 0.12701933085918427, -0.017066750675439835, -0.19125573337078094, 0.01284839678555727, -0.09708964824676514, 0.060041818767786026, -0.1797112077474594, 0.0843181237578392, 0.06071038171648979, 0.01623542606830597, -0.004114143084734678, -0.029135411605238914, 0.039420004934072495, 0.08210206776857376, -0.10779064148664474, -0.09044761955738068 ]
null
null
transformers
### Introduction:
This is a text-classification model: it determines the emotion behind a sentence.

### Label Explanation:
LABEL_0: Positive (the sentence carries a positive emotion)

LABEL_1: Negative (the sentence carries a negative emotion)

### Usage:
```python
>>> from transformers import pipeline
>>> ec = pipeline('text-classification', model='Osiris/emotion_classifier')
>>> ec("Hello, I'm a good model.")
```

### Accuracy:
We reach 83.82% accuracy on the validation dataset and 84.42% on the test dataset.
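The pipeline returns the raw label names, so it can be convenient to map them back to the meanings listed above. A small sketch, assuming the standard `text-classification` pipeline output format of `[{'label': ..., 'score': ...}]`:

```python
from transformers import pipeline

ec = pipeline('text-classification', model='Osiris/emotion_classifier')

# map the raw label ids to the meanings documented above
label_names = {'LABEL_0': 'Positive', 'LABEL_1': 'Negative'}

result = ec("Hello, I'm a good model.")[0]
print(label_names[result['label']], round(result['score'], 4))
```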
{}
text-classification
Osiris/emotion_classifier
[ "transformers", "pytorch", "roberta", "text-classification", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #roberta #text-classification #autotrain_compatible #endpoints_compatible #region-us
### Introduction:
This model belongs to text-classification. You can determine the emotion behind a sentence.

### Label Explanation:
LABEL_0: Positive (have positive emotion)

LABEL_1: Negative (have negative emotion)

### Usage:

### Accuracy:
We reach 83.82% for validation dataset, and 84.42% for test dataset.
[ "### Introduction:\nThis model belongs to text-classification. You can determine the emotion behind a sentence.", "### Label Explaination:\nLABEL_0: Positive (have positive emotion)\n\nLABEL_1: Negative (have negative emotion)", "### Usage:", "### Accuracy:\nWe reach 83.82% for validation dataset, and 84.42% for test dataset." ]
[ "TAGS\n#transformers #pytorch #roberta #text-classification #autotrain_compatible #endpoints_compatible #region-us \n", "### Introduction:\nThis model belongs to text-classification. You can determine the emotion behind a sentence.", "### Label Explaination:\nLABEL_0: Positive (have positive emotion)\n\nLABEL_1: Negative (have negative emotion)", "### Usage:", "### Accuracy:\nWe reach 83.82% for validation dataset, and 84.42% for test dataset." ]
[ 37, 25, 30, 5, 26 ]
[ "passage: TAGS\n#transformers #pytorch #roberta #text-classification #autotrain_compatible #endpoints_compatible #region-us \n### Introduction:\nThis model belongs to text-classification. You can determine the emotion behind a sentence.### Label Explaination:\nLABEL_0: Positive (have positive emotion)\n\nLABEL_1: Negative (have negative emotion)### Usage:### Accuracy:\nWe reach 83.82% for validation dataset, and 84.42% for test dataset." ]
[ -0.07753189653158188, -0.012537935748696327, -0.0027592312544584274, 0.02366776578128338, 0.11673633754253387, 0.033208463340997696, -0.022188007831573486, 0.09010177850723267, 0.019750282168388367, 0.054005399346351624, 0.1143728569149971, 0.20197159051895142, -0.0014222735771909356, 0.0634693130850792, -0.14991696178913116, -0.2272961288690567, -0.009573235176503658, 0.09487899392843246, 0.21402834355831146, 0.17112979292869568, 0.090422622859478, -0.07076625525951385, 0.08323071897029877, -0.0011194408871233463, -0.1112617775797844, 0.015558685176074505, 0.06367205828428268, -0.04731361195445061, 0.11797157675027847, 0.018222708255052567, 0.03983763977885246, 0.033510033041238785, -0.012376457452774048, -0.2177865207195282, 0.021276317536830902, -0.05018427595496178, 0.033143214881420135, 0.02299637533724308, -0.045037753880023956, -0.15749670565128326, 0.20787595212459564, -0.010197021067142487, 0.04777894541621208, 0.04573741927742958, -0.07652534544467926, -0.14581353962421417, -0.003778852056711912, 0.10472959280014038, 0.09887024760246277, 0.10955099016427994, -0.08101029694080353, 0.21679121255874634, -0.16819554567337036, 0.07834440469741821, 0.1739186942577362, -0.15868814289569855, -0.04308094456791878, 0.004343646112829447, 0.013304105959832668, -0.05584566667675972, -0.12550745904445648, 0.06132431700825691, 0.11147385090589523, 0.05595242977142334, -0.04589682072401047, -0.06773458421230316, 0.006808755919337273, 0.02385384775698185, -0.0585898794233799, -0.049151238054037094, 0.25117382407188416, 0.01676960662007332, -0.043723758310079575, -0.11028171330690384, -0.01324848085641861, -0.027178898453712463, 0.0053762090392410755, -0.051682669669389725, 0.009629199281334877, 0.010486534796655178, -0.05391276255249977, 0.11661522090435028, -0.13673613965511322, -0.0018849314656108618, -0.15112777054309845, 0.20393049716949463, -0.048568855971097946, 0.01665649190545082, -0.0020273609552532434, 0.038390256464481354, -0.11150132864713669, -0.09165588021278381, -0.0008546974277123809, -0.09804981201887131, 0.07075214385986328, -0.010825449600815773, -0.13823743164539337, -0.021283922716975212, -0.03096693940460682, -0.05728892982006073, -0.06965175271034241, -0.013842128217220306, -0.001776662771590054, 0.0005930926417931914, 0.156104177236557, 0.2342488169670105, 0.017079008743166924, -0.014825723133981228, -0.08245309442281723, -0.04170599579811096, 0.03755335509777069, 0.006048708688467741, -0.04633535072207451, -0.01656216010451317, 0.12343219667673111, 0.02747504599392414, -0.13168294727802277, 0.0583685077726841, -0.15777994692325592, 0.0038177769165486097, -0.13014504313468933, -0.029025651514530182, 0.004560397006571293, 0.03073018603026867, -0.04411017522215843, 0.2140374779701233, -0.04682794585824013, -0.01739976927638054, -0.023452583700418472, -0.018363678827881813, -0.016250306740403175, -0.0005490643088705838, -0.06042579934000969, -0.09432711452245712, 0.07039553672075272, -0.005969332065433264, 0.04828936234116554, -0.12602846324443817, -0.1593744158744812, -0.07189800590276718, 0.022471720352768898, -0.091036356985569, -0.002313310047611594, -0.06570116430521011, -0.009494940750300884, 0.07732376456260681, -0.02579706907272339, -0.07700841873884201, -0.02008889988064766, 0.03221132233738899, 0.06397217512130737, 0.1194918304681778, 0.0018619323382154107, 0.01733708567917347, -0.17771311104297638, -0.0052134450525045395, -0.07178003340959549, 0.08329800516366959, -0.01189703494310379, 0.12907743453979492, -0.005112852901220322, -0.09482768923044205, 
0.051771629601716995, 0.03195974603295326, -0.023375313729047775, 0.196397602558136, -0.17847026884555817, -0.12323517352342606, 0.09864558279514313, -0.10084472596645355, -0.10639498382806778, 0.1127878800034523, -0.042395059019327164, 0.16550220549106598, 0.15740709006786346, 0.12327159196138382, -0.04485765099525452, -0.07243424654006958, -0.04470157250761986, 0.08437474071979523, -0.08880948275327682, 0.0558350570499897, 0.007468458265066147, 0.08110688626766205, -0.17044022679328918, 0.07223448157310486, 0.07861416786909103, 0.019227437674999237, -0.0779411569237709, -0.05806121602654457, -0.040060754865407944, -0.034032437950372696, 0.08533761650323868, 0.1314052790403366, 0.023071525618433952, -0.07469552010297775, -0.08264964073896408, -0.07507466524839401, 0.04773494973778725, -0.07192597538232803, 0.0049599651247262955, -0.06485873460769653, 0.14174331724643707, 0.003624152159318328, -0.02291444130241871, -0.16956846415996552, 0.12736137211322784, -0.01992061361670494, 0.16809651255607605, -0.022505126893520355, 0.051839299499988556, 0.060149721801280975, -0.10557518899440765, -0.033884041011333466, 0.016679923981428146, 0.07930397987365723, 0.03274815157055855, -0.06488685309886932, -0.17080645263195038, 0.09848436713218689, -0.01972164772450924, 0.15816503763198853, -0.1216600090265274, 0.00005036448419559747, 0.012780433520674706, -0.014404522255063057, -0.05381641909480095, 0.0645962581038475, -0.021278316155076027, -0.002524263458326459, -0.06038383021950722, 0.038480788469314575, 0.10749630630016327, -0.018692681565880775, -0.14593760669231415, 0.14068078994750977, -0.1257006675004959, -0.056324463337659836, 0.11166618019342422, -0.16593730449676514, -0.10540325194597244, -0.04309896379709244, -0.06650663167238235, 0.0747302994132042, 0.017891118302941322, 0.043123599141836166, 0.14217588305473328, -0.01557839848101139, 0.04033351317048073, -0.04741101711988449, 0.011829276569187641, 0.027929114177823067, -0.12368837743997574, -0.030082248151302338, 0.15975573658943176, -0.05943246930837631, -0.1581951528787613, 0.07993508875370026, 0.15583761036396027, -0.053618162870407104, 0.11416646838188171, 0.0051039173267781734, -0.019669506698846817, -0.039463117718696594, -0.10371555387973785, -0.07898777723312378, 0.02484639547765255, -0.10687406361103058, -0.04585040733218193, 0.04517755284905434, -0.02330878935754299, -0.012941739521920681, -0.12448259443044662, -0.02686939388513565, 0.06598653644323349, 0.038905512541532516, 0.004781720228493214, 0.0691438540816307, 0.040835555642843246, 0.1122651994228363, -0.004465161822736263, -0.07060430943965912, 0.017317987978458405, 0.021290654316544533, -0.14492745697498322, 0.18820823729038239, -0.07178483158349991, -0.30003267526626587, -0.08899220824241638, -0.045682184398174286, -0.02196180261671543, 0.03245134651660919, 0.014287387020885944, -0.1681082844734192, -0.031553078442811966, -0.028247108682990074, 0.07140585035085678, -0.022411318495869637, -0.025933589786291122, 0.0076162065379321575, -0.012088957242667675, -0.05645372346043587, -0.029671769589185715, -0.07626509666442871, -0.11648423969745636, -0.018012620508670807, 0.09149883687496185, -0.19008027017116547, 0.03342941403388977, 0.2577474117279053, 0.0035853609442710876, 0.011465786024928093, -0.07777588069438934, 0.12590546905994415, -0.09815159440040588, -0.06442281603813171, 0.09842623770236969, -0.06660658121109009, 0.003237225813791156, 0.20163998007774353, -0.030451372265815735, -0.10720162838697433, 0.03435754403471947, 0.02886778488755226, -0.04011356085538864, 
-0.18277882039546967, -0.09694217890501022, -0.004868358373641968, 0.10114017874002457, -0.0656614825129509, -0.017486510798335075, 0.07626713812351227, 0.0829591378569603, 0.05222751200199127, -0.18173666298389435, -0.07593300938606262, 0.08302268385887146, 0.27868199348449707, -0.07476923614740372, 0.08118672668933868, -0.011333958245813847, -0.08168588578701019, 0.09897258132696152, -0.1308194249868393, 0.06600706279277802, -0.004630828741937876, -0.03869835287332535, 0.022173207253217697, 0.04258662834763527, 0.007440523710101843, 0.09162552654743195, -0.08093508332967758, -0.06566974520683289, -0.08577626943588257, 0.013469778001308441, -0.14496150612831116, 0.06331773847341537, 0.01599181443452835, 0.10205432772636414, -0.20092423260211945, -0.09776571393013, 0.0998140424489975, 0.03438752517104149, 0.15633557736873627, -0.22939153015613556, -0.09700937569141388, 0.0350734107196331, -0.038289401680231094, -0.03171331062912941, 0.05568322911858559, -0.10834791511297226, -0.13419948518276215, 0.11344484984874725, 0.02308621257543564, 0.076853446662426, -0.062400076538324356, 0.08412306010723114, -0.22425928711891174, -0.11703027784824371, 0.02573641575872898, 0.1238103061914444, -0.21768926084041595, 0.21434101462364197, 0.02095182240009308, -0.02327636443078518, -0.08366028964519501, -0.05063496530056, 0.044217318296432495, 0.11516807228326797, 0.11513324081897736, -0.0059356833808124065, 0.0858602374792099, -0.08929771929979324, -0.04001060128211975, 0.05056856572628021, 0.019360926002264023, 0.026218362152576447, 0.0600900873541832, 0.013491596095263958, 0.04929136857390404, 0.019598737359046936, 0.1478164941072464, -0.04294643923640251, -0.01742594502866268, -0.010817771777510643, 0.04976284131407738, -0.051847219467163086, 0.06907624751329422, -0.11825784295797348, -0.03946685418486595, 0.22496430575847626, 0.11842932552099228, -0.01753094233572483, -0.10149873793125153, 0.10123982280492783, 0.03091823309659958, -0.042722370475530624, -0.1029796451330185, -0.04328014329075813, 0.07304517924785614, -0.01917605847120285, -0.06782855093479156, 0.11237264424562454, -0.035725269466638565, -0.1292945146560669, -0.024095993489027023, 0.12949922680854797, 0.055891893804073334, 0.1215326339006424, 0.06894514709711075, -0.005288387183099985, -0.12648612260818481, -0.0715789645910263, 0.05632712319493294, 0.09431669861078262, -0.032574158161878586, 0.034641146659851074, 0.010139163583517075, -0.08590687066316605, -0.12285404652357101, 0.0059237671084702015, 0.22743478417396545, 0.15741506218910217, -0.033468618988990784, 0.06827694177627563, 0.09405317157506943, -0.08874734491109848, -0.2350790798664093, -0.0036178871523588896, 0.00046225902042351663, 0.053572189062833786, 0.03752068057656288, 0.013782513327896595, 0.08569932729005814, -0.07408616691827774, -0.024406759068369865, -0.14348025619983673, -0.05657622218132019, -0.043764371424913406, 0.2644229531288147, 0.02279738336801529, 0.4042169451713562, -0.09957221895456314, 0.0022569610737264156, -0.09878835082054138, -0.1312715709209442, 0.13927897810935974, -0.0320226326584816, 0.06967373937368393, -0.02191927284002304, 0.25183311104774475, 0.04630707576870918, 0.0020703652407974005, 0.08657416701316833, 0.008316513150930405, 0.029648635536432266, -0.08476812392473221, -0.08852998167276382, -0.045208584517240524, 0.004784608259797096, 0.1076890081167221, -0.05455109104514122, 0.013530581258237362, -0.1824178695678711, -0.10478346794843674, -0.10643530637025833, 0.016499442979693413, -0.012508172541856766, -0.02932741679251194, 
-0.11382580548524857, 0.029695414006710052, 0.043664198368787766, -0.053552236407995224, -0.0383804552257061, -0.12596409022808075, 0.051774460822343826, 0.09847872704267502, 0.21898552775382996, 0.020321235060691833, 0.04319535568356514, 0.04000113904476166, -0.06201794371008873, 0.03680604323744774, -0.21081911027431488, 0.030178777873516083, 0.13492342829704285, -0.03888748213648796, 0.1407804787158966, 0.10119141638278961, -0.06504442542791367, 0.016286686062812805, 0.06602577865123749, -0.11689333617687225, 0.02515796385705471, -0.04167969897389412, -0.10582031309604645, -0.08462726324796677, -0.055238235741853714, 0.10435697436332703, -0.06686702370643616, -0.009088655933737755, -0.020521273836493492, 0.0371038094162941, -0.015306572429835796, 0.04481132701039314, 0.012648937292397022, -0.04328588396310806, -0.08262767642736435, -0.035975854843854904, -0.07247472554445267, -0.17994725704193115, 0.10537277162075043, -0.01950337179005146, -0.06983533501625061, -0.05821467936038971, 0.08072721213102341, 0.2509264349937439, -0.19903795421123505, -0.07900111377239227, -0.05623204633593559, -0.22149446606636047, 0.03555773198604584, 0.15287138521671295, 0.13664059340953827, 0.07204143702983856, -0.11936599761247635, -0.0439593642950058, -0.04239175096154213, 0.041632357984781265, 0.12115650624036789, -0.07681579142808914, -0.000779671419877559, 0.005984514486044645, 0.002573844976723194, 0.09444453567266464, -0.09584103524684906, -0.0693143829703331, -0.10874496400356293, -0.0005131616489961743, -0.17092937231063843, -0.04716615378856659, -0.056337788701057434, -0.017547769472002983, 0.06574752181768417, 0.01590661145746708, 0.008185548707842827, -0.044647131115198135, -0.063967764377594, 0.09540213644504547, 0.028908781707286835, 0.06025657430291176, -0.08562736958265305, -0.03500210866332054, 0.12001367658376694, -0.0233418308198452, 0.10306482762098312, 0.10788020491600037, -0.03027842938899994, 0.15016289055347443, -0.2864997386932373, 0.03456965833902359, 0.12311229109764099, -0.04969102889299393, 0.03814145177602768, -0.09324130415916443, 0.00017333158757537603, 0.07916771620512009, 0.021504435688257217, 0.07976065576076508, 0.03075690194964409, -0.008347132243216038, 0.045632850378751755, 0.11804008483886719, -0.17813608050346375, -0.051170215010643005, -0.014941372908651829, -0.06016816198825836, 0.04908591881394386, 0.11417791992425919, -0.048879463225603104, 0.027467483654618263, -0.08082135021686554, 0.013758125714957714, -0.002603835891932249, -0.027945462614297867, -0.13757134974002838, -0.09698235243558884, 0.045501358807086945, 0.03466888517141342, 0.17278562486171722, 0.06828699260950089, 0.07247013598680496, -0.0372452586889267, 0.10535045713186264, 0.10366775095462799, -0.03646600618958473, 0.07408291101455688, 0.05939684435725212, -0.050002746284008026, -0.004857935942709446, 0.060416411608457565, 0.06696141511201859, 0.03811456263065338, 0.17013441026210785, -0.025199420750141144, 0.0939381793141365, 0.14544665813446045, -0.056837327778339386, 0.15766116976737976, 0.049342796206474304, -0.11548256129026413, -0.07039577513933182, 0.07862936705350876, 0.03591643273830414, 0.160745307803154, 0.12356042861938477, -0.0160747691988945, 0.07242835313081741, -0.11434492468833923, -0.09981761127710342, -0.20610372722148895, -0.11506997048854828, -0.08795611560344696, -0.10872188955545425, 0.05483059585094452, -0.1360541582107544, -0.04118528962135315, -0.052318889647722244, 0.06056788191199303, -0.10484075546264648, 0.05075925216078758, 0.017629368230700493, -0.0676826536655426, 
0.15656471252441406, -0.033825166523456573, 0.0021005882881581783, -0.018174681812524796, 0.0853070616722107, -0.03694168105721474, -0.02279348112642765, 0.06768076866865158, 0.04038530960679054, -0.05107146129012108, -0.03711077570915222, -0.10199050605297089, -0.08387402445077896, -0.013860736042261124, 0.03207544982433319, 0.009350542910397053, 0.05443903058767319, -0.016289152204990387, 0.02888631820678711, -0.01128415483981371, 0.11677610129117966, -0.05167364329099655, 0.051331259310245514, -0.09509355574846268, 0.3333888649940491, -0.05174610763788223, 0.02857082709670067, -0.02571849711239338, -0.0014740002807229757, -0.006489710882306099, 0.27166303992271423, 0.12488564103841782, -0.0674646645784378, -0.007279340177774429, -0.03514484688639641, 0.043430618941783905, 0.06874227523803711, 0.037629954516887665, 0.07292278856039047, 0.20924268662929535, -0.14121004939079285, 0.07973019778728485, -0.0829000398516655, -0.07338888943195343, 0.05530434101819992, 0.01248928066343069, 0.12588314712047577, -0.025680767372250557, -0.07956413924694061, 0.14345143735408783, -0.15120716392993927, 0.07262318581342697, 0.06849077343940735, -0.10676411539316177, -0.04298226162791252, 0.0040047611109912395, 0.09499562531709671, 0.10928940027952194, 0.10763926059007645, -0.001164881745353341, -0.022744925692677498, 0.10192606598138809, -0.00039012610795907676, -0.19913773238658905, 0.03873838484287262, 0.15367478132247925, -0.007757942657917738, -0.04819326475262642, -0.028473231941461563, 0.1250787377357483, 0.10395914316177368, 0.012079743668437004, 0.010320753790438175, 0.12092803418636322, 0.01944996416568756, -0.0549575611948967, 0.05977844446897507, 0.11672999709844589, 0.04552806541323662, -0.05315880849957466, 0.09493446350097656, -0.1852264553308487, 0.03555363789200783, -0.09194609522819519, -0.05587353557348251, -0.10031905025243759, 0.09540534019470215, -0.07357264310121536, 0.05452784150838852, 0.1763780266046524, 0.004459199961274862, -0.014849084429442883, -0.07170037925243378, -0.026312991976737976, -0.07041984051465988, -0.07453347742557526, 0.030379436910152435, -0.05953309312462807, -0.012774056755006313, 0.2666732966899872, -0.010467331856489182, -0.2747441232204437, -0.010907744988799095, -0.02664041705429554, -0.029935961589217186, -0.020636985078454018, 0.036903806030750275, -0.03244408592581749, 0.08590483665466309, -0.032060157507658005, -0.12374801188707352, 0.07078083604574203, 0.16706688702106476, -0.05964409187436104, -0.10595928877592087 ]
null
null
transformers
### Introduction:
This is a text-classification model. You can use it to check whether a sentence contains any emotion.

### Label Explanation:
LABEL_1: Non-neutral (contains some emotion)

LABEL_0: Neutral (contains no emotion)

### Usage:
```python
>>> from transformers import pipeline
>>> nnc = pipeline('text-classification', model='Osiris/neutral_non_neutral_classifier')
>>> nnc("Hello, I'm a good model.")
```

### Accuracy:
We reach 93.98% accuracy on the validation dataset and 91.92% on the test dataset.
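The snippet above returns the raw LABEL_0 / LABEL_1 ids. As a minimal, hedged sketch (not part of the original card), the output can be mapped to the readable names from the label explanation above; the example sentence and printed score are illustrative only:

```python
from transformers import pipeline

# Minimal sketch: run the classifier and translate the raw label ids into the
# readable names documented above. The mapping dict is taken from the card;
# the example sentence is just an illustration.
nnc = pipeline('text-classification', model='Osiris/neutral_non_neutral_classifier')
label_names = {"LABEL_0": "neutral", "LABEL_1": "non-neutral"}

result = nnc("I absolutely love this!")[0]   # e.g. {'label': 'LABEL_1', 'score': 0.99}
print(label_names[result["label"]], round(result["score"], 4))
```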
{}
text-classification
Osiris/neutral_non_neutral_classifier
[ "transformers", "pytorch", "roberta", "text-classification", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #roberta #text-classification #autotrain_compatible #endpoints_compatible #region-us
### Introduction: This model belongs to text-classification. You can check whether the sentence contains any emotion. ### Label Explanation: LABEL_1: Non Neutral (have some emotions) LABEL_0: Neutral (have no emotion) ### Usage: ### Accuracy: We reach 93.98% for validation dataset, and 91.92% for test dataset.
[ "### Introduction:\nThis model belongs to text-classification. You can check whether the sentence consists any emotion.", "### Label Explaination:\nLABEL_1: Non Neutral (have some emotions)\n\nLABEL_0: Neutral (have no emotion)", "### Usage:", "### Accuracy:\nWe reach 93.98% for validation dataset, and 91.92% for test dataset." ]
[ "TAGS\n#transformers #pytorch #roberta #text-classification #autotrain_compatible #endpoints_compatible #region-us \n", "### Introduction:\nThis model belongs to text-classification. You can check whether the sentence consists any emotion.", "### Label Explaination:\nLABEL_1: Non Neutral (have some emotions)\n\nLABEL_0: Neutral (have no emotion)", "### Usage:", "### Accuracy:\nWe reach 93.98% for validation dataset, and 91.92% for test dataset." ]
[ 37, 27, 32, 5, 26 ]
[ "passage: TAGS\n#transformers #pytorch #roberta #text-classification #autotrain_compatible #endpoints_compatible #region-us \n### Introduction:\nThis model belongs to text-classification. You can check whether the sentence consists any emotion.### Label Explaination:\nLABEL_1: Non Neutral (have some emotions)\n\nLABEL_0: Neutral (have no emotion)### Usage:### Accuracy:\nWe reach 93.98% for validation dataset, and 91.92% for test dataset." ]
[ -0.07817394286394119, 0.05780757963657379, -0.0029483214020729065, 0.03544383496046066, 0.10621963441371918, 0.04256858304142952, -0.00778046902269125, 0.09806018322706223, 0.047896966338157654, 0.03849336877465248, 0.10247829556465149, 0.18463344871997833, -0.011804779060184956, 0.029840588569641113, -0.14909182488918304, -0.22989311814308167, -0.014712799340486526, 0.07904250919818878, 0.21993902325630188, 0.17106306552886963, 0.07862221449613571, -0.0849166139960289, 0.07858511060476303, -0.020633233711123466, -0.09653119742870331, 0.0052728429436683655, 0.042172808200120926, -0.030986886471509933, 0.09844689816236496, 0.006556719541549683, 0.04447184130549431, 0.0625266507267952, -0.04031570628285408, -0.22398081421852112, 0.023120520636439323, -0.05759653449058533, 0.036318302154541016, 0.024508807808160782, -0.041045181453228, -0.17313066124916077, 0.1941913664340973, -0.019410571083426476, 0.05323771759867668, 0.05881427973508835, -0.07996753603219986, -0.12996533513069153, 0.005083439406007528, 0.08066663891077042, 0.06296920776367188, 0.12362419813871384, -0.0879039466381073, 0.25961363315582275, -0.14240160584449768, 0.08786244690418243, 0.158608078956604, -0.15390165150165558, -0.04080289602279663, 0.016578814014792442, -0.009869435802102089, -0.06581930816173553, -0.12374803423881531, 0.07132400572299957, 0.11423450708389282, 0.04982254281640053, -0.034090690314769745, -0.06520765274763107, 0.00596381351351738, -0.018177729099988937, -0.06975993514060974, -0.03269576653838158, 0.24168364703655243, 0.040490955114364624, -0.03730609640479088, -0.07730574160814285, -0.014996557496488094, -0.0033238385803997517, 0.020098555833101273, -0.03366474434733391, 0.0036776037886738777, -0.007001831196248531, -0.029717497527599335, 0.09476574510335922, -0.13968311250209808, -0.004326589405536652, -0.15600068867206573, 0.19348643720149994, -0.04191650450229645, 0.02059890516102314, 0.005733050871640444, 0.041669812053442, -0.10610537976026535, -0.14297132194042206, -0.008252186700701714, -0.10370314121246338, 0.046588242053985596, -0.016517331823706627, -0.158965602517128, -0.02334189973771572, -0.00305058341473341, -0.030390771105885506, -0.043040309101343155, -0.004524143412709236, -0.006422386504709721, -0.004673946648836136, 0.1715969741344452, 0.1963914930820465, 0.01829572021961212, -0.003344564000144601, -0.07484868168830872, -0.03721645474433899, 0.058711305260658264, 0.008255094289779663, -0.05443577468395233, -0.011888707987964153, 0.10693049430847168, 0.026996124535799026, -0.10417890548706055, 0.0653473436832428, -0.14704447984695435, -0.002098750090226531, -0.11663536727428436, -0.04601751267910004, -0.005731841549277306, 0.03849881887435913, -0.02656063437461853, 0.20698414742946625, -0.045582812279462814, -0.005645006895065308, -0.01322135329246521, -0.047735489904880524, -0.022749528288841248, -0.007505813613533974, -0.05756605789065361, -0.08020427823066711, 0.06572170555591583, -0.020272981375455856, 0.04089228808879852, -0.14403386414051056, -0.1501270830631256, -0.07171617448329926, 0.033175740391016006, -0.11011743545532227, 0.002249137032777071, -0.07241442054510117, -0.009533269330859184, 0.0632813423871994, -0.022175651043653488, -0.07499738037586212, -0.023088814690709114, 0.04166106879711151, 0.08247236162424088, 0.13464927673339844, 0.00780764315277338, 0.022125961259007454, -0.21294903755187988, -0.022936586290597916, -0.0580255351960659, 0.09581080824136734, -0.03764082491397858, 0.10261388123035431, 0.0035931807942688465, -0.0850033238530159, 
0.053975909948349, 0.016660025343298912, -0.04058864712715149, 0.2170446217060089, -0.19109153747558594, -0.11556405574083328, 0.11872851848602295, -0.10290758311748505, -0.09858103096485138, 0.12380492687225342, -0.03948913514614105, 0.12025836110115051, 0.13724882900714874, 0.13550473749637604, -0.03148551285266876, -0.04185483604669571, -0.046523064374923706, 0.09441672265529633, -0.03494194149971008, 0.09423615038394928, 0.006276941392570734, 0.06754913926124573, -0.16939988732337952, 0.08307842910289764, 0.0907868817448616, -0.020346375182271004, -0.0823034718632698, -0.07281612604856491, -0.04897434636950493, -0.04712524637579918, 0.13409000635147095, 0.13607367873191833, 0.02422526106238365, -0.09131588786840439, -0.09041302651166916, -0.06036181375384331, 0.05327094346284866, -0.0537647008895874, -0.0010642659617587924, -0.046587519347667694, 0.14015699923038483, -0.02007376030087471, -0.02999994345009327, -0.18597884476184845, 0.07580775022506714, -0.012435336597263813, 0.19121290743350983, 0.03170793130993843, 0.09965863823890686, 0.053421489894390106, -0.07601285725831985, -0.05834602937102318, 0.020961644127964973, 0.07549593597650528, 0.044955167919397354, -0.03427717834711075, -0.1890379637479782, 0.09922078996896744, -0.050664272159338, 0.1392170637845993, -0.10575687885284424, -0.003446861170232296, 0.01220298744738102, 0.037757374346256256, -0.05245703458786011, 0.05830579251050949, 0.00030390455503948033, -0.009210661984980106, -0.07440885156393051, 0.03259189799427986, 0.10727610439062119, -0.015739787369966507, -0.11339262872934341, 0.08978106826543808, -0.07847413420677185, -0.02618902362883091, 0.12971045076847076, -0.16483373939990997, -0.11942610144615173, -0.012927014380693436, -0.0639103427529335, 0.07716748863458633, 0.003747875802218914, 0.05076749622821808, 0.11653180420398712, -0.020667413249611855, 0.039095740765333176, -0.059736136347055435, 0.01863950490951538, 0.028844159096479416, -0.1367933452129364, -0.0406215563416481, 0.16544605791568756, -0.05770521238446236, -0.18298682570457458, 0.08087871968746185, 0.1423923373222351, -0.06622596085071564, 0.13252098858356476, -0.0005489597097039223, -0.014343863353133202, -0.044395338743925095, -0.10811246931552887, -0.07820131629705429, 0.027589887380599976, -0.1237727701663971, -0.026926876977086067, 0.062294501811265945, -0.020782891660928726, -0.012795782648026943, -0.11103340983390808, -0.02045104093849659, 0.07551952451467514, 0.04003741964697838, 0.022501608356833458, 0.0832037404179573, 0.031016826629638672, 0.11354046314954758, -0.008620815351605415, -0.06348883360624313, 0.033328600227832794, 0.018794210627675056, -0.14381033182144165, 0.18329526484012604, -0.06668977439403534, -0.27575236558914185, -0.05600805953145027, -0.03511423245072365, -0.07300565391778946, 0.03438718616962433, 0.023338310420513153, -0.1529867798089981, -0.033003564924001694, -0.042370084673166275, 0.028361903503537178, -0.006231903564184904, -0.004595191217958927, -0.005291575565934181, -0.011370507068932056, -0.07009134441614151, -0.001652441918849945, -0.07612679153680801, -0.10337427258491516, -0.025799183174967766, 0.10274588316679001, -0.16664338111877441, 0.04568763077259064, 0.26431214809417725, -0.015420248731970787, 0.021474815905094147, -0.06821046769618988, 0.13927553594112396, -0.07484360784292221, -0.05201268196105957, 0.0956992357969284, -0.030012216418981552, 0.007000387646257877, 0.20778566598892212, -0.02736271359026432, -0.09488938003778458, 0.044127319008111954, 0.0345950685441494, -0.039750754833221436, 
-0.18143707513809204, -0.11386387795209885, -0.003095544409006834, 0.0726253017783165, -0.06540533900260925, -0.035567816346883774, 0.08667206019163132, 0.09182730317115784, 0.04837622493505478, -0.1986527144908905, -0.11158211529254913, 0.07561984658241272, 0.2975884974002838, -0.06894905865192413, 0.09406723827123642, -0.01213387306779623, -0.07304207980632782, 0.08946540951728821, -0.10924200713634491, 0.038747671991586685, -0.00726141594350338, -0.04056982323527336, 0.030344562605023384, 0.05400768667459488, -0.002148108556866646, 0.0597664974629879, -0.03887626901268959, -0.056958578526973724, -0.08966181427240372, -0.005288020707666874, -0.12544383108615875, 0.06667102873325348, 0.05603824555873871, 0.10461123287677765, -0.2157570868730545, -0.10059867799282074, 0.11641235649585724, 0.03401964157819748, 0.18149244785308838, -0.21816253662109375, -0.09994703531265259, 0.03255128115415573, -0.024709224700927734, -0.027632681652903557, 0.046948742121458054, -0.11614683270454407, -0.12360353767871857, 0.13100218772888184, 0.031558599323034286, 0.059643082320690155, -0.05315731093287468, 0.08035137504339218, -0.2425098717212677, -0.09555724263191223, 0.02694268524646759, 0.13853201270103455, -0.2598012685775757, 0.23184530436992645, 0.01846565678715706, -0.013645380735397339, -0.08256707340478897, -0.0747818648815155, 0.033825382590293884, 0.13770371675491333, 0.11266064643859863, -0.01680881902575493, 0.07026796042919159, -0.05469227209687233, -0.06314490735530853, 0.06723709404468536, 0.030875392258167267, 0.0036141392774879932, 0.05685581639409065, 0.019966943189501762, 0.02218908816576004, 0.005385733209550381, 0.17410579323768616, -0.08633290976285934, -0.017615940421819687, 0.005497251637279987, 0.04605359956622124, -0.12431690096855164, 0.08264822512865067, -0.14298373460769653, -0.070027194917202, 0.248585507273674, 0.08781667798757553, -0.04678158089518547, -0.08652983605861664, 0.04587605223059654, 0.03242569416761398, -0.04124891757965088, -0.09018666297197342, -0.05259224399924278, 0.10045021027326584, -0.027307555079460144, -0.06767471134662628, 0.09178764373064041, -0.047519393265247345, -0.14010968804359436, -0.03306684270501137, 0.1281953901052475, 0.04284857586026192, 0.11988140642642975, 0.07922206819057465, 0.0012828786857426167, -0.12378375977277756, -0.07361331582069397, 0.059062931686639786, 0.10335566103458405, 0.003590366803109646, 0.06021195277571678, 0.002567790448665619, -0.1046563908457756, -0.1360660046339035, -0.013080980628728867, 0.2154030203819275, 0.18433347344398499, -0.04674791544675827, 0.055740125477313995, 0.14647795259952545, -0.09199795871973038, -0.2447812706232071, 0.021700745448470116, -0.02149384841322899, 0.04682344198226929, 0.04036593437194824, 0.008676236495375633, 0.05868079513311386, -0.09210898727178574, -0.026488129049539566, -0.1252175271511078, -0.005129847675561905, -0.056016720831394196, 0.2955743074417114, 0.010297955013811588, 0.3474605679512024, -0.08514535427093506, -0.016473006457090378, -0.09273393452167511, -0.12220528721809387, 0.18545781075954437, -0.0376388356089592, 0.10025034844875336, -0.033409420400857925, 0.2052684873342514, 0.047289781272411346, -0.00979264173656702, 0.0934574156999588, 0.017659857869148254, 0.018908847123384476, -0.06826157122850418, -0.0688745528459549, -0.010560227558016777, -0.0009133520070463419, 0.0970783680677414, -0.055791161954402924, 0.04762428253889084, -0.12538617849349976, -0.09780935943126678, -0.12988373637199402, 0.03108198009431362, -0.01369619369506836, -0.04717705398797989, 
-0.125356987118721, 0.043090660125017166, 0.05420689284801483, -0.05756760761141777, -0.0632118284702301, -0.14878800511360168, 0.057413019239902496, 0.20479290187358856, 0.2120358645915985, -0.01858840137720108, 0.06051065772771835, 0.007800050545483828, -0.08947759121656418, 0.036138132214546204, -0.17242304980754852, 0.01821122318506241, 0.14403915405273438, -0.01541210152208805, 0.12109272927045822, 0.09020295739173889, -0.06879301369190216, -0.014888632111251354, 0.058381762355566025, -0.12449032813310623, 0.031077802181243896, -0.04919794946908951, -0.1312147080898285, -0.08823510259389877, -0.047212984412908554, 0.095622718334198, -0.08841396868228912, -0.0046347444877028465, -0.012637047097086906, 0.0023190942592918873, -0.020555023103952408, 0.04345544055104256, 0.007528306916356087, -0.008352293632924557, -0.08270329236984253, 0.004362734034657478, -0.0749502032995224, -0.15432193875312805, 0.09894642233848572, -0.04033733531832695, -0.0808161199092865, -0.08560508489608765, 0.05646608769893646, 0.2085483968257904, -0.21085041761398315, -0.058370109647512436, -0.08109387010335922, -0.2037365436553955, 0.03842494636774063, 0.2008093297481537, 0.15219774842262268, 0.06654727458953857, -0.11983197182416916, -0.054229848086833954, -0.05076364427804947, 0.04945547133684158, 0.14025360345840454, -0.04811249300837517, -0.01877581514418125, 0.03972422704100609, -0.009025591425597668, 0.10376680642366409, -0.10364269465208054, -0.06703374534845352, -0.10137148201465607, 0.0005730935372412205, -0.18252401053905487, -0.031613145023584366, -0.031909409910440445, -0.022823520004749298, 0.05898790434002876, 0.001416027545928955, 0.011533518321812153, -0.043520260602235794, -0.06026178225874901, 0.09219618886709213, 0.032544881105422974, 0.060353148728609085, -0.07894144207239151, -0.03523818030953407, 0.10508421063423157, -0.02310214191675186, 0.08436070382595062, 0.07536353915929794, -0.04148644953966141, 0.13461266458034515, -0.2915789484977722, 0.028300143778324127, 0.10862552374601364, -0.0242772214114666, 0.04067421704530716, -0.09900728613138199, -0.006856958381831646, 0.06790584325790405, 0.03502470627427101, 0.0830204114317894, 0.056791987270116806, -0.011362707242369652, 0.024427084252238274, 0.10012923926115036, -0.1426856964826584, -0.05723001807928085, -0.030921034514904022, -0.05256362631917, 0.04057199880480766, 0.1144947037100792, -0.05876810848712921, 0.04357760027050972, -0.09257577359676361, 0.02874520793557167, -0.0240674689412117, -0.036839257925748825, -0.14137199521064758, -0.07248955965042114, 0.053467027842998505, 0.023357946425676346, 0.18987199664115906, 0.0536583848297596, 0.07496261596679688, -0.03719843178987503, 0.0850757360458374, 0.10724955797195435, -0.028167828917503357, 0.11508328467607498, 0.06546321511268616, -0.05772068724036217, -0.026619713753461838, 0.07399909198284149, 0.052149705588817596, 0.041908230632543564, 0.20895236730575562, 0.008099738508462906, 0.07196728140115738, 0.14153170585632324, -0.024174757301807404, 0.15121129155158997, 0.03326991945505142, -0.12016259133815765, -0.024062972515821457, 0.097537562251091, 0.027015522122383118, 0.11638719588518143, 0.1194566935300827, -0.04068232327699661, 0.07438614964485168, -0.11168499290943146, -0.09597300738096237, -0.1928815096616745, -0.14836642146110535, -0.09742866456508636, -0.08371853828430176, 0.04479360952973366, -0.14418481290340424, -0.05114126577973366, -0.0749366357922554, 0.06899988651275635, -0.10817977786064148, 0.08827885240316391, -0.026300448924303055, -0.05525529384613037, 
0.1677754670381546, -0.027338063344359398, -0.030694179236888885, -0.0068738460540771484, 0.05376946181058884, -0.0285932794213295, -0.004881837405264378, 0.037923671305179596, 0.03217589855194092, -0.06610792130231857, -0.047806546092033386, -0.0908481702208519, -0.08776503801345825, -0.0020077619701623917, 0.0350535623729229, 0.051208898425102234, 0.06699354201555252, -0.0034089433029294014, 0.013529004529118538, 0.0007732722442597151, 0.12027899920940399, -0.06726759672164917, 0.02539125084877014, -0.0994495153427124, 0.3568602502346039, -0.06908753514289856, 0.03975912928581238, -0.007197429891675711, -0.013718296773731709, -0.016359440982341766, 0.2560424208641052, 0.12326530367136002, -0.07542672753334045, -0.003148553427308798, -0.016919435933232307, 0.04054538533091545, 0.019335564225912094, 0.034229692071676254, 0.05861060693860054, 0.18554703891277313, -0.14150996506214142, 0.04611692205071449, -0.06450651586055756, -0.0762723982334137, 0.03735312074422836, -0.0034541510976850986, 0.1338992714881897, -0.02513747103512287, -0.07238075137138367, 0.11902892589569092, -0.1712799370288849, 0.06516970694065094, 0.061566002666950226, -0.11588257551193237, -0.07243502885103226, 0.002458555856719613, 0.01691684126853943, 0.0971546471118927, 0.09835933148860931, -0.012053840793669224, -0.015388799831271172, 0.11598469316959381, -0.011211580596864223, -0.15235817432403564, 0.032475024461746216, 0.14130116999149323, 0.01750189997255802, -0.02281028777360916, -0.029602477326989174, 0.14391449093818665, 0.11058719456195831, 0.022428356111049652, 0.025732610374689102, 0.1489851474761963, 0.03350447118282318, -0.047487810254096985, 0.07207785546779633, 0.13833682239055634, 0.04516205936670303, -0.02555399388074875, 0.10746099799871445, -0.1945631206035614, 0.043363336473703384, -0.07545240223407745, -0.07093781977891922, -0.10384511947631836, 0.1137593537569046, -0.07533078640699387, 0.05492743104696274, 0.1891607642173767, -0.004706402309238911, -0.02804439142346382, -0.054537318646907806, 0.01589673012495041, -0.036670658737421036, -0.07401782274246216, 0.012192349880933762, -0.07030095905065536, -0.005396174732595682, 0.2662068009376526, -0.002038888866081834, -0.29996800422668457, -0.04260655492544174, -0.009180496446788311, -0.019319407641887665, -0.02318379282951355, 0.04059810936450958, -0.04733572155237198, 0.08847133070230484, -0.03794019669294357, -0.1240612268447876, 0.06368522346019745, 0.15927134454250336, -0.11084330826997757, -0.10767532885074615 ]
null
null
null
git lfs install
git clone https://huggingface.co/r3dhummingbird/DialoGPT-medium-joshua
{}
null
OsmyReal/Ayuda
[ "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #region-us
git lfs install git clone URL
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
[ 0.024608636274933815, -0.026205500587821007, -0.009666500613093376, -0.10395516455173492, 0.08638657629489899, 0.059816278517246246, 0.01882290467619896, 0.020661840215325356, 0.23975107073783875, -0.005599027033895254, 0.1219947561621666, 0.0015615287702530622, -0.037353623658418655, 0.03733762726187706, -0.0035912662278860807, -0.17583473026752472, 0.03876631706953049, -0.018274923786520958, 0.01843859627842903, 0.026470553129911423, -0.07776834815740585, -0.07564429938793182, 0.015296397730708122, -0.10247814655303955, -0.083692267537117, 0.11002834886312485, 0.031466204673051834, -0.019670886918902397, 0.10779199749231339, -0.04243955761194229, 0.18699054419994354, -0.011512263678014278, -0.11213519424200058, -0.2536850869655609, 0.021806683391332626, -0.01765260472893715, -0.08747660368680954, 0.01506110467016697, 0.0665089413523674, -0.09014441072940826, -0.0588928684592247, 0.0795099288225174, -0.01132340170443058, 0.04246443510055542, -0.27593839168548584, -0.12684126198291779, -0.05297930911183357, -0.1421966552734375, 0.08651168644428253, 0.04035491496324539, 0.008764253929257393, 0.15506891906261444, -0.20897391438484192, 0.004104613792151213, 0.08255259692668915, -0.2538507878780365, 0.05591634660959244, 0.17671173810958862, 0.03623908758163452, 0.18037272989749908, 0.0060391901060938835, 0.11029672622680664, 0.0716743916273117, -0.024263937026262283, -0.17590197920799255, -0.08127854019403458, -0.04696211963891983, 0.16642488539218903, -0.06727185100317001, -0.14248386025428772, 0.34701237082481384, 0.00015008423360995948, 0.009657775051891804, 0.16921205818653107, -0.059524230659008026, -0.09972117841243744, 0.07259953022003174, 0.016484731808304787, 0.018492350354790688, 0.1471305936574936, 0.16307872533798218, -0.0458691343665123, -0.13837823271751404, -0.018630273640155792, -0.22798998653888702, 0.17510560154914856, -0.03248048573732376, 0.13137903809547424, -0.27447956800460815, 0.01684025302529335, -0.2570667266845703, 0.0032130838371813297, 0.04178816080093384, -0.06004921346902847, -0.0226522795855999, -0.013265985064208508, -0.08018817007541656, 0.004899587947875261, 0.06192673370242119, 0.1266920566558838, -0.06128726154565811, 0.06128238886594772, -0.09319206327199936, 0.141696035861969, 0.07166698575019836, 0.07868369668722153, 0.13037432730197906, 0.041205424815416336, -0.07187089323997498, -0.21872246265411377, -0.0026476888451725245, -0.06275863200426102, -0.09502086788415909, -0.0020165652967989445, -0.11606067419052124, 0.17244569957256317, -0.030802514404058456, -0.09825427830219269, -0.11208184063434601, 0.09148659557104111, -0.032992321997880936, -0.03437839448451996, -0.03552987426519394, -0.020977836102247238, 0.019381176680326462, 0.04704452306032181, -0.1548958420753479, -0.005131472367793322, 0.07039852440357208, 0.11502562463283539, -0.1346137970685959, -0.003783059772104025, -0.07908964157104492, 0.03039063885807991, 0.07654735445976257, -0.16510222852230072, 0.03158547356724739, -0.1124754324555397, -0.07531405985355377, 0.002912673633545637, -0.015710093080997467, -0.016202643513679504, 0.166526660323143, -0.0020451415330171585, 0.0714716836810112, -0.026345307007431984, -0.05890209600329399, -0.11243434250354767, -0.08489254862070084, 0.05390460044145584, 0.03670717030763626, 0.03266148269176483, -0.2193479984998703, 0.014805203303694725, -0.12762966752052307, 0.1360815018415451, -0.10566820204257965, -0.04705966264009476, -0.022842247039079666, 0.20562705397605896, 0.037286072969436646, 0.08762791007757187, -0.22171171009540558, 
0.039756543934345245, -0.05404696613550186, 0.18480908870697021, -0.1502426266670227, -0.0799463614821434, 0.20813211798667908, -0.07964949309825897, -0.10115210711956024, 0.021235812455415726, 0.020391687750816345, 0.026287272572517395, 0.0766737088561058, 0.4564172327518463, -0.09766800701618195, -0.09146861732006073, 0.10178250074386597, 0.17055274546146393, -0.12427149713039398, -0.1827561855316162, 0.06446871906518936, -0.16666454076766968, -0.1973118633031845, 0.0018917324487119913, 0.09222044050693512, 0.038269978016614914, -0.07875611633062363, -0.020746968686580658, 0.06325206160545349, -0.0007678253459744155, 0.09095914661884308, 0.03755716234445572, 0.09034032374620438, -0.08716782182455063, 0.11115926504135132, -0.05017651244997978, 0.004037132486701012, 0.1343354731798172, 0.027325427159667015, -0.03223329409956932, 0.08694463223218918, -0.0485352948307991, 0.05295134335756302, -0.1662379503250122, -0.15068690478801727, 0.03398871049284935, 0.06283251196146011, 0.03186952322721481, 0.1280253529548645, 0.08141885697841644, -0.10732853412628174, 0.022690722718834877, -0.004228927195072174, 0.058398615568876266, 0.03891623765230179, 0.006107209715992212, 0.008764320984482765, 0.0961301177740097, -0.10607069730758667, -0.13589619100093842, -0.07336436957120895, -0.014715781435370445, 0.14371353387832642, -0.0302802175283432, 0.07690227776765823, -0.004240254405885935, 0.00013200697139836848, 0.06930823624134064, 0.08137880265712738, 0.016412746161222458, 0.08971183747053146, -0.05237193778157234, -0.05160155147314072, 0.10863113403320312, -0.13533565402030945, 0.17837053537368774, 0.14053137600421906, -0.20532016456127167, 0.029453208670020103, -0.06838275492191315, 0.03670361638069153, -0.008162540383636951, 0.0975119024515152, -0.08272241055965424, -0.02106042578816414, 0.013134466484189034, 0.0052274600602686405, -0.013007243163883686, 0.017682146281003952, -0.07295988500118256, -0.07787393033504486, -0.10233919322490692, 0.08436838537454605, 0.11562882363796234, -0.10282530635595322, 0.14214380085468292, 0.4384984076023102, 0.11495281755924225, 0.21582984924316406, -0.09581480920314789, -0.0412987545132637, 0.007486371789127588, 0.0001535322517156601, -0.04476691037416458, 0.08031861484050751, -0.15973517298698425, -0.038901735097169876, 0.027348900213837624, 0.07128690183162689, 0.11475157737731934, -0.14959022402763367, -0.09639324247837067, -0.00793045200407505, 0.0022841424215584993, -0.1249532699584961, 0.023905446752905846, -0.03974650055170059, 0.04015624523162842, 0.07232289016246796, -0.021535737439990044, 0.13939237594604492, -0.04166141897439957, -0.0639561116695404, 0.07585346698760986, -0.2017085999250412, -0.23179671168327332, -0.12309670448303223, -0.14680525660514832, 0.04366797208786011, 0.05154111236333847, 0.01726446859538555, -0.17635835707187653, -0.015074856579303741, 0.07706750929355621, 0.07820965349674225, -0.20886357128620148, -0.022814949974417686, -0.004290030337870121, 0.0895976573228836, -0.10227091610431671, -0.0017130117630586028, -0.04419664293527603, -0.10150232166051865, 0.0017003051470965147, 0.07279510796070099, -0.137485533952713, 0.13807645440101624, 0.21589438617229462, 0.07225540280342102, 0.07359948754310608, -0.019093448296189308, 0.09936179965734482, -0.10856141895055771, -0.16549113392829895, 0.08348225057125092, -0.06234746053814888, 0.047262318432331085, 0.17534415423870087, 0.03307317942380905, -0.13904969394207, -0.015682822093367577, -0.0402069091796875, -0.15603256225585938, -0.238995760679245, -0.09178274869918823, 
-0.1182505264878273, 0.16442428529262543, 0.0009358620154671371, 0.06651917099952698, 0.08258313685655594, -0.022042419761419296, 0.16447891294956207, -0.07379321753978729, -0.07578866183757782, -0.006978808436542749, 0.12375060468912125, -0.056660156697034836, -0.03080669604241848, -0.10566964000463486, -0.008295975625514984, 0.1151021271944046, 0.15304014086723328, 0.12214863300323486, 0.2957419455051422, 0.08268889784812927, 0.026645636186003685, 0.08958091586828232, 0.17622539401054382, 0.09495089203119278, 0.07838419824838638, -0.045413073152303696, -0.014814783819019794, 0.014317171648144722, -0.04022889584302902, 0.010141594335436821, 0.14683100581169128, -0.2679629921913147, -0.006678564939647913, -0.2710230350494385, 0.0965198427438736, -0.10913380235433578, 0.11837165057659149, -0.01015760749578476, 0.10194015502929688, 0.11082887649536133, 0.03233652561903, -0.03858073800802231, 0.16613617539405823, 0.08450309932231903, -0.11277695000171661, 0.001758623169735074, 0.03737903758883476, 0.09715615212917328, -0.02818971499800682, 0.12721189856529236, -0.11048974841833115, -0.1464834064245224, 0.013753619976341724, 0.07152791321277618, -0.15373679995536804, 0.3138748109340668, 0.012069208547472954, -0.13481520116329193, -0.01481647603213787, -0.09957809001207352, -0.006440147757530212, 0.1254177987575531, 0.09333524852991104, 0.07935678958892822, -0.2185502052307129, -0.13339371979236603, 0.05872276425361633, -0.00575496768578887, 0.22408108413219452, -0.034034017473459244, -0.11356475204229355, -0.027013886719942093, 0.04241163283586502, -0.06043251231312752, 0.08524788916110992, 0.023536119610071182, -0.08113526552915573, -0.032957352697849274, 0.05323701351881027, 0.012368366122245789, 0.00524376705288887, 0.09360801428556442, 0.020107939839363098, -0.0009265501867048442, 0.01785753294825554, 0.047885000705718994, -0.0675911232829094, -0.1984109878540039, 0.09357594698667526, -0.05215044692158699, 0.0015536568826064467, -0.08013670891523361, -0.15122665464878082, -0.08837161958217621, -0.16009655594825745, 0.12540200352668762, -0.034406669437885284, 0.12700119614601135, -0.06619787961244583, 0.17341409623622894, -0.07871770113706589, 0.04481020197272301, -0.047349292784929276, 0.050332702696323395, -0.007268077693879604, -0.07756082713603973, 0.16585899889469147, -0.15564003586769104, 0.01809087023139, 0.19572502374649048, -0.018915493041276932, 0.07177707552909851, 0.021322092041373253, -0.0636206790804863, 0.23147478699684143, 0.3014698624610901, 0.008138049393892288, 0.1665448248386383, 0.3018903136253357, -0.07466315478086472, -0.2642788887023926, -0.05505012720823288, -0.2841376066207886, -0.05371501296758652, 0.10716094076633453, -0.22523896396160126, 0.06986407935619354, 0.14383509755134583, -0.06471995264291763, 0.30228954553604126, -0.21825523674488068, 0.012589273042976856, 0.15434536337852478, -0.08868814259767532, 0.5515313148498535, -0.1133413165807724, -0.17677772045135498, -0.008122089318931103, -0.08741296827793121, 0.10602109134197235, -0.0340677872300148, 0.06877441704273224, 0.013465235009789467, 0.04797380417585373, 0.048932258039712906, -0.03111894056200981, 0.22701001167297363, 0.008710170164704323, 0.09015397727489471, -0.07378865778446198, -0.18624304234981537, 0.11639340221881866, -0.04359482601284981, -0.08891059458255768, 0.0849778801202774, -0.05942516401410103, -0.11078983545303345, 0.04663389176130295, -0.07950539886951447, -0.024862350896000862, 0.08423490077257156, -0.04678233340382576, -0.042606171220541, -0.008054176345467567, -0.1618063747882843, 
-0.0002289071271661669, 0.31360217928886414, -0.07096036523580551, 0.16695955395698547, 0.03677211329340935, 0.00038613268407061696, -0.11027684062719345, 0.030288029462099075, -0.05203165486454964, -0.021576624363660812, 0.09578979015350342, -0.11096979677677155, 0.03204701095819473, 0.14160704612731934, -0.04864364117383957, 0.05846960097551346, 0.09256096184253693, -0.0849417969584465, 0.007583672646433115, 0.17753590643405914, -0.17537221312522888, -0.1273445188999176, -0.006135711446404457, -0.09862716495990753, 0.14055661857128143, 0.04394126310944557, 0.05191568285226822, 0.16669964790344238, 0.03967129811644554, -0.029474308714270592, -0.02817419543862343, -0.1153380498290062, -0.0201893113553524, 0.040153320878744125, 0.00045633706031367183, -0.08791285753250122, 0.2262638509273529, 0.06409153342247009, -0.1328488290309906, -0.051157206296920776, 0.2161225974559784, -0.06805316358804703, -0.04911920800805092, -0.223562553524971, 0.10752306133508682, -0.07112517952919006, -0.0965060144662857, 0.05453834682703018, -0.02270081453025341, 0.005106312222778797, 0.181985542178154, 0.03941008821129799, 0.11070270836353302, 0.03738937899470329, -0.02448922023177147, 0.15798696875572205, -0.142850860953331, -0.14191335439682007, -0.025354057550430298, -0.08757315576076508, -0.13844476640224457, -0.026804137974977493, 0.1617041826248169, -0.09177309274673462, -0.14772607386112213, -0.2621181011199951, 0.10968475043773651, -0.16432365775108337, -0.10192688554525375, -0.03469514101743698, -0.08968492597341537, 0.0696166530251503, 0.030301768332719803, -0.03093348816037178, -0.06706760823726654, -0.18593791127204895, 0.0816768929362297, 0.06349513679742813, 0.045533183962106705, -0.017847947776317596, 0.0067379772663116455, 0.1720137596130371, 0.025955144315958023, 0.10040043294429779, 0.16762186586856842, 0.011397695168852806, 0.2246655523777008, -0.1671202927827835, -0.11496317386627197, 0.1336962729692459, -0.026543032377958298, 0.06762003898620605, 0.16792191565036774, -0.0772583931684494, 0.015526676550507545, -0.028136352077126503, 0.07066910713911057, -0.11003983020782471, -0.105624258518219, 0.007937257178127766, 0.02567129209637642, -0.2755882740020752, -0.005599735304713249, -0.19717298448085785, 0.14788752794265747, 0.02579621411859989, 0.03297143429517746, 0.10257530212402344, 0.10404334217309952, 0.08312062919139862, -0.0017710148822516203, 0.03226327523589134, -0.1176818460226059, 0.02753005363047123, -0.059239376336336136, -0.020663779228925705, 0.017624232918024063, 0.36952024698257446, -0.03603357449173927, -0.046802736818790436, 0.003710439894348383, 0.1307835876941681, -0.02139742486178875, 0.017395347356796265, 0.13209912180900574, 0.12607666850090027, -0.08595693111419678, -0.1504845917224884, 0.04888554662466049, -0.04565655067563057, -0.02836887165904045, 0.1464131623506546, 0.05905961990356445, 0.1050296202301979, 0.0908031314611435, -0.014463032595813274, -0.00318976235575974, 0.012856799177825451, -0.15486004948616028, 0.06223496049642563, -0.010558074340224266, 0.012565906159579754, 0.017934376373887062, 0.15238402783870697, -0.005540105979889631, 0.07739730179309845, -0.09889880567789078, 0.004208535887300968, -0.13498884439468384, -0.07913459837436676, 0.03617347031831741, -0.13393273949623108, 0.04141177982091904, -0.01871878281235695, 0.029611799865961075, 0.30386561155319214, 0.02558239921927452, -0.020639164373278618, 0.12512871623039246, -0.1214587539434433, -0.12050267308950424, -0.001594188273884356, -0.029960084706544876, 0.0791488066315651, 
-0.02633434161543846, -0.0997740775346756, -0.1001306027173996, -0.15166029334068298, -0.09759195148944855, 0.05182836204767227, -0.04993441700935364, -0.059362251311540604, -0.17634081840515137, -0.05707859992980957, -0.05147340148687363, 0.14025864005088806, -0.12263951450586319, 0.15159130096435547, -0.014490418136119843, 0.004084470681846142, 0.04405883327126503, 0.1950942426919937, -0.03644494712352753, 0.08714226633310318, 0.0154351145029068, 0.1522706001996994, -0.05119588226079941, 0.14720745384693146, -0.10931728035211563, -0.04014137014746666, -0.06710435450077057, 0.21513493359088898, 0.25630924105644226, -0.06136954948306084, -0.008937356993556023, -0.012760217301547527, 0.058654606342315674, 0.1073930487036705, 0.16049085557460785, 0.002326392102986574, 0.2802925705909729, -0.03133585304021835, 0.04815128445625305, 0.02901598811149597, 0.013607407920062542, -0.06336209923028946, 0.03397751972079277, 0.07539387792348862, -0.035039983689785004, -0.1412304788827896, 0.15837742388248444, -0.21980468928813934, 0.18157227337360382, 0.11640069633722305, -0.19996967911720276, -0.013728445395827293, -0.04882071167230606, 0.1689416468143463, -0.0856364443898201, 0.1637246012687683, -0.0903693437576294, -0.2108195722103119, -0.2056000679731369, 0.03867346793413162, -0.34623071551322937, -0.254462867975235, 0.10422009229660034, 0.1488201916217804, 0.04015883058309555, -0.018507536500692368, -0.019967829808592796, -0.018367022275924683, 0.04877542704343796, -0.0067357709631323814, 0.06014643982052803, 0.031397558748722076, -0.02988368645310402, -0.24127542972564697, -0.029804671183228493, 0.023964406922459602, -0.07093082368373871, 0.07464958727359772, -0.06874357163906097, -0.022495782002806664, 0.08059766888618469, -0.03066304884850979, 0.03298592567443848, -0.035373736172914505, -0.16326889395713806, 0.027529051527380943, 0.03900543600320816, 0.036012712866067886, 0.00634160777553916, 0.0008072225609794259, -0.03455270454287529, 0.0644603744149208, -0.16716794669628143, -0.16015739738941193, 0.14140215516090393, -0.06745140254497528, 0.2779497504234314, -0.05812826007604599, -0.0809100940823555, 0.04766704887151718, -0.03426874056458473, 0.1807648241519928, -0.07756473124027252, 0.047254521399736404, 0.12766779959201813, 0.011127962730824947, 0.03121316432952881, -0.3092964291572571, 0.11082969605922699, -0.000795336440205574, -0.006093299947679043, -0.07581598311662674 ]
null
null
transformers
# Distil-wav2vec2

This model is a distilled version of the wav2vec2 model (https://arxiv.org/pdf/2006.11477.pdf). It is 45% smaller and twice as fast as the original wav2vec2 base model.

# Evaluation results

This model achieves the following results (speed is measured for a batch size of 64):

| Model           | Size     | WER Librispeech-test-clean | WER Librispeech-test-other | Speed on CPU | Speed on GPU |
|-----------------|----------|----------------------------|----------------------------|--------------|--------------|
| Distil-wav2vec2 | 197.9 MB | 0.0983                     | 0.2266                     | 0.4006s      | 0.0046s      |
| wav2vec2-base   | 360 MB   | 0.0389                     | 0.1047                     | 0.4919s      | 0.0082s      |

# Usage

Notebook (executes seamlessly on Google Colab) at https://github.com/OthmaneJ/distil-wav2vec2
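The card points to the notebook above for full usage. As a quick, hedged sketch (not the official example), the checkpoint can also be loaded with the standard transformers ASR pipeline; `sample.wav` below is a placeholder path for any speech recording:

```python
from transformers import pipeline

# Minimal usage sketch, assuming the standard transformers ASR pipeline;
# "sample.wav" is a placeholder for any (ideally 16 kHz) speech audio file.
asr = pipeline("automatic-speech-recognition", model="OthmaneJ/distil-wav2vec2")

transcription = asr("sample.wav")  # returns a dict like {"text": "..."}
print(transcription["text"])
```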
{"language": "en", "license": "apache-2.0", "tags": ["speech", "audio", "automatic-speech-recognition"], "datasets": ["librispeech_asr"]}
automatic-speech-recognition
OthmaneJ/distil-wav2vec2
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "speech", "audio", "en", "dataset:librispeech_asr", "arxiv:2006.11477", "license:apache-2.0", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2006.11477" ]
[ "en" ]
TAGS #transformers #pytorch #wav2vec2 #automatic-speech-recognition #speech #audio #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us
Distil-wav2vec2 =============== This model is a distilled version of the wav2vec2 model (URL). It is 45% smaller and twice as fast as the original wav2vec2 base model. Evaluation results ================== This model achieves the following results (speed is measured for a batch size of 64): Usage ===== Notebook (executes seamlessly on Google Colab) at URL
[]
[ "TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #speech #audio #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n" ]
[ 76 ]
[ "passage: TAGS\n#transformers #pytorch #wav2vec2 #automatic-speech-recognition #speech #audio #en #dataset-librispeech_asr #arxiv-2006.11477 #license-apache-2.0 #endpoints_compatible #has_space #region-us \n" ]
[ -0.1075126901268959, 0.14181537926197052, -0.004167693667113781, -0.012920774519443512, 0.038351207971572876, -0.05648394301533699, 0.08520273864269257, 0.10905566066503525, 0.019722580909729004, 0.008729087188839912, 0.10640667378902435, 0.17020867764949799, 0.01243459153920412, -0.005081980489194393, -0.049444057047367096, -0.15911665558815002, 0.08999466896057129, 0.002795144449919462, 0.06451889872550964, 0.08279712498188019, 0.10205169767141342, -0.05350364372134209, 0.022419782355427742, 0.06547050178050995, -0.037753183394670486, 0.029369160532951355, 0.048863910138607025, -0.14785440266132355, 0.09742966294288635, 0.02489398792386055, 0.003957407083362341, 0.05544322729110718, 0.0353315994143486, -0.17265482246875763, 0.00760524719953537, -0.002817668719217181, -0.011174566112458706, 0.047039665281772614, 0.0288547333329916, -0.018101047724485397, 0.01431046612560749, 0.04412689432501793, -0.04577603191137314, 0.07105414569377899, -0.05145695060491562, -0.25762277841567993, -0.07216715067625046, 0.05693427845835686, 0.030611375346779823, 0.10209516435861588, -0.014701001346111298, 0.13224610686302185, -0.07990594208240509, 0.07320904731750488, 0.1432257741689682, -0.3390350341796875, 0.044121693819761276, -0.017091775313019753, 0.06329710781574249, 0.009207116439938545, -0.030782422050833702, 0.058303333818912506, 0.050519347190856934, 0.013014914467930794, -0.0026557394303381443, -0.052297793328762054, -0.2200537621974945, 0.01730315387248993, -0.0864490270614624, -0.0748872235417366, 0.2716928720474243, 0.023273084312677383, 0.018356772139668465, -0.040409497916698456, -0.030155334621667862, 0.014181530103087425, 0.0008410461014136672, 0.024799060076475143, 0.01736399717628956, 0.0786280483007431, 0.013372284360229969, -0.027359578758478165, -0.13272760808467865, -0.05628407001495361, -0.16815879940986633, 0.09077021479606628, -0.013874362222850323, 0.07486177980899811, -0.11643673479557037, 0.0003201141953468323, 0.01099633052945137, -0.10398745536804199, -0.012167142704129219, -0.0011326730018481612, 0.0598544105887413, 0.06779151409864426, -0.07783888280391693, 0.02849821373820305, 0.1320870816707611, 0.09028875827789307, 0.05587242171168327, -0.009019032120704651, -0.05909990146756172, 0.11447012424468994, -0.014247244223952293, 0.1039004921913147, -0.05921937897801399, -0.03422369807958603, 0.061303991824388504, -0.031740281730890274, 0.08320464193820953, -0.035440027713775635, -0.10796894133090973, -0.06432966887950897, -0.008490202948451042, 0.0703185498714447, 0.09746936708688736, -0.007583222351968288, -0.03703030198812485, 0.03172338753938675, 0.06128501147031784, -0.10646257549524307, 0.0086351428180933, 0.02958938479423523, 0.033065471798181534, 0.07862884551286697, 0.06432363390922546, 0.06294412910938263, -0.0758373811841011, -0.0013617489021271467, -0.0067804716527462006, 0.00951804593205452, 0.04542919248342514, -0.03172222524881363, 0.0764421820640564, -0.08395248651504517, 0.026199575513601303, -0.15380896627902985, -0.05034957081079483, -0.012536101043224335, 0.0034528523683547974, 0.010425928980112076, -0.09162457287311554, 0.0028572052251547575, -0.03969154506921768, 0.057924188673496246, -0.12600907683372498, 0.03585868328809738, -0.08152501285076141, 0.07136017829179764, -0.010476283729076385, 0.10026626288890839, -0.15309496223926544, 0.08355109393596649, -0.08647989481687546, -0.022815030068159103, 0.0675787627696991, 0.04336962103843689, -0.0919966921210289, 0.057272087782621384, -0.11091525852680206, -0.03527604416012764, -0.09073374420404434, 
0.02814396657049656, 0.005842145066708326, 0.06737053394317627, -0.15798726677894592, -0.08076359331607819, 0.12915585935115814, -0.09957504272460938, -0.13073711097240448, 0.1010468453168869, 0.058331407606601715, -0.02128380909562111, 0.055562544614076614, 0.28771355748176575, -0.021148208528757095, -0.1388430893421173, -0.03395995497703552, 0.1029873639345169, -0.052411507815122604, -0.14448663592338562, 0.09518440812826157, -0.09961197525262833, 0.02145468071103096, 0.018159525468945503, -0.006607315503060818, 0.07330656051635742, 0.011431034654378891, -0.0899837389588356, -0.06640037894248962, -0.10090476274490356, -0.012324218638241291, -0.039591334760189056, 0.022243108600378036, -0.02645927295088768, -0.011822833679616451, -0.03628946468234062, 0.06377606093883514, 0.009522393345832825, 0.08157552033662796, -0.0869649276137352, 0.11868607997894287, -0.06463531404733658, 0.005140712019056082, -0.1519060730934143, 0.17531999945640564, -0.04921244829893112, 0.0052423495799303055, 0.0941561833024025, 0.054901767522096634, 0.04544590786099434, -0.09406093508005142, 0.013908451423048973, -0.02165752276778221, 0.09541476517915726, 0.06765573471784592, 0.007710847072303295, -0.14078077673912048, 0.026675347238779068, -0.06071830540895462, 0.007989251986145973, 0.031323861330747604, -0.04417075589299202, 0.04754607379436493, 0.07911910116672516, -0.03417350724339485, 0.03872828930616379, 0.025989491492509842, -0.011271284893155098, -0.020435074344277382, 0.020912136882543564, 0.07076065242290497, 0.02472178265452385, -0.03145839646458626, 0.26385289430618286, -0.10756697505712509, 0.19396838545799255, 0.22948917746543884, -0.14531518518924713, 0.0990350991487503, 0.08271016925573349, -0.0002953653165604919, -0.01333070732653141, 0.03624916821718216, -0.039925213903188705, 0.13298766314983368, -0.03792209550738335, 0.11532457917928696, -0.06876412779092789, 0.01820521056652069, 0.011996091343462467, -0.06622954457998276, -0.02544626034796238, 0.055297933518886566, 0.018621396273374557, -0.07755988836288452, 0.11613249778747559, 0.2568173408508301, -0.09499268233776093, 0.10733721405267715, -0.05514981597661972, -0.06111852452158928, 0.046106815338134766, -0.0374140702188015, -0.04413928464055061, 0.06564967334270477, -0.2014637291431427, -0.03994603455066681, 0.10624387115240097, 0.007011900190263987, 0.06016042083501816, -0.1478361338376999, -0.0022394831757992506, -0.003562869969755411, -0.053370993584394455, -0.13948273658752441, 0.06835708767175674, -0.018191419541835785, 0.08080609887838364, -0.05655783414840698, -0.14737485349178314, 0.05610693618655205, -0.03516218811273575, -0.08612535148859024, 0.056606657803058624, -0.12798240780830383, -0.20602595806121826, -0.12724177539348602, -0.07585933059453964, -0.02147466130554676, 0.03560711443424225, 0.1436983346939087, -0.08379149436950684, -0.03503967076539993, -0.013379961252212524, -0.03169768303632736, -0.0451628752052784, 0.017821121960878372, 0.05643636733293533, 0.02686707302927971, 0.055539339780807495, -0.14683258533477783, -0.009811725467443466, -0.02621401473879814, 0.026496747508645058, 0.02573445811867714, 0.04753714054822922, 0.06756427884101868, 0.14153870940208435, 0.06821519136428833, 0.015681376680731773, -0.0030981977470219135, 0.14580793678760529, -0.05012575909495354, -0.052492715418338776, 0.19736377894878387, -0.03338145464658737, 0.009856514632701874, 0.17461936175823212, 0.02611767128109932, -0.028166750445961952, -0.031341999769210815, -0.05969187617301941, -0.06712232530117035, -0.25024789571762085, 
-0.12459119409322739, -0.11930717527866364, -0.024320801720023155, 0.01751859113574028, 0.0668352022767067, 0.05691314861178398, 0.000898292288184166, 0.004624874331057072, -0.052402812987565994, -0.008753135800361633, 0.0014089252799749374, 0.26862654089927673, -0.061555564403533936, 0.1176326721906662, -0.09894983470439911, -0.034331392496824265, 0.07249965518712997, 0.11294122040271759, 0.10237334668636322, 0.14264152944087982, 0.07756362855434418, 0.05891920253634453, 0.16968970000743866, 0.08515870571136475, 0.07707792520523071, 0.045121289789676666, 0.010955139063298702, -0.00610383041203022, -0.06979866325855255, -0.023713629692792892, 0.09152219444513321, 0.1673702895641327, -0.08483020961284637, 0.0015021037543192506, -0.13592632114887238, 0.03148533031344414, 0.18981629610061646, 0.10405566543340683, -0.15648230910301208, 0.022416183724999428, 0.04631445184350014, -0.05387327820062637, -0.015669062733650208, 0.105410136282444, 0.008852957747876644, -0.025443676859140396, 0.07475367933511734, 0.06437790393829346, 0.07649768888950348, -0.009170031175017357, 0.03851665183901787, -0.08994273841381073, -0.11395339667797089, 0.07348756492137909, 0.0658375546336174, -0.22378882765769958, 0.24500946700572968, 0.006626901216804981, 0.013997847214341164, -0.02301911823451519, 0.021660486236214638, 0.05611328408122063, 0.04182358831167221, 0.14959287643432617, 0.019275132566690445, -0.11388558149337769, -0.0074242837727069855, -0.0835288017988205, 0.049700818955898285, 0.03874153643846512, 0.10009918361902237, -0.06244690343737602, -0.03123987838625908, -0.02562684752047062, 0.03437734395265579, 0.041133176535367966, -0.12852256000041962, -0.11029914766550064, 0.028515830636024475, 0.29357442259788513, 0.05714000388979912, -0.0221712626516819, -0.06281837075948715, -0.1923389732837677, 0.09216322749853134, -0.11236335337162018, 0.0068389312364161015, -0.04501509666442871, -0.15536777675151825, 0.1287354826927185, -0.025503605604171753, 0.06798611581325531, -0.03215988725423813, -0.020116636529564857, -0.053560979664325714, -0.12366269528865814, 0.11817225068807602, -0.1126284971833229, -0.03384881466627121, -0.0242146048694849, 0.19221381843090057, -0.06591518223285675, 0.0684116929769516, 0.054956477135419846, 0.06948493421077728, -0.10741118341684341, -0.05965052917599678, 0.1065148264169693, 0.05820407718420029, -0.021835092455148697, 0.0040661646053195, -0.013129545375704765, -0.2308400571346283, -0.0009764150017872453, -0.03979942575097084, 0.24392247200012207, 0.14447763562202454, -0.08677004277706146, 0.19062542915344238, 0.2405794858932495, -0.05015124753117561, -0.288155734539032, -0.15603500604629517, -0.07128474861383438, -0.016501955687999725, -0.018756791949272156, -0.12995031476020813, 0.1058291643857956, -0.04546710476279259, -0.12791872024536133, 0.042498111724853516, -0.18477177619934082, -0.09570665657520294, 0.2626623511314392, -0.12095008790493011, 0.26341062784194946, -0.10570019483566284, -0.09391184896230698, -0.06615190207958221, -0.16527070105075836, 0.08713856339454651, -0.13258537650108337, 0.08989473432302475, 0.02021319791674614, 0.04872480407357216, 0.0007439170149154961, -0.04240642115473747, 0.13192956149578094, 0.060285668820142746, -0.055597465485334396, -0.05318157747387886, 0.006490187253803015, 0.041833434253931046, -0.0020345733501017094, 0.0967252105474472, -0.1095050796866417, 0.0389743335545063, -0.07858535647392273, -0.019667766988277435, -0.10807265341281891, 0.11422501504421234, 0.05707850307226181, -0.009999859146773815, -0.010260041803121567, 
-0.08952359110116959, 0.02055281586945057, 0.017618846148252487, 0.18367967009544373, -0.056280456483364105, 0.015153041109442711, 0.2285742312669754, 0.10206691175699234, -0.14195558428764343, -0.06103597208857536, -0.05427734926342964, -0.09227066487073898, 0.11176091432571411, -0.11726263910531998, 0.06546646356582642, 0.058008480817079544, 0.029820185154676437, 0.039045996963977814, 0.04857202619314194, -0.03885843604803085, 0.011684628203511238, 0.10692574828863144, -0.08288035541772842, -0.02551070787012577, 0.022836066782474518, 0.09447266161441803, 0.0584489107131958, 0.061443667858839035, 0.1448366492986679, -0.023196211084723473, -0.010514031164348125, -0.0028641410171985626, 0.014728417620062828, -0.1514120101928711, 0.10884292423725128, 0.08140068501234055, 0.01704041287302971, -0.15500158071517944, 0.12318743765354156, -0.00036781132803298533, -0.13231833279132843, 0.027531633153557777, -0.01837850734591484, -0.06098973751068115, -0.12566518783569336, -0.0996357724070549, -0.06761913001537323, -0.050855278968811035, -0.12208344787359238, 0.018094586208462715, -0.12018539756536484, 0.04628979414701462, 0.08565859496593475, 0.059030164033174515, 0.056017208844423294, -0.05566652491688728, -0.07753753662109375, 0.040527574717998505, -0.007607026491314173, -0.047830600291490555, 0.0009603590006008744, -0.12270326912403107, -0.04193549230694771, 0.01989758014678955, 0.06489211320877075, -0.056531019508838654, -0.0377398356795311, -0.042342476546764374, 0.06093054264783859, -0.10718388855457306, 0.0022626928985118866, -0.08609139919281006, -0.002249597804620862, 0.025792663916945457, -0.10852034389972687, -0.04830649495124817, 0.06483305990695953, -0.119432732462883, -0.025855978950858116, -0.010862807743251324, 0.09360789507627487, -0.16122406721115112, -0.020613670349121094, 0.0418262779712677, -0.009019234217703342, 0.10003948211669922, 0.15712280571460724, -0.11873672157526016, 0.07185762375593185, -0.19521549344062805, -0.19351205229759216, 0.14892923831939697, 0.045331597328186035, -0.0011566251050680876, -0.07064782828092575, -0.016271810978651047, 0.15292276442050934, 0.031501710414886475, -0.002768203616142273, 0.09689521789550781, -0.08849116414785385, -0.06091146916151047, -0.09472817182540894, -0.06176771968603134, 0.0035032187588512897, -0.08065827935934067, 0.18345129489898682, 0.0685255378484726, 0.14809928834438324, -0.020702078938484192, -0.01580902561545372, -0.059489656239748, 0.05996545031666756, -0.08404675871133804, -0.12258078902959824, -0.16625109314918518, -0.014691946096718311, 0.004062638618052006, -0.046097178012132645, 0.2269514799118042, -0.024879178032279015, -0.1080918163061142, 0.04905790835618973, 0.06001743674278259, -0.07561707496643066, 0.01733982190489769, 0.29596665501594543, 0.06709415465593338, -0.012063111178576946, -0.052252061665058136, -0.020856309682130814, 0.023740749806165695, 0.07172701507806778, -0.04722123220562935, 0.16301648318767548, 0.14402350783348083, 0.1223037838935852, 0.11808192729949951, -0.05467263609170914, -0.10923433303833008, -0.027591681107878685, -0.03113570809364319, 0.09455406665802002, -0.04472305625677109, 0.12024714052677155, 0.14756134152412415, 0.011035610921680927, 0.028815587982535362, -0.05619584396481514, 0.016335057094693184, -0.16415885090827942, -0.0799354836344719, -0.05908392742276192, -0.11517326533794403, 0.0029218331910669804, -0.025620128959417343, 0.08531723916530609, 0.09259889274835587, 0.02415211871266365, -0.003429254051297903, 0.01265679020434618, -0.038617394864559174, 
-0.07493121922016144, 0.05600728467106819, -0.043665044009685516, -0.020522067323327065, -0.06013600900769234, -0.01963984966278076, 0.07344522327184677, -0.002684847917407751, -0.009482142515480518, 0.03013947606086731, -0.07815112173557281, 0.04490063339471817, -0.1189085990190506, -0.0640525370836258, -0.047719020396471024, 0.019415447488427162, 0.04844716936349869, 0.1841164529323578, 0.079006128013134, -0.02847927249968052, 0.07881636917591095, 0.16146326065063477, -0.11024743318557739, -0.15308408439159393, -0.04927733540534973, 0.08592965453863144, -0.02038010023534298, 0.06349440664052963, -0.04319922626018524, -0.027494359761476517, -0.05799300596117973, 0.21367374062538147, 0.24306395649909973, -0.03660636395215988, 0.04990656301379204, -0.045435938984155655, 0.02374555729329586, -0.03336381912231445, -0.006033094134181738, 0.17085473239421844, 0.20701441168785095, -0.03249066323041916, -0.046409159898757935, -0.0732865184545517, -0.019643018022179604, -0.0870482549071312, 0.06688054651021957, -0.0776655524969101, -0.14246800541877747, -0.006060858257114887, 0.08427506685256958, -0.11581675708293915, 0.01914309337735176, -0.09930620342493057, -0.1319868266582489, -0.04927244037389755, 0.018824540078639984, 0.16802754998207092, 0.13096901774406433, 0.005808170884847641, -0.06445977091789246, -0.03605683520436287, 0.058460235595703125, -0.022534051910042763, -0.19930334389209747, 0.013862207531929016, 0.02573384717106819, -0.11427724361419678, 0.0631570890545845, 0.00648462912067771, 0.1316777914762497, 0.024186689406633377, 0.14469268918037415, -0.07243670523166656, 0.12983660399913788, 0.008234219625592232, -0.1387968808412552, 0.02196369878947735, 0.03483235836029053, -0.013028624467551708, 0.035003092139959335, 0.05194923281669617, -0.03549906983971596, 0.0681058019399643, 0.00924447737634182, -0.04179348796606064, -0.07899661362171173, -0.0069004506804049015, -0.05347486585378647, 0.0653088167309761, -0.05860085412859917, -0.049413394182920456, -0.06033697724342346, -0.021342147141695023, -0.012105057016015053, 0.04451147839426994, -0.16526462137699127, -0.08200758695602417, -0.0692376047372818, -0.004602779634296894, -0.05159304291009903, 0.029058236628770828, -0.1045326367020607, -0.039000254124403, -0.07235310226678848, -0.015421455726027489, -0.05830548331141472, -0.011181655339896679, 0.08400597423315048, -0.03207038342952728, 0.016089536249637604, -0.05547618865966797, 0.10587799549102783, 0.06452041864395142, -0.12105713039636612, -0.09595273435115814 ]
null
null
transformers
0 Tony Stark DialoGPT Model
{"tags": ["conversational"]}
text-generation
P4RZ1V4L/DialoGPT-Medium-Tony
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
0 Tony Stark DialoGPT Model
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.009697278961539268, 0.03208012506365776, -0.007204889785498381, 0.004809224978089333, 0.16726240515708923, 0.014898733235895634, 0.09765533357858658, 0.13672804832458496, -0.007841327227652073, -0.031050153076648712, 0.14490588009357452, 0.20411323010921478, -0.006439372431486845, 0.0661218985915184, -0.07572533935308456, -0.2683109939098358, 0.05759621039032936, 0.046649303287267685, 0.016515716910362244, 0.1200079694390297, 0.08573378622531891, -0.05473608896136284, 0.08714032918214798, -0.014583407901227474, -0.150366872549057, 0.017733458429574966, 0.043394338339567184, -0.12260226160287857, 0.11910516023635864, 0.05462685227394104, 0.07063519209623337, 0.014929565601050854, -0.07541623711585999, -0.1631229966878891, 0.03031250834465027, 0.01425902172923088, -0.0594632662832737, 0.04757995903491974, 0.059961482882499695, -0.10165371745824814, 0.10819483548402786, 0.09530027210712433, -0.013078106567263603, 0.06798283755779266, -0.16849711537361145, -0.020869607105851173, -0.01446688175201416, 0.009899779222905636, 0.05550243332982063, 0.09964893013238907, -0.03413357585668564, 0.10497362166643143, -0.09214533120393753, 0.11017382889986038, 0.10932035744190216, -0.32057443261146545, -0.005767723545432091, 0.09167823940515518, 0.039358653128147125, 0.07352814823389053, -0.04467793554067612, 0.06258884817361832, 0.018015462905168533, 0.017986174672842026, -0.014015024527907372, -0.07283061742782593, -0.11612214148044586, 0.04717336222529411, -0.08668071031570435, -0.059868961572647095, 0.2244078367948532, -0.05464440956711769, 0.06881742179393768, -0.05281897634267807, -0.10522868484258652, -0.04308144748210907, -0.029833965003490448, 0.00475557055324316, -0.07660607248544693, 0.08692064881324768, 0.00869679357856512, -0.09547875821590424, -0.1376667022705078, -0.02496783249080181, -0.1776352822780609, 0.16140350699424744, 0.02465328387916088, 0.05232657864689827, -0.2027255892753601, 0.09623090922832489, 0.017906051129102707, -0.08045592904090881, 0.022091427817940712, -0.10046248883008957, 0.029131146147847176, 0.013760408386588097, -0.04754498973488808, -0.061387211084365845, 0.0843690037727356, 0.11199145019054413, -0.01731434464454651, 0.025486016646027565, -0.039331406354904175, 0.08100687712430954, 0.03553595021367073, 0.09077847748994827, 0.007288969587534666, -0.028338588774204254, 0.025842782109975815, -0.13719046115875244, -0.003647835226729512, -0.07116208970546722, -0.16572439670562744, -0.021088803187012672, 0.02994808368384838, 0.08289173990488052, 0.015449047088623047, 0.11682453751564026, -0.03272046521306038, -0.025152435526251793, 0.03602350503206253, -0.047656361013650894, -0.012649794109165668, 0.016648368909955025, 0.013163427822291851, 0.12399329990148544, -0.0022096503525972366, 0.03235051408410072, -0.13653022050857544, 0.031423524022102356, -0.06793295592069626, -0.003740974934771657, -0.03486552834510803, -0.040637075901031494, 0.009043924510478973, -0.06862333416938782, 0.003486064961180091, -0.15030112862586975, -0.15063877403736115, 0.007587034720927477, -0.007836631499230862, -0.04107699543237686, -0.06370922178030014, -0.06952770054340363, -0.013550350442528725, 0.04251532256603241, -0.07093454152345657, -0.011352915316820145, -0.06403283774852753, 0.11004766076803207, -0.03197755664587021, 0.07921615242958069, -0.11953279376029968, 0.08390819281339645, -0.11260783672332764, -0.02386913076043129, -0.060801517218351364, 0.09317506104707718, -0.0006014376995153725, 0.09549830108880997, -0.006563255097717047, -0.017931854352355003, -0.07981178909540176, 
0.06445012241601944, -0.042872510850429535, 0.21701598167419434, -0.0615808479487896, -0.11181682348251343, 0.28781595826148987, -0.052628401666879654, -0.1370542049407959, 0.11647392809391022, 0.008682746440172195, 0.05777018144726753, 0.10703510791063309, 0.19733482599258423, -0.015276194550096989, 0.004040541127324104, 0.09471915662288666, 0.11263324320316315, -0.11276852339506149, -0.033160366117954254, 0.013019153848290443, -0.04081077128648758, -0.10867965966463089, 0.04689536616206169, 0.09810488671064377, 0.07090286910533905, -0.04786505550146103, -0.03377414867281914, -0.01366397924721241, 0.0052589005790650845, 0.08885077387094498, -0.007157256826758385, 0.10962837189435959, -0.05819983780384064, -0.03796621412038803, -0.029282379895448685, -0.012126247398555279, -0.03951939567923546, 0.03137664496898651, -0.043376367539167404, 0.10821941494941711, -0.011204327456653118, 0.06364280730485916, -0.16185984015464783, -0.07691477984189987, -0.017002692446112633, 0.1581239402294159, 0.024538565427064896, 0.09859629720449448, 0.0552486926317215, -0.040398042649030685, -0.0012767292791977525, 0.012792680412530899, 0.15581141412258148, -0.022091681137681007, -0.065607450902462, -0.052166227251291275, 0.08642971515655518, -0.05641226842999458, 0.04504093527793884, -0.05937713757157326, 0.012367865070700645, 0.05064384639263153, 0.10342344641685486, -0.00018274025933351368, 0.03323284164071083, -0.008164864964783192, 0.002145637758076191, -0.058205123990774155, 0.007405933458358049, 0.10799351334571838, 0.00036868182360194623, -0.07365862280130386, 0.22074243426322937, -0.17796069383621216, 0.1765957772731781, 0.1893044263124466, -0.299345999956131, 0.017949223518371582, -0.10759581625461578, -0.04561871662735939, 0.014407722279429436, 0.05567655712366104, -0.0454222597181797, 0.1703362911939621, -0.009871348738670349, 0.18874616920948029, -0.04946064203977585, -0.04464937001466751, -0.0200483538210392, -0.05118836089968681, -0.0024189651012420654, 0.07781197130680084, 0.10685696452856064, -0.13992026448249817, 0.1964332014322281, 0.1621224284172058, 0.048237916082143784, 0.19945049285888672, 0.015346456319093704, -0.011589210480451584, 0.0909530371427536, 0.005220826715230942, -0.058739423751831055, -0.07409929484128952, -0.2594851851463318, -0.030033592134714127, 0.07992640137672424, 0.0422382652759552, 0.1212305948138237, -0.11349532753229141, -0.038956157863140106, -0.01763172075152397, -0.023146281018853188, 0.021672505885362625, 0.0914369598031044, 0.06075398623943329, 0.13201528787612915, -0.001710098935291171, -0.007300339173525572, 0.10524573177099228, 0.01783694699406624, -0.09354141354560852, 0.18308524787425995, -0.13652534782886505, -0.37097251415252686, -0.13911493122577667, -0.18057456612586975, -0.05449081212282181, 0.05712554603815079, 0.11679314076900482, -0.12011238187551498, -0.018752124160528183, 0.01578843593597412, 0.10931742936372757, -0.08449502289295197, 0.0021454424131661654, -0.06880278885364532, 0.0321490578353405, -0.10310184955596924, -0.09194442629814148, -0.055416494607925415, -0.031392451375722885, -0.08001253753900528, 0.1423761546611786, -0.10777941346168518, 0.04476889222860336, 0.20262959599494934, 0.04653622955083847, 0.05625178664922714, -0.044105201959609985, 0.19377262890338898, -0.11264272034168243, -0.01661740615963936, 0.19215328991413116, -0.048360925167798996, 0.07476246356964111, 0.1232115849852562, -0.006348740309476852, -0.08765771239995956, 0.03011748194694519, -0.02085109055042267, -0.07988511025905609, -0.23219464719295502, 
-0.13938382267951965, -0.12429051846265793, 0.09477275609970093, 0.028005298227071762, 0.056365787982940674, 0.17219258844852448, 0.06577219814062119, -0.038416244089603424, 0.006410336587578058, 0.02959546446800232, 0.08237514644861221, 0.23417828977108002, -0.06035616248846054, 0.1364797055721283, -0.03420931473374367, -0.14982740581035614, 0.08169995993375778, 0.0713929831981659, 0.10213395953178406, 0.06678459793329239, 0.0804823637008667, 0.0149586396291852, 0.06188136339187622, 0.1311223804950714, 0.08191446959972382, 0.019586285576224327, -0.02480296604335308, -0.03388110175728798, -0.025523077696561813, -0.05937909707427025, 0.040128443390131, 0.06589099019765854, -0.16763372719287872, -0.039227183908224106, -0.09338314831256866, 0.09657008945941925, 0.0873042419552803, 0.06609832495450974, -0.1842060089111328, -0.008006223477423191, 0.08488986641168594, -0.03854905813932419, -0.13727426528930664, 0.09535189718008041, 0.01523482333868742, -0.15144726634025574, 0.03139317408204079, -0.04061909019947052, 0.12188644707202911, -0.07804752141237259, 0.09809603542089462, -0.08108244836330414, -0.07448557764291763, 0.02123199962079525, 0.1261177361011505, -0.30527687072753906, 0.20240111649036407, -0.0024993624538183212, -0.06486981362104416, -0.1243603527545929, -0.0032166161108762026, 0.002410882618278265, 0.07357452809810638, 0.10519039630889893, -0.007196315098553896, 0.001897757756523788, -0.06300821900367737, -0.01829923689365387, 0.032471053302288055, 0.13080233335494995, -0.0401318334043026, -0.021158374845981598, -0.050194524228572845, -0.001653497340157628, -0.03173094615340233, -0.06934895366430283, 0.02002747356891632, -0.19509181380271912, 0.08751901984214783, 0.04166261479258537, 0.09648149460554123, 0.029994789510965347, 0.004265148192644119, -0.09651939570903778, 0.24698667228221893, -0.07148019969463348, -0.10072879493236542, -0.10919588059186935, -0.046813901513814926, 0.03569883480668068, -0.05628936365246773, 0.04309194162487984, -0.0788632407784462, 0.028997479006648064, -0.06352769583463669, -0.19235502183437347, 0.12410202622413635, -0.09027006477117538, -0.04412810131907463, -0.02371402643620968, 0.2110891044139862, -0.05598580464720726, 0.010335659608244896, 0.02930437959730625, 0.01208863127976656, -0.11645778268575668, -0.09678568691015244, 0.031018631532788277, -0.007351789623498917, 0.050603240728378296, 0.041841957718133926, -0.05915454775094986, -0.017138581722974777, -0.052199993282556534, -0.022926922887563705, 0.3496883809566498, 0.14231905341148376, -0.043836336582899094, 0.19347235560417175, 0.12347975373268127, -0.07452994585037231, -0.3159443140029907, -0.1066238060593605, -0.10937739163637161, -0.04680149629712105, -0.07012093812227249, -0.2002030611038208, 0.06474938243627548, 0.00662544509395957, -0.013415241613984108, 0.12749312818050385, -0.2561831772327423, -0.07571036368608475, 0.15906259417533875, -0.017980827018618584, 0.3745945692062378, -0.1168576180934906, -0.10926306992769241, -0.03950892388820648, -0.14175476133823395, 0.16968177258968353, -0.01989765651524067, 0.11221715062856674, -0.009765521623194218, 0.14388824999332428, 0.05548359826207161, -0.023479344323277473, 0.08544106781482697, 0.004999885335564613, -0.03290518373250961, -0.10304180532693863, -0.05676887184381485, 0.007092386484146118, 0.02477436140179634, 0.018026655539870262, -0.041834570467472076, 0.02227151393890381, -0.11731979995965958, -0.04657655209302902, -0.08982590585947037, 0.04431166127324104, 0.03899754583835602, -0.07325074821710587, -0.002380647463724017, 
-0.07165111601352692, -0.012272949330508709, 0.022334342822432518, 0.20356793701648712, -0.08029330521821976, 0.16448934376239777, 0.09239562600851059, 0.12419285625219345, -0.14376309514045715, -0.00019283240544609725, -0.0762530043721199, -0.05611240118741989, 0.07737895101308823, -0.09433035552501678, 0.058893077075481415, 0.10901971161365509, -0.04567738622426987, 0.08828683942556381, 0.10377411544322968, 0.008936077356338501, 0.003213887568563223, 0.10916902124881744, -0.2667325437068939, -0.0296600554138422, -0.07532413303852081, 0.000883326749317348, 0.09092561900615692, 0.08562852442264557, 0.18840822577476501, 0.025361526757478714, -0.04293036088347435, -0.002770674182102084, 0.028597986325621605, -0.039021048694849014, 0.051667019724845886, 0.001123449532315135, 0.01947369985282421, -0.1530752182006836, 0.072522833943367, 0.01490565575659275, -0.15215420722961426, 0.021316176280379295, 0.16572684049606323, -0.11656328290700912, -0.1283872276544571, -0.06520111113786697, 0.08313824236392975, -0.11755692958831787, -0.01578943058848381, -0.03279297426342964, -0.13145680725574493, 0.07992171496152878, 0.12629036605358124, 0.05557859688997269, 0.0972496047616005, -0.06061713397502899, -0.020469192415475845, -0.018721895292401314, -0.014099318534135818, -0.012384648434817791, -0.007667020428925753, -0.055978111922740936, 0.0590752474963665, -0.026677248999476433, 0.1425808072090149, -0.09221141785383224, -0.1037059873342514, -0.16142144799232483, 0.0374140702188015, -0.11013076454401016, -0.08825794607400894, -0.08821134269237518, -0.050188567489385605, 0.002360827289521694, -0.019856395199894905, -0.04037635400891304, -0.05829505994915962, -0.12300454825162888, 0.0338277705013752, -0.040771447122097015, 0.024727050215005875, -0.07512269169092178, 0.015856385231018066, 0.08507686108350754, -0.03285100311040878, 0.15655414760112762, 0.1450488418340683, -0.1006515845656395, 0.10741901397705078, -0.14806775748729706, -0.09138492494821548, 0.11116421222686768, 0.015329592861235142, 0.0449691042304039, 0.09723787009716034, 0.013362943194806576, 0.0635865181684494, 0.032776717096567154, 0.05308786407113075, 0.027619892731308937, -0.11959987878799438, 0.06483134627342224, -0.03626115620136261, -0.14700546860694885, -0.049338050186634064, -0.05282869189977646, 0.01647452637553215, 0.013054544106125832, 0.09622690081596375, -0.05301849544048309, 0.10698331147432327, -0.04055701196193695, 0.0346808135509491, 0.017554637044668198, -0.1730053424835205, -0.03816922754049301, -0.08538098633289337, 0.03681723028421402, 0.014741539023816586, 0.25266793370246887, 0.030072299763560295, 0.012416383251547813, 0.032671261578798294, 0.08285367488861084, 0.03899408504366875, 0.010228337720036507, 0.17482228577136993, 0.1162426546216011, -0.06621865928173065, -0.10445023328065872, 0.0729617029428482, 0.016332454979419708, 0.01286179106682539, 0.13617953658103943, 0.008365051820874214, 0.005795429926365614, 0.08649782836437225, -0.016865963116288185, 0.009968153201043606, -0.10052056610584259, -0.13426925241947174, -0.022176474332809448, 0.05151832848787308, -0.04655967652797699, 0.11727844923734665, 0.1406494379043579, -0.01806013658642769, 0.03222079202532768, -0.021771740168333054, -0.05699979141354561, -0.1683429479598999, -0.1429590880870819, -0.06883849948644638, -0.13416796922683716, 0.00897989235818386, -0.11180389672517776, 0.05395037308335304, 0.06001098081469536, 0.06750501692295074, -0.06899319589138031, 0.10220931470394135, 0.04626858979463577, -0.11440542340278625, 0.06264589726924896, 
-0.0296088308095932, 0.09430401772260666, -0.02759445086121559, -0.019505485892295837, -0.09039592742919922, 0.014574515633285046, 0.011419114656746387, 0.06245238706469536, -0.04707273095846176, 0.007463190704584122, -0.14696238934993744, -0.08972041308879852, -0.0523175448179245, 0.0718572810292244, -0.050409089773893356, 0.14282815158367157, 0.00775480642914772, -0.0170906875282526, 0.039554283022880554, 0.22787313163280487, -0.07476283609867096, -0.04778539761900902, -0.05269690603017807, 0.20717895030975342, 0.02975541539490223, 0.1171872541308403, -0.022938819602131844, -0.006106364540755749, -0.0919521227478981, 0.3764844834804535, 0.30030161142349243, -0.09031439572572708, 0.011794124729931355, 0.02137952297925949, 0.04502861574292183, 0.1316293478012085, 0.1216534823179245, 0.10318691283464432, 0.3006802201271057, -0.07452366501092911, -0.04653361067175865, -0.012629742734134197, -0.023858042433857918, -0.09059546142816544, 0.1021224707365036, 0.04839762672781944, -0.06382183730602264, -0.03313443064689636, 0.0954432487487793, -0.25862133502960205, 0.1277991235256195, -0.12311873584985733, -0.17578600347042084, -0.06654827296733856, 0.009760108776390553, 0.10465722531080246, 0.015642458572983742, 0.0946015790104866, 0.007128213066607714, -0.11252258718013763, 0.06305865943431854, 0.03397420793771744, -0.22762253880500793, 0.0006893770187161863, 0.06642123311758041, -0.07006710022687912, -0.0024247700348496437, -0.026499588042497635, 0.05657242611050606, 0.0656052976846695, 0.054629553109407425, -0.00971333310008049, 0.03816632181406021, 0.0034184439573436975, -0.0585215799510479, 0.016623929142951965, 0.05121519789099693, 0.02472509816288948, -0.09763528406620026, 0.06927435845136642, -0.1574270874261856, 0.04766253009438515, -0.0030655991286039352, -0.04124255105853081, 0.006064958870410919, 0.008823691867291927, -0.06491616368293762, 0.05165379121899605, 0.07916834205389023, -0.0016257909592241049, -0.0062433634884655476, -0.057178743183612823, -0.02632102556526661, -0.027755750343203545, -0.09291748702526093, -0.10495562851428986, -0.14682936668395996, -0.11640441417694092, 0.09368976950645447, -0.01011267676949501, -0.1848134547472, 0.022154374048113823, -0.08606051653623581, 0.08319322764873505, -0.1670055389404297, 0.08040720224380493, 0.07041648775339127, 0.013038921169936657, -0.0031511052511632442, -0.02002427540719509, 0.054132770746946335, 0.086809903383255, -0.10407156497240067, -0.07400695979595184 ]
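The Tony Stark record above is tagged as a conversational GPT-2 text-generation model but its card carries only a title, so the following is a minimal, hedged sketch of how such a checkpoint is typically queried with `transformers`; the prompt text and generation settings are illustrative assumptions, not part of the original card.

```python
# Minimal sketch: one chat turn with a DialoGPT-style conversational checkpoint.
# The repo id comes from the record above; the prompt and generation settings
# are assumptions made for illustration only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "P4RZ1V4L/DialoGPT-Medium-Tony"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# DialoGPT-style models expect the user turn to end with the EOS token.
input_ids = tokenizer.encode("Hello, who are you?" + tokenizer.eos_token, return_tensors="pt")

# Generate a reply, then drop the prompt tokens before decoding.
reply_ids = model.generate(input_ids, max_length=200, pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(reply_ids[0, input_ids.shape[-1]:], skip_special_tokens=True))
```

The same pattern should apply to the other conversational DialoGPT records in this dump, such as PVAbhiram2003/DialoGPT-medium-RickandMorty below.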
null
null
transformers
# Rick and Morty DialoGPT medium model
{"tags": ["conversational"]}
text-generation
PVAbhiram2003/DialoGPT-medium-RickandMorty
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Rick and Morty DialoGPT medium model
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.009697278961539268, 0.03208012506365776, -0.007204889785498381, 0.004809224978089333, 0.16726240515708923, 0.014898733235895634, 0.09765533357858658, 0.13672804832458496, -0.007841327227652073, -0.031050153076648712, 0.14490588009357452, 0.20411323010921478, -0.006439372431486845, 0.0661218985915184, -0.07572533935308456, -0.2683109939098358, 0.05759621039032936, 0.046649303287267685, 0.016515716910362244, 0.1200079694390297, 0.08573378622531891, -0.05473608896136284, 0.08714032918214798, -0.014583407901227474, -0.150366872549057, 0.017733458429574966, 0.043394338339567184, -0.12260226160287857, 0.11910516023635864, 0.05462685227394104, 0.07063519209623337, 0.014929565601050854, -0.07541623711585999, -0.1631229966878891, 0.03031250834465027, 0.01425902172923088, -0.0594632662832737, 0.04757995903491974, 0.059961482882499695, -0.10165371745824814, 0.10819483548402786, 0.09530027210712433, -0.013078106567263603, 0.06798283755779266, -0.16849711537361145, -0.020869607105851173, -0.01446688175201416, 0.009899779222905636, 0.05550243332982063, 0.09964893013238907, -0.03413357585668564, 0.10497362166643143, -0.09214533120393753, 0.11017382889986038, 0.10932035744190216, -0.32057443261146545, -0.005767723545432091, 0.09167823940515518, 0.039358653128147125, 0.07352814823389053, -0.04467793554067612, 0.06258884817361832, 0.018015462905168533, 0.017986174672842026, -0.014015024527907372, -0.07283061742782593, -0.11612214148044586, 0.04717336222529411, -0.08668071031570435, -0.059868961572647095, 0.2244078367948532, -0.05464440956711769, 0.06881742179393768, -0.05281897634267807, -0.10522868484258652, -0.04308144748210907, -0.029833965003490448, 0.00475557055324316, -0.07660607248544693, 0.08692064881324768, 0.00869679357856512, -0.09547875821590424, -0.1376667022705078, -0.02496783249080181, -0.1776352822780609, 0.16140350699424744, 0.02465328387916088, 0.05232657864689827, -0.2027255892753601, 0.09623090922832489, 0.017906051129102707, -0.08045592904090881, 0.022091427817940712, -0.10046248883008957, 0.029131146147847176, 0.013760408386588097, -0.04754498973488808, -0.061387211084365845, 0.0843690037727356, 0.11199145019054413, -0.01731434464454651, 0.025486016646027565, -0.039331406354904175, 0.08100687712430954, 0.03553595021367073, 0.09077847748994827, 0.007288969587534666, -0.028338588774204254, 0.025842782109975815, -0.13719046115875244, -0.003647835226729512, -0.07116208970546722, -0.16572439670562744, -0.021088803187012672, 0.02994808368384838, 0.08289173990488052, 0.015449047088623047, 0.11682453751564026, -0.03272046521306038, -0.025152435526251793, 0.03602350503206253, -0.047656361013650894, -0.012649794109165668, 0.016648368909955025, 0.013163427822291851, 0.12399329990148544, -0.0022096503525972366, 0.03235051408410072, -0.13653022050857544, 0.031423524022102356, -0.06793295592069626, -0.003740974934771657, -0.03486552834510803, -0.040637075901031494, 0.009043924510478973, -0.06862333416938782, 0.003486064961180091, -0.15030112862586975, -0.15063877403736115, 0.007587034720927477, -0.007836631499230862, -0.04107699543237686, -0.06370922178030014, -0.06952770054340363, -0.013550350442528725, 0.04251532256603241, -0.07093454152345657, -0.011352915316820145, -0.06403283774852753, 0.11004766076803207, -0.03197755664587021, 0.07921615242958069, -0.11953279376029968, 0.08390819281339645, -0.11260783672332764, -0.02386913076043129, -0.060801517218351364, 0.09317506104707718, -0.0006014376995153725, 0.09549830108880997, -0.006563255097717047, -0.017931854352355003, -0.07981178909540176, 
0.06445012241601944, -0.042872510850429535, 0.21701598167419434, -0.0615808479487896, -0.11181682348251343, 0.28781595826148987, -0.052628401666879654, -0.1370542049407959, 0.11647392809391022, 0.008682746440172195, 0.05777018144726753, 0.10703510791063309, 0.19733482599258423, -0.015276194550096989, 0.004040541127324104, 0.09471915662288666, 0.11263324320316315, -0.11276852339506149, -0.033160366117954254, 0.013019153848290443, -0.04081077128648758, -0.10867965966463089, 0.04689536616206169, 0.09810488671064377, 0.07090286910533905, -0.04786505550146103, -0.03377414867281914, -0.01366397924721241, 0.0052589005790650845, 0.08885077387094498, -0.007157256826758385, 0.10962837189435959, -0.05819983780384064, -0.03796621412038803, -0.029282379895448685, -0.012126247398555279, -0.03951939567923546, 0.03137664496898651, -0.043376367539167404, 0.10821941494941711, -0.011204327456653118, 0.06364280730485916, -0.16185984015464783, -0.07691477984189987, -0.017002692446112633, 0.1581239402294159, 0.024538565427064896, 0.09859629720449448, 0.0552486926317215, -0.040398042649030685, -0.0012767292791977525, 0.012792680412530899, 0.15581141412258148, -0.022091681137681007, -0.065607450902462, -0.052166227251291275, 0.08642971515655518, -0.05641226842999458, 0.04504093527793884, -0.05937713757157326, 0.012367865070700645, 0.05064384639263153, 0.10342344641685486, -0.00018274025933351368, 0.03323284164071083, -0.008164864964783192, 0.002145637758076191, -0.058205123990774155, 0.007405933458358049, 0.10799351334571838, 0.00036868182360194623, -0.07365862280130386, 0.22074243426322937, -0.17796069383621216, 0.1765957772731781, 0.1893044263124466, -0.299345999956131, 0.017949223518371582, -0.10759581625461578, -0.04561871662735939, 0.014407722279429436, 0.05567655712366104, -0.0454222597181797, 0.1703362911939621, -0.009871348738670349, 0.18874616920948029, -0.04946064203977585, -0.04464937001466751, -0.0200483538210392, -0.05118836089968681, -0.0024189651012420654, 0.07781197130680084, 0.10685696452856064, -0.13992026448249817, 0.1964332014322281, 0.1621224284172058, 0.048237916082143784, 0.19945049285888672, 0.015346456319093704, -0.011589210480451584, 0.0909530371427536, 0.005220826715230942, -0.058739423751831055, -0.07409929484128952, -0.2594851851463318, -0.030033592134714127, 0.07992640137672424, 0.0422382652759552, 0.1212305948138237, -0.11349532753229141, -0.038956157863140106, -0.01763172075152397, -0.023146281018853188, 0.021672505885362625, 0.0914369598031044, 0.06075398623943329, 0.13201528787612915, -0.001710098935291171, -0.007300339173525572, 0.10524573177099228, 0.01783694699406624, -0.09354141354560852, 0.18308524787425995, -0.13652534782886505, -0.37097251415252686, -0.13911493122577667, -0.18057456612586975, -0.05449081212282181, 0.05712554603815079, 0.11679314076900482, -0.12011238187551498, -0.018752124160528183, 0.01578843593597412, 0.10931742936372757, -0.08449502289295197, 0.0021454424131661654, -0.06880278885364532, 0.0321490578353405, -0.10310184955596924, -0.09194442629814148, -0.055416494607925415, -0.031392451375722885, -0.08001253753900528, 0.1423761546611786, -0.10777941346168518, 0.04476889222860336, 0.20262959599494934, 0.04653622955083847, 0.05625178664922714, -0.044105201959609985, 0.19377262890338898, -0.11264272034168243, -0.01661740615963936, 0.19215328991413116, -0.048360925167798996, 0.07476246356964111, 0.1232115849852562, -0.006348740309476852, -0.08765771239995956, 0.03011748194694519, -0.02085109055042267, -0.07988511025905609, -0.23219464719295502, 
-0.13938382267951965, -0.12429051846265793, 0.09477275609970093, 0.028005298227071762, 0.056365787982940674, 0.17219258844852448, 0.06577219814062119, -0.038416244089603424, 0.006410336587578058, 0.02959546446800232, 0.08237514644861221, 0.23417828977108002, -0.06035616248846054, 0.1364797055721283, -0.03420931473374367, -0.14982740581035614, 0.08169995993375778, 0.0713929831981659, 0.10213395953178406, 0.06678459793329239, 0.0804823637008667, 0.0149586396291852, 0.06188136339187622, 0.1311223804950714, 0.08191446959972382, 0.019586285576224327, -0.02480296604335308, -0.03388110175728798, -0.025523077696561813, -0.05937909707427025, 0.040128443390131, 0.06589099019765854, -0.16763372719287872, -0.039227183908224106, -0.09338314831256866, 0.09657008945941925, 0.0873042419552803, 0.06609832495450974, -0.1842060089111328, -0.008006223477423191, 0.08488986641168594, -0.03854905813932419, -0.13727426528930664, 0.09535189718008041, 0.01523482333868742, -0.15144726634025574, 0.03139317408204079, -0.04061909019947052, 0.12188644707202911, -0.07804752141237259, 0.09809603542089462, -0.08108244836330414, -0.07448557764291763, 0.02123199962079525, 0.1261177361011505, -0.30527687072753906, 0.20240111649036407, -0.0024993624538183212, -0.06486981362104416, -0.1243603527545929, -0.0032166161108762026, 0.002410882618278265, 0.07357452809810638, 0.10519039630889893, -0.007196315098553896, 0.001897757756523788, -0.06300821900367737, -0.01829923689365387, 0.032471053302288055, 0.13080233335494995, -0.0401318334043026, -0.021158374845981598, -0.050194524228572845, -0.001653497340157628, -0.03173094615340233, -0.06934895366430283, 0.02002747356891632, -0.19509181380271912, 0.08751901984214783, 0.04166261479258537, 0.09648149460554123, 0.029994789510965347, 0.004265148192644119, -0.09651939570903778, 0.24698667228221893, -0.07148019969463348, -0.10072879493236542, -0.10919588059186935, -0.046813901513814926, 0.03569883480668068, -0.05628936365246773, 0.04309194162487984, -0.0788632407784462, 0.028997479006648064, -0.06352769583463669, -0.19235502183437347, 0.12410202622413635, -0.09027006477117538, -0.04412810131907463, -0.02371402643620968, 0.2110891044139862, -0.05598580464720726, 0.010335659608244896, 0.02930437959730625, 0.01208863127976656, -0.11645778268575668, -0.09678568691015244, 0.031018631532788277, -0.007351789623498917, 0.050603240728378296, 0.041841957718133926, -0.05915454775094986, -0.017138581722974777, -0.052199993282556534, -0.022926922887563705, 0.3496883809566498, 0.14231905341148376, -0.043836336582899094, 0.19347235560417175, 0.12347975373268127, -0.07452994585037231, -0.3159443140029907, -0.1066238060593605, -0.10937739163637161, -0.04680149629712105, -0.07012093812227249, -0.2002030611038208, 0.06474938243627548, 0.00662544509395957, -0.013415241613984108, 0.12749312818050385, -0.2561831772327423, -0.07571036368608475, 0.15906259417533875, -0.017980827018618584, 0.3745945692062378, -0.1168576180934906, -0.10926306992769241, -0.03950892388820648, -0.14175476133823395, 0.16968177258968353, -0.01989765651524067, 0.11221715062856674, -0.009765521623194218, 0.14388824999332428, 0.05548359826207161, -0.023479344323277473, 0.08544106781482697, 0.004999885335564613, -0.03290518373250961, -0.10304180532693863, -0.05676887184381485, 0.007092386484146118, 0.02477436140179634, 0.018026655539870262, -0.041834570467472076, 0.02227151393890381, -0.11731979995965958, -0.04657655209302902, -0.08982590585947037, 0.04431166127324104, 0.03899754583835602, -0.07325074821710587, -0.002380647463724017, 
-0.07165111601352692, -0.012272949330508709, 0.022334342822432518, 0.20356793701648712, -0.08029330521821976, 0.16448934376239777, 0.09239562600851059, 0.12419285625219345, -0.14376309514045715, -0.00019283240544609725, -0.0762530043721199, -0.05611240118741989, 0.07737895101308823, -0.09433035552501678, 0.058893077075481415, 0.10901971161365509, -0.04567738622426987, 0.08828683942556381, 0.10377411544322968, 0.008936077356338501, 0.003213887568563223, 0.10916902124881744, -0.2667325437068939, -0.0296600554138422, -0.07532413303852081, 0.000883326749317348, 0.09092561900615692, 0.08562852442264557, 0.18840822577476501, 0.025361526757478714, -0.04293036088347435, -0.002770674182102084, 0.028597986325621605, -0.039021048694849014, 0.051667019724845886, 0.001123449532315135, 0.01947369985282421, -0.1530752182006836, 0.072522833943367, 0.01490565575659275, -0.15215420722961426, 0.021316176280379295, 0.16572684049606323, -0.11656328290700912, -0.1283872276544571, -0.06520111113786697, 0.08313824236392975, -0.11755692958831787, -0.01578943058848381, -0.03279297426342964, -0.13145680725574493, 0.07992171496152878, 0.12629036605358124, 0.05557859688997269, 0.0972496047616005, -0.06061713397502899, -0.020469192415475845, -0.018721895292401314, -0.014099318534135818, -0.012384648434817791, -0.007667020428925753, -0.055978111922740936, 0.0590752474963665, -0.026677248999476433, 0.1425808072090149, -0.09221141785383224, -0.1037059873342514, -0.16142144799232483, 0.0374140702188015, -0.11013076454401016, -0.08825794607400894, -0.08821134269237518, -0.050188567489385605, 0.002360827289521694, -0.019856395199894905, -0.04037635400891304, -0.05829505994915962, -0.12300454825162888, 0.0338277705013752, -0.040771447122097015, 0.024727050215005875, -0.07512269169092178, 0.015856385231018066, 0.08507686108350754, -0.03285100311040878, 0.15655414760112762, 0.1450488418340683, -0.1006515845656395, 0.10741901397705078, -0.14806775748729706, -0.09138492494821548, 0.11116421222686768, 0.015329592861235142, 0.0449691042304039, 0.09723787009716034, 0.013362943194806576, 0.0635865181684494, 0.032776717096567154, 0.05308786407113075, 0.027619892731308937, -0.11959987878799438, 0.06483134627342224, -0.03626115620136261, -0.14700546860694885, -0.049338050186634064, -0.05282869189977646, 0.01647452637553215, 0.013054544106125832, 0.09622690081596375, -0.05301849544048309, 0.10698331147432327, -0.04055701196193695, 0.0346808135509491, 0.017554637044668198, -0.1730053424835205, -0.03816922754049301, -0.08538098633289337, 0.03681723028421402, 0.014741539023816586, 0.25266793370246887, 0.030072299763560295, 0.012416383251547813, 0.032671261578798294, 0.08285367488861084, 0.03899408504366875, 0.010228337720036507, 0.17482228577136993, 0.1162426546216011, -0.06621865928173065, -0.10445023328065872, 0.0729617029428482, 0.016332454979419708, 0.01286179106682539, 0.13617953658103943, 0.008365051820874214, 0.005795429926365614, 0.08649782836437225, -0.016865963116288185, 0.009968153201043606, -0.10052056610584259, -0.13426925241947174, -0.022176474332809448, 0.05151832848787308, -0.04655967652797699, 0.11727844923734665, 0.1406494379043579, -0.01806013658642769, 0.03222079202532768, -0.021771740168333054, -0.05699979141354561, -0.1683429479598999, -0.1429590880870819, -0.06883849948644638, -0.13416796922683716, 0.00897989235818386, -0.11180389672517776, 0.05395037308335304, 0.06001098081469536, 0.06750501692295074, -0.06899319589138031, 0.10220931470394135, 0.04626858979463577, -0.11440542340278625, 0.06264589726924896, 
-0.0296088308095932, 0.09430401772260666, -0.02759445086121559, -0.019505485892295837, -0.09039592742919922, 0.014574515633285046, 0.011419114656746387, 0.06245238706469536, -0.04707273095846176, 0.007463190704584122, -0.14696238934993744, -0.08972041308879852, -0.0523175448179245, 0.0718572810292244, -0.050409089773893356, 0.14282815158367157, 0.00775480642914772, -0.0170906875282526, 0.039554283022880554, 0.22787313163280487, -0.07476283609867096, -0.04778539761900902, -0.05269690603017807, 0.20717895030975342, 0.02975541539490223, 0.1171872541308403, -0.022938819602131844, -0.006106364540755749, -0.0919521227478981, 0.3764844834804535, 0.30030161142349243, -0.09031439572572708, 0.011794124729931355, 0.02137952297925949, 0.04502861574292183, 0.1316293478012085, 0.1216534823179245, 0.10318691283464432, 0.3006802201271057, -0.07452366501092911, -0.04653361067175865, -0.012629742734134197, -0.023858042433857918, -0.09059546142816544, 0.1021224707365036, 0.04839762672781944, -0.06382183730602264, -0.03313443064689636, 0.0954432487487793, -0.25862133502960205, 0.1277991235256195, -0.12311873584985733, -0.17578600347042084, -0.06654827296733856, 0.009760108776390553, 0.10465722531080246, 0.015642458572983742, 0.0946015790104866, 0.007128213066607714, -0.11252258718013763, 0.06305865943431854, 0.03397420793771744, -0.22762253880500793, 0.0006893770187161863, 0.06642123311758041, -0.07006710022687912, -0.0024247700348496437, -0.026499588042497635, 0.05657242611050606, 0.0656052976846695, 0.054629553109407425, -0.00971333310008049, 0.03816632181406021, 0.0034184439573436975, -0.0585215799510479, 0.016623929142951965, 0.05121519789099693, 0.02472509816288948, -0.09763528406620026, 0.06927435845136642, -0.1574270874261856, 0.04766253009438515, -0.0030655991286039352, -0.04124255105853081, 0.006064958870410919, 0.008823691867291927, -0.06491616368293762, 0.05165379121899605, 0.07916834205389023, -0.0016257909592241049, -0.0062433634884655476, -0.057178743183612823, -0.02632102556526661, -0.027755750343203545, -0.09291748702526093, -0.10495562851428986, -0.14682936668395996, -0.11640441417694092, 0.09368976950645447, -0.01011267676949501, -0.1848134547472, 0.022154374048113823, -0.08606051653623581, 0.08319322764873505, -0.1670055389404297, 0.08040720224380493, 0.07041648775339127, 0.013038921169936657, -0.0031511052511632442, -0.02002427540719509, 0.054132770746946335, 0.086809903383255, -0.10407156497240067, -0.07400695979595184 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # albert-base-v2_squad This model is a fine-tuned version of [albert-base-v2](https://huggingface.co/albert-base-v2) on the **squadV1** dataset. - "eval_exact_match": 82.69631031220435 - "eval_f1": 90.10806626207174 - "eval_samples": 10808 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 3e-05 - train_batch_size: 16 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 ### Training results ### Framework versions - Transformers 4.14.1 - Pytorch 1.9.0 - Datasets 1.16.1 - Tokenizers 0.10.3
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "albert-base-v2_squad", "results": []}]}
question-answering
Palak/albert-base-v2_squad
[ "transformers", "pytorch", "albert", "question-answering", "generated_from_trainer", "dataset:squad", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #albert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
# albert-base-v2_squad This model is a fine-tuned version of albert-base-v2 on the squadV1 dataset. - "eval_exact_match": 82.69631031220435 - "eval_f1": 90.10806626207174 - "eval_samples": 10808 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 3e-05 - train_batch_size: 16 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 ### Training results ### Framework versions - Transformers 4.14.1 - Pytorch 1.9.0 - Datasets 1.16.1 - Tokenizers 0.10.3
[ "# albert-base-v2_squad\n\nThis model is a fine-tuned version of albert-base-v2 on the squadV1 dataset.\n- \"eval_exact_match\": 82.69631031220435\n- \"eval_f1\": 90.10806626207174\n- \"eval_samples\": 10808", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #albert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n", "# albert-base-v2_squad\n\nThis model is a fine-tuned version of albert-base-v2 on the squadV1 dataset.\n- \"eval_exact_match\": 82.69631031220435\n- \"eval_f1\": 90.10806626207174\n- \"eval_samples\": 10808", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ 51, 82, 6, 12, 8, 3, 90, 4, 31 ]
[ "passage: TAGS\n#transformers #pytorch #albert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# albert-base-v2_squad\n\nThis model is a fine-tuned version of albert-base-v2 on the squadV1 dataset.\n- \"eval_exact_match\": 82.69631031220435\n- \"eval_f1\": 90.10806626207174\n- \"eval_samples\": 10808## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ -0.10492035746574402, 0.17285023629665375, -0.0019107128027826548, 0.0953221395611763, 0.14066733419895172, 0.02968919835984707, 0.08188986778259277, 0.14423343539237976, -0.050866175442934036, 0.07743242383003235, 0.11262671649456024, 0.04663942754268646, 0.06433812528848648, 0.10666878521442413, -0.03344797343015671, -0.19804298877716064, 0.004123311955481768, 0.02876468189060688, 0.013371389359235764, 0.10895688831806183, 0.11604168266057968, -0.0884685218334198, 0.09569438546895981, 0.02563435770571232, -0.11827897280454636, 0.013274922966957092, -0.02667691558599472, -0.046780992299318314, 0.08457909524440765, 0.03120332956314087, 0.06967689096927643, -0.012187082320451736, 0.09346982836723328, -0.21168047189712524, -0.010448427870869637, 0.0442751906812191, 0.03904815390706062, 0.09673244506120682, 0.022924218326807022, 0.02036447264254093, 0.053598999977111816, -0.14244267344474792, 0.09537190943956375, 0.026778852567076683, -0.09559547156095505, -0.15681928396224976, -0.08906152099370956, 0.06654743105173111, 0.09451325237751007, 0.10871383547782898, -0.002868403447791934, 0.19301581382751465, -0.02020159736275673, 0.06559883058071136, 0.20601657032966614, -0.30266132950782776, -0.050306450575590134, 0.011285211890935898, 0.05645246431231499, 0.04925021901726723, -0.09608538448810577, 0.0014038655208423734, 0.045437317341566086, 0.034025538712739944, 0.1174309104681015, -0.01355979684740305, -0.0423424206674099, -0.02448439598083496, -0.11341745406389236, -0.08351951837539673, 0.21209947764873505, 0.08612868189811707, -0.06400710344314575, -0.1300513595342636, -0.04993117228150368, -0.1170467883348465, -0.006739472039043903, -0.05133520066738129, 0.021347185596823692, -0.0648006871342659, -0.024407168850302696, -0.049530286341905594, -0.08049594610929489, -0.04246479272842407, 0.013365976512432098, 0.11464756727218628, 0.024866456165909767, 0.03206939995288849, 0.0005861669196747243, 0.0961177721619606, -0.045877259224653244, -0.15103083848953247, -0.02272409200668335, -0.010740197263658047, -0.06837328523397446, -0.044250961393117905, -0.040596287697553635, -0.04506830871105194, -0.03177856281399727, 0.14666347205638885, -0.002696582116186619, 0.043266598135232925, 0.047710105776786804, -0.005356879904866219, -0.00461296271532774, 0.18388786911964417, -0.05089893564581871, -0.06845508515834808, 0.0019982391968369484, 0.13210348784923553, 0.031957708299160004, -0.023611247539520264, -0.08842611312866211, -0.032823823392391205, 0.09525410830974579, 0.033206019550561905, -0.012110578827559948, 0.021812641993165016, -0.07480505853891373, -0.06034088134765625, 0.07362965494394302, -0.10998739302158356, 0.036244940012693405, -0.03701476380228996, -0.09406474232673645, -0.08307000249624252, 0.03558836132287979, 0.014034093357622623, -0.04525342956185341, 0.05867509916424751, -0.08777078986167908, -0.01856757327914238, -0.06340395659208298, -0.0654386505484581, 0.0018783105770125985, -0.048858337104320526, 0.017502760514616966, -0.0771271213889122, -0.16434228420257568, -0.031564585864543915, 0.046891991049051285, -0.06350307911634445, -0.08593995869159698, -0.007437036372721195, -0.042139649391174316, 0.01682690717279911, -0.015299532562494278, 0.11449524015188217, -0.04695778712630272, 0.06370949000120163, 0.040522731840610504, 0.021740389987826347, 0.001789892208762467, 0.05577502027153969, -0.09094627946615219, 0.034691352397203445, -0.06510651111602783, 0.07315531373023987, -0.09330961853265762, 0.007620562333613634, -0.13019859790802002, -0.09813597798347473, 0.03637157380580902, 
-0.03915555775165558, 0.08660370111465454, 0.11612176150083542, -0.10787995904684067, -0.010489524342119694, 0.09481097757816315, -0.037931133061647415, -0.12894368171691895, 0.11417847126722336, -0.035804279148578644, -0.005045202560722828, 0.05462327226996422, 0.12666717171669006, 0.11805109679698944, -0.13888601958751678, -0.03431016951799393, 0.036311741918325424, 0.07581023126840591, 0.007290199864655733, 0.10844344645738602, 0.0004702175792772323, 0.004581166431307793, 0.017305150628089905, -0.08119753003120422, -0.011177861131727695, -0.08196843415498734, -0.0898846760392189, -0.046329159289598465, -0.08141633868217468, 0.060742951929569244, 0.037473082542419434, 0.019528048112988472, -0.07307546585798264, -0.137863889336586, 0.033137913793325424, 0.11127106100320816, -0.021755609661340714, -0.004345615394413471, -0.08106610924005508, 0.06923432648181915, -0.07881138473749161, -0.03649476543068886, -0.1843329668045044, -0.0871061161160469, 0.06503263860940933, -0.05112525075674057, 0.013727586716413498, 0.04066132754087448, 0.05002949759364128, 0.06728552281856537, -0.05155095085501671, -0.03237348049879074, -0.12332896888256073, -0.005362486466765404, -0.11433789134025574, -0.11442235112190247, -0.08681198954582214, -0.026474541053175926, 0.19557936489582062, -0.19014094769954681, -0.0034330429043620825, -0.04099508747458458, 0.11197356879711151, 0.02363704703748226, -0.06333888322114944, 0.009987723082304, 0.02148163504898548, -0.00302327168174088, -0.0842004045844078, 0.037808723747730255, 0.018115239217877388, -0.10162356495857239, -0.06969206035137177, -0.13075928390026093, 0.07575663179159164, 0.07385728508234024, 0.07339491695165634, -0.06567348539829254, 0.013385813683271408, -0.060465484857559204, -0.03525792807340622, -0.07609348744153976, -0.039778098464012146, 0.21926885843276978, 0.021340344101190567, 0.13824394345283508, -0.05515509471297264, -0.05417317897081375, 0.016492638736963272, -0.0008854147745296359, -0.021523362025618553, 0.0738588348031044, 0.00732535682618618, -0.14839725196361542, 0.08496011793613434, 0.10058526694774628, -0.022098613902926445, 0.08926495909690857, -0.027889329940080643, -0.09778724610805511, -0.05479222908616066, 0.0002924785658251494, -0.0070982989855110645, 0.12911196053028107, -0.09103024750947952, 0.012693746015429497, 0.07437567412853241, 0.001918136258609593, 0.00035945928539149463, -0.15546709299087524, -0.0069769383408129215, 0.05817931890487671, -0.019319629296660423, -0.007925344631075859, -0.01969652995467186, 0.003099457360804081, 0.08189307898283005, 0.04375982657074928, -0.029859112575650215, 0.018761223182082176, -0.022920547053217888, -0.06651145964860916, 0.16383622586727142, -0.08108101040124893, -0.1945764273405075, -0.15292274951934814, 0.03612014651298523, -0.08513513207435608, -0.0009718274232000113, 0.024459965527057648, -0.024451538920402527, -0.0651235282421112, -0.09919989854097366, -0.022299878299236298, -0.055209144949913025, -0.0240065548568964, 0.07921527326107025, -0.02759886533021927, 0.1045074313879013, -0.12901686131954193, -0.011561575345695019, -0.012024510651826859, -0.03638239577412605, -0.016864575445652008, 0.05280083790421486, 0.11661528050899506, 0.05904313921928406, -0.02923489362001419, 0.017625821754336357, -0.02862991765141487, 0.2815234959125519, -0.06536199152469635, -0.04055824875831604, 0.12865054607391357, -0.012546966783702374, 0.06409766525030136, 0.10823843628168106, 0.028431454673409462, -0.08699601143598557, 0.01361001841723919, 0.043239790946245193, -0.012708254158496857, 
-0.221091628074646, -0.04591050744056702, -0.033199649304151535, -0.07174820452928543, 0.12614062428474426, 0.027423018589615822, 0.013815606944262981, 0.07113567739725113, -0.028247205540537834, 0.06716906279325485, -0.053383637219667435, 0.0925491526722908, 0.11001946032047272, 0.036193136125802994, 0.09936223179101944, -0.020479794591665268, -0.033990103751420975, 0.05174604430794716, 0.009751265868544579, 0.21534304320812225, -0.017598187550902367, 0.16273713111877441, 0.0134016964584589, 0.15128636360168457, -0.03634665906429291, 0.04263840988278389, 0.005653563421219587, 0.0018150167306885123, -0.0164131298661232, -0.05574489012360573, -0.08307754993438721, 0.03153835982084274, 0.02006319910287857, 0.08349313586950302, -0.09632107615470886, 0.024231480434536934, -0.012513874098658562, 0.2600521743297577, 0.07221049815416336, -0.3252371847629547, -0.09401745349168777, 0.017675533890724182, -0.015452365390956402, -0.0925847664475441, -0.011488287709653378, 0.09623371809720993, -0.13696728646755219, 0.05676063895225525, -0.056076690554618835, 0.09457585215568542, -0.06344778090715408, 0.0038695523981004953, 0.0375666581094265, 0.06331697106361389, 0.01160807441920042, 0.10573527216911316, -0.18388931453227997, 0.22156250476837158, 0.028804657980799675, 0.07917289435863495, -0.08345439285039902, 0.0528644360601902, -0.0079612722620368, 0.05570273846387863, 0.1341540515422821, 0.00700230710208416, 0.03917524591088295, -0.197808638215065, -0.10630107671022415, 0.017659494653344154, 0.05462443456053734, -0.08442603796720505, 0.08239953964948654, -0.042132798582315445, 0.000143893834319897, 0.025826169177889824, 0.048411186784505844, -0.11634020507335663, -0.13996858894824982, 0.052550218999385834, 0.017701074481010437, -0.03515215590596199, -0.06743548065423965, -0.10536455363035202, -0.02514728531241417, 0.15801672637462616, 0.060648586601018906, -0.06024407595396042, -0.12699562311172485, 0.09462889283895493, 0.1205793246626854, -0.07755422592163086, 0.002765393117442727, 0.001774233067408204, 0.13592152297496796, 0.02324742078781128, -0.05201004073023796, 0.05468874052166939, -0.05329442024230957, -0.13666410744190216, -0.0549926683306694, 0.16342444717884064, 0.012028008699417114, 0.06859107315540314, 0.01131395623087883, 0.03567555174231529, -0.010467265732586384, -0.07003335654735565, 0.035214364528656006, 0.04192674532532692, 0.08639068156480789, 0.05203449726104736, -0.018787413835525513, 0.024717817083001137, -0.05953221768140793, 0.01345784217119217, 0.15193597972393036, 0.2410038262605667, -0.08529852330684662, 0.04072720929980278, 0.039033204317092896, -0.060476649552583694, -0.1411954015493393, 0.00916238222271204, 0.11835671961307526, 0.03731878846883774, 0.09925336390733719, -0.13062022626399994, 0.07137781381607056, 0.08419892191886902, -0.03470177948474884, 0.04347249120473862, -0.2754758596420288, -0.11373458802700043, 0.07644274830818176, 0.1144370511174202, 0.05793868750333786, -0.13017551600933075, -0.05634782090783119, -0.028557633981108665, -0.21040795743465424, 0.09527837485074997, 0.0008552678627893329, 0.10632787644863129, -0.015480784699320793, 0.068740613758564, 0.047186367213726044, -0.03583451360464096, 0.17481397092342377, 0.02838869020342827, 0.04678415134549141, -0.06530535966157913, 0.013121811673045158, 0.11361925303936005, -0.07521141320466995, 0.09755755215883255, -0.05572778359055519, 0.06770909577608109, -0.23590736091136932, -0.02342798188328743, -0.05625859275460243, 0.055236876010894775, -0.05840422213077545, -0.05938149243593216, 
-0.017533110454678535, 0.03145446255803108, 0.07235948741436005, -0.028998443856835365, 0.09597863256931305, 0.0456450879573822, 0.07682780921459198, 0.13752461969852448, 0.08155544102191925, -0.014138960279524326, -0.15781736373901367, -0.020374849438667297, -0.013798011466860771, 0.06978167593479156, -0.10762085020542145, 0.023580482229590416, 0.12563389539718628, 0.038857679814100266, 0.13955046236515045, -0.00011998986883554608, -0.04957703500986099, 0.0023646559566259384, 0.01630263216793537, -0.12063334882259369, -0.1740950495004654, -0.04744458571076393, 0.0010136435739696026, -0.16808228194713593, -0.017446082085371017, 0.11397060006856918, -0.055790599435567856, -0.02677619270980358, -0.014609330333769321, 0.007663946598768234, -0.008375623263418674, 0.159515380859375, 0.025191813707351685, 0.07860223203897476, -0.0767752081155777, 0.08664385229349136, 0.09250175207853317, -0.07528287917375565, 0.052085187286138535, -0.005936617497354746, -0.08407066017389297, -0.026022516191005707, 0.02454480156302452, 0.13379400968551636, -0.010987303219735622, -0.013660377822816372, -0.07560860365629196, -0.06272430717945099, 0.036639004945755005, -0.008287728764116764, 0.06411152333021164, -0.022592246532440186, -0.03417564928531647, -0.008000744506716728, -0.12296099215745926, 0.11895144730806351, 0.04545783996582031, 0.07340370863676071, -0.14274464547634125, 0.018053676933050156, 0.0029518718365579844, 0.054849762469530106, -0.011251171119511127, 0.0067788343876600266, -0.04472679644823074, -0.015298668295145035, -0.13114795088768005, 0.005653001368045807, -0.016877198591828346, 0.011004279367625713, -0.03987671062350273, -0.07799004763364792, -0.038720615208148956, 0.05364305526018143, -0.06541523337364197, -0.07823407649993896, 0.03032686561346054, 0.05759938806295395, -0.12095493078231812, -0.036887235939502716, 0.03127560392022133, -0.08840101212263107, 0.08998183161020279, 0.04408066347241402, 0.021804383024573326, -0.010890128090977669, -0.0074311974458396435, -0.009608346968889236, 0.01485799252986908, 0.04111699014902115, 0.06826266646385193, -0.12484219670295715, -0.0032786279916763306, -0.02687753178179264, 0.01889161393046379, 0.0061359950341284275, 0.07498645782470703, -0.13614606857299805, -0.0828276202082634, -0.01030261442065239, -0.038517434149980545, -0.058970510959625244, 0.04222649335861206, 0.09100012481212616, 0.030533451586961746, 0.17171388864517212, -0.04047784581780434, 0.057800039649009705, -0.2094278484582901, -0.0323735773563385, -0.004367037210613489, -0.040104225277900696, -0.08188590407371521, -0.055732205510139465, 0.07523991912603378, -0.06149302423000336, 0.1024915799498558, 0.004406686406582594, 0.15372753143310547, 0.03664383664727211, 0.016574149951338768, 0.04158054292201996, -0.018502458930015564, 0.17374175786972046, 0.04713636264204979, -0.02710101194679737, 0.104402557015419, -0.015662888064980507, 0.06518854200839996, 0.06314492225646973, 0.11119689792394638, 0.1431669145822525, 0.019898569211363792, 0.050103623420000076, 0.057607825845479965, -0.07392006367444992, -0.1999077945947647, 0.014512608759105206, -0.0004422964120749384, 0.1424003392457962, -0.022131117060780525, 0.14026804268360138, 0.05588487163186073, -0.16579093039035797, 0.05939847603440285, -0.07336302101612091, -0.09552305936813354, -0.06949220597743988, -0.08637522906064987, -0.07570362091064453, -0.09763112664222717, 0.024755477905273438, -0.1223856657743454, 0.020218072459101677, 0.1133245974779129, -0.02243114821612835, -0.022967549040913582, 0.16028907895088196, 
-0.045727383345365524, -0.0043035028502345085, 0.057051900774240494, 0.0029939429368823767, -0.00205813255161047, -0.06743974983692169, -0.03129256144165993, 0.06398177146911621, 0.03691820427775383, 0.09309469908475876, -0.04784677177667618, 0.005714402534067631, 0.02238355576992035, 0.00674834568053484, -0.10236295312643051, 0.0012329497840255499, 0.016701526939868927, 0.04483497142791748, 0.04951067641377449, 0.03278755769133568, 0.04292812943458557, -0.047164931893348694, 0.2526009976863861, -0.05854283273220062, -0.036418117582798004, -0.12791524827480316, 0.1365216076374054, 0.038244880735874176, -0.012858109548687935, 0.07987748086452484, -0.10002918541431427, -0.004354993812739849, 0.12876492738723755, 0.11884574592113495, -0.05238136276602745, -0.029531989246606827, 0.001294997287914157, -0.010405179113149643, -0.05027957633137703, 0.0822126641869545, 0.08755560219287872, -0.006244970485568047, -0.07379540055990219, 0.012925000861287117, -0.021221958100795746, -0.03532669320702553, -0.053117718547582626, 0.07107312977313995, 0.033012934029102325, 0.015217605046927929, -0.03471449017524719, 0.04720848798751831, 0.048335831612348557, -0.18822897970676422, 0.019645418971776962, -0.2021447718143463, -0.18923869729042053, -0.014431032352149487, 0.08578475564718246, 0.01727135293185711, 0.07477181404829025, 0.0009368815226480365, 0.01494443416595459, 0.15813471376895905, -0.023588059470057487, -0.06670784950256348, -0.09909391403198242, 0.08965318650007248, -0.09537898004055023, 0.22985176742076874, 0.011123661883175373, 0.0833863690495491, 0.1017509400844574, 0.01451867911964655, -0.15069150924682617, 0.03755539283156395, 0.08673223853111267, -0.02044341154396534, 0.05151576176285744, 0.16116291284561157, -0.024343393743038177, 0.09949949383735657, 0.03894682973623276, -0.13027916848659515, -0.04484044760465622, -0.04543131962418556, 0.020261229947209358, -0.057983580976724625, -0.012020882219076157, -0.06523531675338745, 0.1662256419658661, 0.2008657306432724, -0.05669212341308594, -0.03823573514819145, -0.0672202929854393, 0.01995069533586502, 0.07125990092754364, 0.09123052656650543, -0.03058880940079689, -0.18155375123023987, 0.014562606811523438, 0.02920936606824398, 0.03626745194196701, -0.24936245381832123, -0.10033952444791794, 0.04542119428515434, -0.0603557750582695, -0.014524763450026512, 0.08962801843881607, 0.03743034973740578, 0.02006397396326065, -0.04488934576511383, -0.12555336952209473, -0.05857893452048302, 0.1364728957414627, -0.14315573871135712, -0.06275733560323715 ]
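The albert-base-v2_squad record above documents an extractive question-answering fine-tune on SQuAD v1 (exact match about 82.7, F1 about 90.1). As a hedged illustration, a minimal sketch of running inference through the `transformers` question-answering pipeline could look like this; the question and context strings are invented for the example.

```python
# Minimal sketch: extractive QA with the fine-tuned ALBERT checkpoint.
# The repo id comes from the record above; the question and context are
# invented purely for illustration.
from transformers import pipeline

qa = pipeline("question-answering", model="Palak/albert-base-v2_squad")

result = qa(
    question="Which dataset was the model fine-tuned on?",
    context="albert-base-v2 was fine-tuned on the SQuAD v1 dataset for "
            "extractive question answering, reaching roughly 90 F1.",
)
print(result["answer"], result["score"])
```

Swapping the model id for Palak/albert-large-v2_squad (the next record, exact match about 84.8, F1 about 91.8) should work the same way, since both checkpoints were fine-tuned with the same hyperparameters.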
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# albert-large-v2_squad

This model is a fine-tuned version of [albert-large-v2](https://huggingface.co/albert-large-v2) on the **squadV1** dataset.

- "eval_exact_match": 84.80605487228004
- "eval_f1": 91.80638438705844
- "eval_samples": 10808

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

### Framework versions

- Transformers 4.14.1
- Pytorch 1.9.0
- Datasets 1.16.1
- Tokenizers 0.10.3
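A minimal usage sketch, assuming `transformers` (with a PyTorch backend) is installed and the `Palak/albert-large-v2_squad` checkpoint is reachable on the Hugging Face Hub; the question and context strings are illustrative only:

```python
# Minimal extractive-QA inference sketch via the question-answering pipeline.
# Assumes `transformers` and a PyTorch backend are installed and the checkpoint
# below can be downloaded from the Hugging Face Hub.
from transformers import pipeline

qa = pipeline("question-answering", model="Palak/albert-large-v2_squad")

result = qa(
    question="Which dataset was the model fine-tuned on?",
    context="albert-large-v2 was fine-tuned on the SQuAD v1 dataset for extractive question answering.",
)
print(result["answer"], result["score"])  # answer span plus its confidence score
```

The pipeline also returns the character offsets (`start`, `end`) of the answer span within the context.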
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "albert-large-v2_squad", "results": []}]}
question-answering
Palak/albert-large-v2_squad
[ "transformers", "pytorch", "albert", "question-answering", "generated_from_trainer", "dataset:squad", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #albert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
# albert-large-v2_squad

This model is a fine-tuned version of albert-large-v2 on the squadV1 dataset.

- "eval_exact_match": 84.80605487228004
- "eval_f1": 91.80638438705844
- "eval_samples": 10808

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

### Framework versions

- Transformers 4.14.1
- Pytorch 1.9.0
- Datasets 1.16.1
- Tokenizers 0.10.3
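The `eval_exact_match` and `eval_f1` figures above are the standard SQuAD v1 metrics. A sketch of how they are typically computed with the `evaluate` library follows; the prediction/reference pair is an illustrative placeholder, not data from this card:

```python
# Sketch of the SQuAD v1 metric computation behind eval_exact_match / eval_f1.
# Assumes the `evaluate` package is installed; the example prediction and
# reference below are placeholders, not values taken from the card.
import evaluate

squad_metric = evaluate.load("squad")

predictions = [{"id": "ex0", "prediction_text": "SQuAD v1"}]
references = [{
    "id": "ex0",
    "answers": {"text": ["SQuAD v1"], "answer_start": [0]},
}]

print(squad_metric.compute(predictions=predictions, references=references))
# {'exact_match': 100.0, 'f1': 100.0}
```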
[ "# albert-large-v2_squad\n\nThis model is a fine-tuned version of albert-large-v2 on the squadV1 dataset.\n\n- \"eval_exact_match\": 84.80605487228004\n- \"eval_f1\": 91.80638438705844\n- \"eval_samples\": 10808", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #albert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n", "# albert-large-v2_squad\n\nThis model is a fine-tuned version of albert-large-v2 on the squadV1 dataset.\n\n- \"eval_exact_match\": 84.80605487228004\n- \"eval_f1\": 91.80638438705844\n- \"eval_samples\": 10808", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ 51, 84, 6, 12, 8, 3, 90, 4, 31 ]
[ "passage: TAGS\n#transformers #pytorch #albert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# albert-large-v2_squad\n\nThis model is a fine-tuned version of albert-large-v2 on the squadV1 dataset.\n\n- \"eval_exact_match\": 84.80605487228004\n- \"eval_f1\": 91.80638438705844\n- \"eval_samples\": 10808## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ -0.10697959363460541, 0.16364526748657227, -0.0022208192385733128, 0.09581515938043594, 0.13862498104572296, 0.02827581763267517, 0.09467032551765442, 0.1400977075099945, -0.05666499212384224, 0.06910484284162521, 0.11976762115955353, 0.0649937242269516, 0.06266817450523376, 0.1104256734251976, -0.029326478019356728, -0.2121485471725464, 0.01107804011553526, 0.02895154431462288, -0.010272311978042126, 0.11155473440885544, 0.12095343321561813, -0.08819855749607086, 0.09808020293712616, 0.04309236630797386, -0.1270960420370102, 0.00645053805783391, -0.018477994948625565, -0.0495416484773159, 0.09230370074510574, 0.03136953338980675, 0.056932080537080765, -0.0022984123788774014, 0.09116306900978088, -0.20497597754001617, -0.00935246143490076, 0.038206614553928375, 0.03887220472097397, 0.10136392712593079, 0.026301901787519455, 0.0320332832634449, 0.04193000867962837, -0.13171830773353577, 0.0973849818110466, 0.035260628908872604, -0.09486495703458786, -0.1787385493516922, -0.08470547199249268, 0.079551562666893, 0.09359569847583771, 0.11107943952083588, -0.000046199376811273396, 0.18446460366249084, -0.012914189137518406, 0.06529736518859863, 0.22195608913898468, -0.3017958700656891, -0.04766977205872536, 0.01075810194015503, 0.062287889420986176, 0.05065663531422615, -0.08980509638786316, -0.0022840253077447414, 0.05004778876900673, 0.032407332211732864, 0.11762063950300217, -0.010767616331577301, -0.004352858290076256, -0.030231332406401634, -0.11403689533472061, -0.09275882691144943, 0.22655783593654633, 0.0854334682226181, -0.07353845238685608, -0.1266351044178009, -0.05139259248971939, -0.12694594264030457, -0.012206440791487694, -0.05393542721867561, 0.02474123239517212, -0.06382690370082855, -0.02447848580777645, -0.05000478774309158, -0.080225370824337, -0.041830163449048996, 0.011355969123542309, 0.12304622679948807, 0.032124198973178864, 0.03606206551194191, -0.007658950984477997, 0.08756174892187119, -0.04052237048745155, -0.14763203263282776, -0.030293058604002, -0.014914635568857193, -0.05114937946200371, -0.03505973890423775, -0.04317457601428032, -0.02032843977212906, -0.024525359272956848, 0.1632036715745926, -0.00983507465571165, 0.04125435650348663, 0.06119829788804054, 0.001759060425683856, -0.016686974093317986, 0.18557299673557281, -0.04486938565969467, -0.048890482634305954, 0.0010827016085386276, 0.12775757908821106, 0.029391327872872353, -0.022277159616351128, -0.09510330855846405, -0.030704393982887268, 0.09785667806863785, 0.034269969910383224, -0.013445712625980377, 0.03161430358886719, -0.07187710702419281, -0.06243998184800148, 0.0806855633854866, -0.10779718309640884, 0.020512867718935013, -0.03950246796011925, -0.09017211198806763, -0.09087260067462921, 0.030091000720858574, 0.015399615280330181, -0.04184150695800781, 0.03939933329820633, -0.0887843444943428, -0.02578454464673996, -0.059576261788606644, -0.06147352233529091, -0.009770218282938004, -0.05244261398911476, 0.026253744959831238, -0.08725585788488388, -0.16297966241836548, -0.0315672792494297, 0.041264329105615616, -0.05192901939153671, -0.10411848872900009, -0.0017473192419856787, -0.04978908970952034, 0.01286028977483511, -0.014115571975708008, 0.11459684371948242, -0.04097955301403999, 0.05924985185265541, 0.043169304728507996, 0.02891777642071247, 0.0025572776794433594, 0.05416529253125191, -0.08873811364173889, 0.03618285059928894, -0.056923944503068924, 0.06715551763772964, -0.08685140311717987, 0.008123952895402908, -0.13531698286533356, -0.10821450501680374, 0.04653201997280121, 
-0.03242650255560875, 0.07867283374071121, 0.1113513633608818, -0.10672816634178162, -0.01660926081240177, 0.09847849607467651, -0.04142659902572632, -0.13240039348602295, 0.10943210124969482, -0.028566645458340645, 0.009219917468726635, 0.04567842558026314, 0.13841032981872559, 0.11528121680021286, -0.1198822557926178, -0.024650484323501587, 0.037496309727430344, 0.06636112183332443, 0.008480493910610676, 0.1145615503191948, 0.0009516258724033833, 0.013639172539114952, 0.018515413627028465, -0.07358398288488388, -0.02676471322774887, -0.0893189013004303, -0.08924905955791473, -0.05071844905614853, -0.08422010391950607, 0.05457115173339844, 0.023972254246473312, 0.0220012366771698, -0.07441507279872894, -0.1279585212469101, 0.026889974251389503, 0.1191565990447998, -0.023655997589230537, -0.0021239405032247305, -0.09449464827775955, 0.06764843314886093, -0.06689973175525665, -0.03469474986195564, -0.18562009930610657, -0.07662568241357803, 0.06287316232919693, -0.07417219877243042, 0.013304561376571655, 0.04291249439120293, 0.04658864811062813, 0.06491357088088989, -0.044306449592113495, -0.03838247433304787, -0.12684065103530884, -0.006831450387835503, -0.11767902225255966, -0.117298424243927, -0.09108264744281769, -0.01831776276230812, 0.16999411582946777, -0.1877632588148117, 0.004066344816237688, -0.03910542652010918, 0.10463199764490128, 0.018580734729766846, -0.05891799181699753, 0.0030207736417651176, 0.013397863134741783, -0.008268713019788265, -0.08461199700832367, 0.04158305004239082, 0.02319149114191532, -0.09955267608165741, -0.05460137128829956, -0.13268019258975983, 0.10376106947660446, 0.0721370056271553, 0.06516822427511215, -0.06247838959097862, 0.024495169520378113, -0.06673112511634827, -0.035362791270017624, -0.06643721461296082, -0.03580716997385025, 0.2261592000722885, 0.008219673298299313, 0.1416393667459488, -0.06118546053767204, -0.05790308862924576, 0.023404870182275772, -0.01125372014939785, -0.018930893391370773, 0.08125466853380203, -0.00583054032176733, -0.14829343557357788, 0.08360984176397324, 0.10504265874624252, -0.02163195051252842, 0.08367811888456345, -0.036739736795425415, -0.09233856946229935, -0.054047539830207825, -0.013954510912299156, -0.0000036330450257082703, 0.11692379415035248, -0.09714354574680328, 0.009326091036200523, 0.08061347901821136, 0.011212829500436783, 0.005872745998203754, -0.15808972716331482, -0.008733568713068962, 0.06243812292814255, -0.023733992129564285, -0.019730791449546814, -0.01051258109509945, -0.0051408372819423676, 0.08477725088596344, 0.04207243397831917, -0.03588439151644707, 0.02297389879822731, -0.020532963797450066, -0.06804809719324112, 0.1661047637462616, -0.07734640687704086, -0.19840046763420105, -0.15392418205738068, 0.039303604513406754, -0.08114727586507797, -0.0030411018524318933, 0.03527050092816353, -0.0321800671517849, -0.06675403565168381, -0.09445716440677643, -0.021603697910904884, -0.07097797095775604, -0.013001971878111362, 0.09980055689811707, -0.021242644637823105, 0.10654653608798981, -0.1282733827829361, -0.015590245835483074, -0.00940600223839283, -0.04234137013554573, -0.01275781262665987, 0.03984781727194786, 0.11592273414134979, 0.05451524257659912, -0.028699880465865135, 0.018488464877009392, -0.023423999547958374, 0.29144763946533203, -0.063327357172966, -0.04650534316897392, 0.1584502011537552, -0.006520436145365238, 0.06698345392942429, 0.09979915618896484, 0.02867494337260723, -0.09132446348667145, 0.014379492029547691, 0.034666817635297775, -0.016154516488313675, -0.23084193468093872, 
-0.03858519718050957, -0.029183458536863327, -0.05860348045825958, 0.12438924610614777, 0.03019133396446705, 0.028007471933960915, 0.07504452019929886, -0.024727312847971916, 0.06751758605241776, -0.049231089651584625, 0.10045307129621506, 0.1201147586107254, 0.03832739591598511, 0.1056184396147728, -0.019669758155941963, -0.026620054617524147, 0.05127497762441635, 0.005473718047142029, 0.22061286866664886, -0.00916263647377491, 0.18022477626800537, 0.023476077243685722, 0.14949357509613037, -0.019039347767829895, 0.046516478061676025, 0.006678254343569279, 0.002342769643291831, -0.018067678436636925, -0.048105113208293915, -0.07779233902692795, 0.028794018551707268, 0.012910707853734493, 0.08150823414325714, -0.10282839089632034, 0.044848211109638214, -0.017678283154964447, 0.2666592001914978, 0.06473931670188904, -0.33893445134162903, -0.10744166374206543, 0.007723414339125156, -0.02666344679892063, -0.09057267010211945, -0.003195386379957199, 0.08135354518890381, -0.14303338527679443, 0.05341856926679611, -0.05011063441634178, 0.09463536739349365, -0.066623754799366, 0.016242461279034615, 0.05509036406874657, 0.06350740790367126, 0.0109126390889287, 0.10581128299236298, -0.2014366239309311, 0.2307198941707611, 0.019049271941184998, 0.06582700461149216, -0.07909276336431503, 0.04748940467834473, -0.008505798876285553, 0.05841367319226265, 0.13976874947547913, 0.008048199117183685, 0.025749200955033302, -0.19856327772140503, -0.10655288398265839, 0.017905957996845245, 0.06586804986000061, -0.09677355736494064, 0.08056432008743286, -0.04954773932695389, 0.007228432223200798, 0.023816686123609543, 0.04484541714191437, -0.11347081512212753, -0.13754871487617493, 0.051670484244823456, 0.013031057082116604, -0.02283531054854393, -0.0756603255867958, -0.10645266622304916, -0.026748118922114372, 0.16551072895526886, 0.05171018838882446, -0.06531491875648499, -0.13145072758197784, 0.08977264165878296, 0.1172143742442131, -0.08219261467456818, 0.011333822272717953, 0.0014065110590308905, 0.14073413610458374, 0.027790101245045662, -0.05408966541290283, 0.05712694674730301, -0.06101973354816437, -0.15743976831436157, -0.0494997538626194, 0.15940387547016144, 0.0025588811840862036, 0.07001171261072159, 0.01481105200946331, 0.0348568931221962, -0.017313385382294655, -0.07142501324415207, 0.03685902804136276, 0.04132053628563881, 0.0884772539138794, 0.0387384369969368, -0.00705560389906168, 0.04725582152605057, -0.06353351473808289, -0.000295334990369156, 0.14766629040241241, 0.2520853281021118, -0.08554476499557495, 0.03744611144065857, 0.025552058592438698, -0.06004482880234718, -0.1420203298330307, -0.0004262867441866547, 0.12649264931678772, 0.03274935483932495, 0.08649440854787827, -0.1327241063117981, 0.07014662772417068, 0.08719642460346222, -0.03697722777724266, 0.03534574434161186, -0.279927134513855, -0.12016309052705765, 0.07241798937320709, 0.11778918653726578, 0.07714549452066422, -0.12793076038360596, -0.05742590129375458, -0.03106527589261532, -0.2078346312046051, 0.0846906527876854, 0.021421488374471664, 0.10710854828357697, -0.02363857813179493, 0.06516867130994797, 0.043073780834674835, -0.032484713941812515, 0.1817305088043213, 0.031630758196115494, 0.04667132720351219, -0.0672188550233841, 0.016210926696658134, 0.11029183864593506, -0.07306267321109772, 0.09143001586198807, -0.045980989933013916, 0.08079610019922256, -0.21371068060398102, -0.028131429105997086, -0.05743012577295303, 0.045760419219732285, -0.061457887291908264, -0.058578938245773315, -0.01767447404563427, 
0.027286143973469734, 0.0701860636472702, -0.02943677082657814, 0.10543686896562576, 0.04604468494653702, 0.060819827020168304, 0.13663524389266968, 0.09006987512111664, -0.010786776430904865, -0.18337088823318481, -0.027570270001888275, -0.016020800918340683, 0.06606323271989822, -0.1127462163567543, 0.02701134979724884, 0.11912711709737778, 0.040346916764974594, 0.12947887182235718, 0.009765755385160446, -0.046393923461437225, 0.004129060078412294, 0.014431084506213665, -0.11195547133684158, -0.18469606339931488, -0.047373656183481216, 0.014328096993267536, -0.16944989562034607, 0.00033613533014431596, 0.11580628156661987, -0.058104049414396286, -0.029850110411643982, -0.01778952218592167, 0.009703618474304676, -0.0029220604337751865, 0.16023267805576324, 0.027422551065683365, 0.08551891148090363, -0.0837925374507904, 0.08841915428638458, 0.0806460976600647, -0.06710771471261978, 0.05063688009977341, -0.008395576849579811, -0.09250876307487488, -0.023482609540224075, 0.029752254486083984, 0.11770868301391602, -0.03400745987892151, -0.02182909846305847, -0.08228401094675064, -0.0757860317826271, 0.046427369117736816, -0.00881033856421709, 0.06984581053256989, -0.01598900929093361, -0.029949503019452095, -0.006691076792776585, -0.11995697021484375, 0.11570076644420624, 0.042658913880586624, 0.07730523496866226, -0.15537434816360474, 0.016391701996326447, 0.0015398624818772078, 0.06375760585069656, -0.008251412771642208, 0.008535141125321388, -0.04713810235261917, -0.014754713512957096, -0.1323031634092331, 0.012249881401658058, -0.012808782048523426, 0.008261420764029026, -0.03941282629966736, -0.07530318200588226, -0.041054461151361465, 0.061413947492837906, -0.06743389368057251, -0.08115995675325394, 0.028350524604320526, 0.055906347930431366, -0.1085372343659401, -0.024114016443490982, 0.03535246104001999, -0.09256190061569214, 0.09981364756822586, 0.04042968153953552, 0.01721630059182644, -0.011638188734650612, 0.012962780892848969, 0.0013691033236682415, 0.01555721741169691, 0.03839941695332527, 0.06551949679851532, -0.12460274994373322, -0.008438498713076115, -0.02690945565700531, 0.011120032519102097, 0.00227381126023829, 0.06810200959444046, -0.13357096910476685, -0.07492486387491226, -0.007109557744115591, -0.026669560000300407, -0.06822840124368668, 0.03520439192652702, 0.08082283288240433, 0.028676405549049377, 0.17473162710666656, -0.03772330284118652, 0.05054529011249542, -0.2116921991109848, -0.02733876183629036, -0.002959801582619548, -0.04923522472381592, -0.0760202407836914, -0.059950802475214005, 0.07622560858726501, -0.06534503400325775, 0.10559280961751938, -0.0010209359461441636, 0.14227019250392914, 0.0355873666703701, 0.0185990072786808, 0.03888842090964317, -0.01378688681870699, 0.1927749216556549, 0.046595022082328796, -0.020176103338599205, 0.1100921630859375, -0.010607142932713032, 0.06454885751008987, 0.06219904124736786, 0.11657191812992096, 0.1478644162416458, 0.012802300043404102, 0.04814887046813965, 0.052331555634737015, -0.06852266192436218, -0.21230264008045197, 0.016001960262656212, -0.0051025343127548695, 0.14404241740703583, -0.013191570527851582, 0.126710444688797, 0.0726495087146759, -0.16124698519706726, 0.06034552678465843, -0.06908923387527466, -0.09092234820127487, -0.07581030577421188, -0.09759818762540817, -0.08224285393953323, -0.10287640243768692, 0.02314670756459236, -0.12742720544338226, 0.021310197189450264, 0.11700339615345001, -0.021816246211528778, -0.020108822733163834, 0.1449880748987198, -0.04445847123861313, -0.006444326136261225, 
0.05414484813809395, -0.0065590753220021725, -0.014817659743130207, -0.06065235659480095, -0.040606070309877396, 0.0625687912106514, 0.036562882363796234, 0.08975677192211151, -0.04644268378615379, -0.004401880782097578, 0.0181468203663826, -0.0007469471893273294, -0.09943593293428421, 0.00041470935684628785, 0.008902330882847309, 0.04091070219874382, 0.02849830873310566, 0.02798248454928398, 0.035916827619075775, -0.04105285555124283, 0.2585785686969757, -0.05851167440414429, -0.042220376431941986, -0.1211184561252594, 0.15655732154846191, 0.03745587542653084, -0.01662646234035492, 0.08986790478229523, -0.09026191383600235, -0.004478445742279291, 0.14349810779094696, 0.1312975436449051, -0.04517785459756851, -0.03082801215350628, 0.0050055645406246185, -0.012946355156600475, -0.049351952970027924, 0.08735103160142899, 0.09883397072553635, -0.018167737871408463, -0.07163535058498383, 0.014917205087840557, -0.018593696877360344, -0.02496497891843319, -0.05470157042145729, 0.08186284452676773, 0.03401770815253258, 0.012921436689794064, -0.03270117565989494, 0.04145172983407974, 0.040430907160043716, -0.17887218296527863, 0.010753319598734379, -0.19058836996555328, -0.18483245372772217, -0.012375316582620144, 0.07728921622037888, 0.02126859501004219, 0.08126544207334518, 0.008211784064769745, 0.009835834614932537, 0.14670327305793762, -0.025177806615829468, -0.07289543747901917, -0.08707022666931152, 0.0944560244679451, -0.10042692720890045, 0.23031115531921387, 0.010172046720981598, 0.0855194628238678, 0.10385117679834366, 0.011491157114505768, -0.15178534388542175, 0.031822241842746735, 0.08187084645032883, -0.004929134156554937, 0.05250558629631996, 0.15991459786891937, -0.01564498245716095, 0.07630308717489243, 0.03818025067448616, -0.1503363996744156, -0.05068076774477959, -0.04000069946050644, 0.021939659491181374, -0.05148254707455635, -0.011875113472342491, -0.06864236295223236, 0.16051800549030304, 0.2003663182258606, -0.06619007885456085, -0.04474332183599472, -0.06121748313307762, 0.02270900458097458, 0.07271672785282135, 0.08549778163433075, -0.02629888989031315, -0.18451912701129913, 0.013300799764692783, 0.012704954482614994, 0.031738921999931335, -0.24148444831371307, -0.1053629219532013, 0.036740805953741074, -0.06287799030542374, -0.017160458490252495, 0.08594955503940582, 0.04463699087500572, 0.02211887016892433, -0.04513636603951454, -0.09390228241682053, -0.06058454141020775, 0.14303401112556458, -0.1430557668209076, -0.06928510218858719 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# distilroberta-base_squad

This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on the **squadV1** dataset.

- "eval_exact_match": 80.97445600756859
- "eval_f1": 88.0153886332912
- "eval_samples": 10790

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

### Framework versions

- Transformers 4.14.1
- Pytorch 1.9.0
- Datasets 1.16.1
- Tokenizers 0.10.3
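A lower-level inference sketch without the `pipeline` helper, assuming `transformers` and `torch` are installed and the `Palak/distilroberta-base_squad` checkpoint is available on the Hub; the question/context strings are illustrative:

```python
# Manual extractive-QA inference sketch: tokenize a (question, context) pair,
# run the fine-tuned QA head, and decode the highest-scoring answer span.
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "Palak/distilroberta-base_squad"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "What task was the model fine-tuned for?"
context = "distilroberta-base was fine-tuned on SQuAD v1 for extractive question answering."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Greedy span selection: most likely start and end token positions.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer_ids = inputs["input_ids"][0, start : end + 1]
print(tokenizer.decode(answer_ids, skip_special_tokens=True))
```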
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "distilroberta-base_squad", "results": []}]}
question-answering
Palak/distilroberta-base_squad
[ "transformers", "pytorch", "roberta", "question-answering", "generated_from_trainer", "dataset:squad", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
# distilroberta-base_squad

This model is a fine-tuned version of distilroberta-base on the squadV1 dataset.

- "eval_exact_match": 80.97445600756859
- "eval_f1": 88.0153886332912
- "eval_samples": 10790

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

### Framework versions

- Transformers 4.14.1
- Pytorch 1.9.0
- Datasets 1.16.1
- Tokenizers 0.10.3
[ "# distilroberta-base_squad\n\nThis model is a fine-tuned version of distilroberta-base on the squadV1 dataset.\n\n- \"eval_exact_match\": 80.97445600756859\n- \"eval_f1\": 88.0153886332912\n- \"eval_samples\": 10790", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 32\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n", "# distilroberta-base_squad\n\nThis model is a fine-tuned version of distilroberta-base on the squadV1 dataset.\n\n- \"eval_exact_match\": 80.97445600756859\n- \"eval_f1\": 88.0153886332912\n- \"eval_samples\": 10790", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 32\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ 51, 81, 6, 12, 8, 3, 90, 4, 31 ]
[ "passage: TAGS\n#transformers #pytorch #roberta #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# distilroberta-base_squad\n\nThis model is a fine-tuned version of distilroberta-base on the squadV1 dataset.\n\n- \"eval_exact_match\": 80.97445600756859\n- \"eval_f1\": 88.0153886332912\n- \"eval_samples\": 10790## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 32\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ -0.10205333679914474, 0.18723160028457642, -0.002742929384112358, 0.09004752337932587, 0.1397070586681366, 0.03105110488831997, 0.07326173037290573, 0.15485727787017822, -0.05768602341413498, 0.08500044792890549, 0.11132331192493439, 0.06625387072563171, 0.06507770717144012, 0.12318689376115799, -0.03622056916356087, -0.18608662486076355, 0.01013376284390688, 0.013728664256632328, 0.007069063372910023, 0.11099481582641602, 0.11861014366149902, -0.08384065330028534, 0.08804351091384888, 0.01011461578309536, -0.11736663430929184, 0.015150952152907848, -0.03037944622337818, -0.046190306544303894, 0.0862896740436554, 0.018116671591997147, 0.07618848234415054, -0.017513595521450043, 0.08333750069141388, -0.21315018832683563, -0.0071758488193154335, 0.05399409681558609, 0.02842639572918415, 0.08596278727054596, 0.010663245804607868, 0.005805483553558588, 0.07816217094659805, -0.14396719634532928, 0.08616046607494354, 0.01590191386640072, -0.09553474932909012, -0.1485164612531662, -0.09578393399715424, 0.06040351465344429, 0.0897369459271431, 0.11721070110797882, 0.002464544028043747, 0.18841859698295593, -0.02586509846150875, 0.06835059076547623, 0.16481977701187134, -0.2627752721309662, -0.05167744308710098, 0.01648535206913948, 0.04067730903625488, 0.055715057998895645, -0.09532401710748672, -0.01348066795617342, 0.030460337176918983, 0.045442648231983185, 0.09367859363555908, -0.017679397016763687, -0.06063678860664368, -0.02234909124672413, -0.10681463032960892, -0.08235634118318558, 0.23027552664279938, 0.0740962028503418, -0.06152312457561493, -0.11643826216459274, -0.04788972809910774, -0.08501213788986206, -0.0019268722971901298, -0.04345333203673363, 0.007492783013731241, -0.05241510644555092, -0.03593142703175545, -0.06413213163614273, -0.08999453485012054, -0.03847876191139221, 0.0031071992125362158, 0.10222858935594559, 0.022710265591740608, 0.03277958929538727, -0.003398195607587695, 0.08406244218349457, -0.052401576191186905, -0.14478327333927155, -0.025524845346808434, -0.014914073050022125, -0.06017201766371727, -0.045124251395463943, -0.0364731028676033, -0.06772470474243164, -0.008195225149393082, 0.1567779779434204, -0.02354045771062374, 0.05084056034684181, 0.04440080001950264, -0.00981681514531374, -0.002259071683511138, 0.14045743644237518, -0.038277916610240936, -0.07910120487213135, 0.009662638418376446, 0.12666289508342743, 0.036029256880283356, -0.021024463698267937, -0.08698555827140808, -0.03564159944653511, 0.09297449886798859, 0.0348736047744751, 0.016118880361318588, 0.018254613503813744, -0.06645230948925018, -0.05343344807624817, 0.08196604251861572, -0.11483566462993622, 0.029142621904611588, -0.037481341511011124, -0.08781193941831589, -0.08805989474058151, 0.03310200944542885, 0.018067460507154465, -0.04347655177116394, 0.05216311290860176, -0.08815397322177887, -0.018849188461899757, -0.06664328277111053, -0.05831760913133621, 0.0057952264323830605, -0.06010512262582779, 0.014533204957842827, -0.07588906586170197, -0.20189842581748962, -0.04054610803723335, 0.04673144593834877, -0.06524362415075302, -0.08125286549329758, -0.012103199027478695, -0.04570287838578224, 0.024766579270362854, -0.01585795357823372, 0.11233452707529068, -0.05100856348872185, 0.07689862698316574, 0.03774982690811157, 0.01415938138961792, -0.024425415322184563, 0.05096716061234474, -0.10148551315069199, 0.03232613950967789, -0.07854284346103668, 0.07940145581960678, -0.08850723505020142, 0.02095901034772396, -0.1355440616607666, -0.1030375063419342, -0.004274152219295502, 
-0.04594724625349045, 0.08807245641946793, 0.11456149071455002, -0.12024089694023132, -0.0009992747800424695, 0.09648717939853668, -0.056588444858789444, -0.12747043371200562, 0.10411059856414795, -0.03774602711200714, 0.02565775252878666, 0.05270484834909439, 0.15460287034511566, 0.13951608538627625, -0.12984150648117065, -0.0465281717479229, 0.023885242640972137, 0.05420432984828949, -0.013564894907176495, 0.09541308134794235, -0.01061523798853159, 0.04044674336910248, 0.011566639877855778, -0.07467640936374664, -0.0007302407757379115, -0.06802811473608017, -0.09451230615377426, -0.051723118871450424, -0.0840379148721695, 0.037799667567014694, 0.04667230322957039, 0.014608001336455345, -0.07526197284460068, -0.11941394954919815, 0.0475442074239254, 0.12449027597904205, -0.027170686051249504, -0.0027944198809564114, -0.07723192870616913, 0.07558751851320267, -0.06768979877233505, -0.03819160908460617, -0.19673004746437073, -0.10669281333684921, 0.050413645803928375, -0.020216738805174828, 0.02193252369761467, 0.009760758839547634, 0.056047506630420685, 0.05418340861797333, -0.04540398716926575, -0.028689203783869743, -0.10185658186674118, -0.004302864894270897, -0.10246513783931732, -0.11811137944459915, -0.07059666514396667, -0.031109105795621872, 0.1997961699962616, -0.1882106512784958, -0.0031670229509472847, -0.02156880684196949, 0.12274178117513657, 0.015226608142256737, -0.07348321378231049, 0.016060665249824524, 0.028270984068512917, -0.007285107392817736, -0.08638926595449448, 0.04291735962033272, 0.014514072798192501, -0.09283286333084106, -0.08258698135614395, -0.12270410358905792, 0.06368572264909744, 0.07681547105312347, 0.07569693773984909, -0.07198482751846313, -0.0052195023745298386, -0.05473353713750839, -0.03811606764793396, -0.0637289509177208, -0.036262668669223785, 0.1836850345134735, 0.01904957741498947, 0.11804774403572083, -0.05549995228648186, -0.045099057257175446, 0.014774305745959282, 0.0010896429885178804, -0.028183704242110252, 0.0656147375702858, 0.018098782747983932, -0.17156793177127838, 0.09699098765850067, 0.09258780628442764, -0.02410168945789337, 0.10262930393218994, -0.029322508722543716, -0.09439484030008316, -0.04898978769779205, 0.01789555698633194, -0.011347319930791855, 0.12881311774253845, -0.09437449276447296, 0.017505938187241554, 0.07050686329603195, 0.006461180280894041, 0.011899583041667938, -0.1501428186893463, -0.01362695824354887, 0.04115941748023033, -0.02763659693300724, -0.013333957642316818, -0.015999866649508476, 0.0064969793893396854, 0.07271334528923035, 0.04307246953248978, -0.02858423814177513, 0.026921480894088745, -0.020713023841381073, -0.06664930284023285, 0.15926413238048553, -0.09714826941490173, -0.19505251944065094, -0.15228378772735596, 0.03972894325852394, -0.0741698369383812, -0.003900191280990839, 0.02036878652870655, -0.026639066636562347, -0.05915769562125206, -0.09088289737701416, -0.04536191374063492, -0.055401649326086044, -0.02531519904732704, 0.056201376020908356, -0.00612922478467226, 0.09925011545419693, -0.12279502302408218, -0.007818290032446384, -0.006477625109255314, -0.02892283722758293, -0.010068351402878761, 0.04662831127643585, 0.1228455975651741, 0.053449343889951706, -0.017845043912529945, 0.020548224449157715, -0.02746087871491909, 0.28920048475265503, -0.07491982728242874, -0.023441297933459282, 0.15036143362522125, 0.007791259326040745, 0.07351977378129959, 0.11769475042819977, 0.020497171208262444, -0.08329831808805466, 0.02098037675023079, 0.04590538516640663, -0.021360188722610474, 
-0.21082007884979248, -0.04537959769368172, -0.039327144622802734, -0.09317667782306671, 0.11958391964435577, 0.03816072270274162, 0.025374015793204308, 0.08489573001861572, -0.022027142345905304, 0.04189969226717949, -0.05471370369195938, 0.09473203122615814, 0.12528693675994873, 0.04225882142782211, 0.10239479690790176, -0.027071252465248108, -0.02888801507651806, 0.058018989861011505, 0.01720600388944149, 0.22057922184467316, -0.03066447004675865, 0.14998427033424377, 0.01873408816754818, 0.18278159201145172, -0.05724279582500458, 0.029608575627207756, -0.003268096363171935, 0.009270807728171349, -0.010742217302322388, -0.05783995985984802, -0.0862152948975563, 0.036654308438301086, 0.024891017004847527, 0.06279708445072174, -0.0701923593878746, 0.027652747929096222, -0.004149057436734438, 0.23911117017269135, 0.06965295225381851, -0.32389527559280396, -0.09390357881784439, 0.019272470846772194, -0.007747546769678593, -0.08488763123750687, -0.023212160915136337, 0.10365733504295349, -0.13828419148921967, 0.06400827318429947, -0.04769080877304077, 0.09108567237854004, -0.052227552980184555, -0.004183951765298843, 0.032965369522571564, 0.06950631737709045, 0.007661731913685799, 0.10564414411783218, -0.1807417869567871, 0.20204952359199524, 0.02879204787313938, 0.09544768184423447, -0.08110030740499496, 0.04647831246256828, -0.010224499739706516, 0.047327496111392975, 0.12599284946918488, 0.006942343432456255, 0.010909594595432281, -0.17528210580348969, -0.09937731921672821, 0.018585314974188805, 0.0654127225279808, -0.06207571551203728, 0.08867169916629791, -0.04007377475500107, 0.010870283469557762, 0.02802426367998123, 0.01781069114804268, -0.10663168877363205, -0.1445155143737793, 0.049708593636751175, 0.03797393664717674, -0.034561678767204285, -0.06147954985499382, -0.08446425199508667, 0.004352892283350229, 0.16551297903060913, 0.04708118736743927, -0.0734758973121643, -0.12898418307304382, 0.0825544223189354, 0.14183709025382996, -0.07810662686824799, 0.012544852681457996, -0.0051991562359035015, 0.10342926532030106, 0.03069852478802204, -0.05752093344926834, 0.040394123643636703, -0.05665203556418419, -0.11900151520967484, -0.04709142819046974, 0.15719972550868988, 0.006930313538759947, 0.05634242296218872, 0.014066963456571102, 0.03902219980955124, -0.0258172620087862, -0.07432011514902115, 0.03492986783385277, 0.010837078094482422, 0.1128416433930397, 0.06112605333328247, -0.011315940879285336, 0.019715996459126472, -0.06187839433550835, -0.001907908241264522, 0.1448882669210434, 0.22991253435611725, -0.078251913189888, 0.05087967962026596, 0.043066129088401794, -0.055507123470306396, -0.13972903788089752, 0.002242245012894273, 0.11357315629720688, 0.02250855229794979, 0.09138107299804688, -0.14946794509887695, 0.06334247440099716, 0.08594664931297302, -0.02941952459514141, 0.05885114520788193, -0.29118648171424866, -0.10872369259595871, 0.06880831718444824, 0.10547976195812225, 0.05255232751369476, -0.13887766003608704, -0.06460745632648468, -0.024958955124020576, -0.19451548159122467, 0.11091280728578568, -0.016894740983843803, 0.10562854260206223, -0.010303531773388386, 0.10235493630170822, 0.04386473074555397, -0.04293375089764595, 0.17491364479064941, 0.025013985112309456, 0.04611654952168465, -0.07646261900663376, 0.017122475430369377, 0.11282256990671158, -0.06584510952234268, 0.1056065782904625, -0.036410853266716, 0.07093110680580139, -0.22534522414207458, -0.013959485106170177, -0.06834381818771362, 0.058753080666065216, -0.04500158131122589, -0.06227578967809677, 
-0.030336251482367516, 0.041718464344739914, 0.06349661201238632, -0.02804851531982422, 0.10750368982553482, 0.04341057315468788, 0.08288214355707169, 0.13594576716423035, 0.07054685056209564, -0.0031424611806869507, -0.15359242260456085, -0.01579960249364376, -0.018339894711971283, 0.0533955842256546, -0.10088666528463364, 0.018123917281627655, 0.11883104592561722, 0.03948073461651802, 0.13747917115688324, 0.0023411486763507128, -0.0547807477414608, 0.011971676722168922, 0.03934711590409279, -0.12512366473674774, -0.1619729995727539, -0.027579886838793755, -0.01579931750893593, -0.17096464335918427, -0.021143736317753792, 0.12360337376594543, -0.03797350451350212, -0.022351069375872612, -0.019650490954518318, 0.015411428175866604, -0.010603328235447407, 0.16814850270748138, 0.04288265481591225, 0.06500138342380524, -0.07279114425182343, 0.08662379533052444, 0.09736886620521545, -0.06992194801568985, 0.059099454432725906, 0.010887667536735535, -0.08950648456811905, -0.02875717729330063, 0.010375721380114555, 0.09571938961744308, -0.03911847621202469, -0.028034774586558342, -0.07217594236135483, -0.04162092134356499, 0.03570700064301491, 0.0015323153929784894, 0.05974582955241203, -0.021797791123390198, -0.024425409734249115, -0.009351478889584541, -0.12310732156038284, 0.1048179343342781, 0.03508894145488739, 0.06637462228536606, -0.15029136836528778, 0.036913756281137466, -0.004721559584140778, 0.04987120255827904, -0.007970320992171764, 0.007208783179521561, -0.0472169853746891, -0.01361397560685873, -0.10391132533550262, 0.0031016110442578793, -0.0378100611269474, 0.0035065675619989634, -0.026289332658052444, -0.08478861302137375, -0.046498868614435196, 0.05131009966135025, -0.06010354310274124, -0.07942616194486618, 0.025664014741778374, 0.05011991411447525, -0.14125105738639832, -0.03983643278479576, 0.03241194784641266, -0.07839766889810562, 0.08193914592266083, 0.06420259922742844, 0.02812369540333748, -0.014795132912695408, -0.011467923410236835, -0.004826825577765703, 0.017684344202280045, 0.040670983493328094, 0.0595838688313961, -0.11063972115516663, -0.0076242247596383095, -0.015337790362536907, 0.022451262921094894, 0.018311185762286186, 0.09451846778392792, -0.13634534180164337, -0.059486810117959976, -0.02501215599477291, -0.047914858907461166, -0.05735950544476509, 0.05127134174108505, 0.1047302857041359, 0.03601311519742012, 0.18125519156455994, -0.0445995070040226, 0.046341218054294586, -0.202448308467865, -0.031873490661382675, -0.012227502651512623, -0.03442743420600891, -0.060693204402923584, -0.03991270065307617, 0.07194692641496658, -0.05298585444688797, 0.08693065494298935, -0.0019342591986060143, 0.14175042510032654, 0.036893006414175034, 0.01892504096031189, 0.038134124130010605, -0.025202831253409386, 0.16598272323608398, 0.05704612284898758, -0.02816549688577652, 0.10315778106451035, -0.01609352044761181, 0.05330852419137955, 0.06135361269116402, 0.1047968789935112, 0.14429505169391632, 0.03523479402065277, 0.05000097304582596, 0.04935755208134651, -0.07478155195713043, -0.17772896587848663, 0.03118365816771984, 0.00347670610062778, 0.11347631365060806, -0.02370854839682579, 0.14604245126247406, 0.0677025243639946, -0.1685027778148651, 0.05906638503074646, -0.0715416669845581, -0.10269585996866226, -0.06799939274787903, -0.09388766437768936, -0.0718308836221695, -0.08642902225255966, 0.021620672196149826, -0.12100569903850555, 0.029322408139705658, 0.11455925554037094, -0.022521955892443657, -0.02363002859055996, 0.15608108043670654, -0.033251360058784485, 
-0.014709323644638062, 0.037416666746139526, 0.0027305064722895622, -0.00013998353097122163, -0.047274600714445114, -0.03135039657354355, 0.05965637415647507, 0.03286262974143028, 0.0969080775976181, -0.04213530942797661, 0.0130495335906744, 0.02192368544638157, -0.0014434385811910033, -0.09253515303134918, -0.0029706789646297693, 0.0335937924683094, 0.04011378809809685, 0.06159190461039543, 0.041488341987133026, 0.0387423075735569, -0.04502439498901367, 0.26834991574287415, -0.051100071519613266, -0.05604163929820061, -0.13030584156513214, 0.1303684562444687, 0.033279404044151306, -0.017104683443903923, 0.0714004710316658, -0.11459536850452423, 0.008545825257897377, 0.12613503634929657, 0.11937254667282104, -0.07218072563409805, -0.02697061188519001, -0.00010009801189880818, -0.008835062384605408, -0.04265473410487175, 0.07584788650274277, 0.086858831346035, 0.01621445082128048, -0.06986390799283981, 0.019767550751566887, -0.031272586435079575, -0.020725812762975693, -0.07518263906240463, 0.050809357315301895, 0.017301693558692932, 0.02005256712436676, -0.030949924141168594, 0.06518637388944626, 0.042813122272491455, -0.18012332916259766, 0.02695409208536148, -0.20084840059280396, -0.1926087737083435, -0.007655291818082333, 0.08394607156515121, 0.004910735413432121, 0.0610475167632103, -0.010523962788283825, 0.017759470269083977, 0.12621043622493744, -0.018292412161827087, -0.07550592720508575, -0.10051263868808746, 0.09827134013175964, -0.07320279628038406, 0.21962521970272064, -0.004902072716504335, 0.09368821978569031, 0.10663988441228867, 0.00867486372590065, -0.15181531012058258, 0.041350144892930984, 0.09153426438570023, -0.010786781087517738, 0.05488459765911102, 0.15271826088428497, -0.026597628369927406, 0.10441247373819351, 0.04994337633252144, -0.11395075172185898, -0.044318459928035736, -0.01734357513487339, 0.025740329176187515, -0.061163343489170074, -0.01549396850168705, -0.07675778865814209, 0.17114080488681793, 0.18076421320438385, -0.0575045570731163, -0.022808609530329704, -0.07201481610536575, 0.023018132895231247, 0.07066389173269272, 0.09023807942867279, -0.04897177219390869, -0.17766804993152618, 0.009192665107548237, 0.03549573943018913, 0.03324844688177109, -0.24875067174434662, -0.09618932753801346, 0.04891810566186905, -0.05232687667012215, -0.017827149480581284, 0.09893286228179932, 0.025359364226460457, 0.015382939018309116, -0.04315383732318878, -0.08656122535467148, -0.06577692925930023, 0.1286529302597046, -0.14182133972644806, -0.061805613338947296 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# google_electra-base-discriminator_squad

This model is a fine-tuned version of [google/electra-base-discriminator](https://huggingface.co/google/electra-base-discriminator) on the **squadV1** dataset.

- "eval_exact_match": 85.33585619678335
- "eval_f1": 91.97363450387108
- "eval_samples": 10784

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

### Framework versions

- Transformers 4.14.1
- Pytorch 1.9.0
- Datasets 1.16.1
- Tokenizers 0.10.3
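As a sketch only, the hyperparameters listed above map onto the `Trainer` API roughly as follows; the SQuAD feature preprocessing is omitted, and `train_dataset`/`eval_dataset` are placeholders rather than part of the original card:

```python
# Rough mapping of the listed hyperparameters onto TrainingArguments.
# Dataset preparation (tokenizing SQuAD into start/end-position features) is
# omitted; train_dataset / eval_dataset below are placeholders.
from transformers import (
    AutoModelForQuestionAnswering,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
    default_data_collator,
)

base_model = "google/electra-base-discriminator"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForQuestionAnswering.from_pretrained(base_model)

args = TrainingArguments(
    output_dir="google_electra-base-discriminator_squad",
    learning_rate=3e-5,
    per_device_train_batch_size=16,   # "train_batch_size: 16"
    per_device_eval_batch_size=32,    # "eval_batch_size: 32"
    num_train_epochs=3.0,
    seed=42,
    lr_scheduler_type="linear",       # Adam betas/epsilon are the defaults shown
)

# trainer = Trainer(
#     model=model,
#     args=args,
#     train_dataset=train_dataset,    # placeholder: preprocessed SQuAD train split
#     eval_dataset=eval_dataset,      # placeholder: preprocessed validation split
#     tokenizer=tokenizer,
#     data_collator=default_data_collator,
# )
# trainer.train()
```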
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "google_electra-base-discriminator_squad", "results": []}]}
question-answering
Palak/google_electra-base-discriminator_squad
[ "transformers", "pytorch", "electra", "question-answering", "generated_from_trainer", "dataset:squad", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #electra #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
# google_electra-base-discriminator_squad

This model is a fine-tuned version of google/electra-base-discriminator on the squadV1 dataset.

- "eval_exact_match": 85.33585619678335
- "eval_f1": 91.97363450387108
- "eval_samples": 10784

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

### Framework versions

- Transformers 4.14.1
- Pytorch 1.9.0
- Datasets 1.16.1
- Tokenizers 0.10.3
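For reference, a small sketch of what the SQuAD v1 data used throughout these cards looks like, assuming the `datasets` library is installed and can download the dataset:

```python
# Sketch: inspect the SQuAD v1 dataset referenced by these model cards.
# Assumes the `datasets` library is installed and has network access.
from datasets import load_dataset

squad = load_dataset("squad")
print(squad)                      # DatasetDict with "train" and "validation" splits
example = squad["validation"][0]
print(example["question"])
print(example["answers"])         # {"text": [...], "answer_start": [...]}
```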
[ "# google_electra-base-discriminator_squad\n\nThis model is a fine-tuned version of google/electra-base-discriminator on the squadV1 dataset.\n- \"eval_exact_match\": 85.33585619678335\n- \"eval_f1\": 91.97363450387108\n- \"eval_samples\": 10784", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #electra #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n", "# google_electra-base-discriminator_squad\n\nThis model is a fine-tuned version of google/electra-base-discriminator on the squadV1 dataset.\n- \"eval_exact_match\": 85.33585619678335\n- \"eval_f1\": 91.97363450387108\n- \"eval_samples\": 10784", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ 51, 89, 6, 12, 8, 3, 90, 4, 31 ]
[ "passage: TAGS\n#transformers #pytorch #electra #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# google_electra-base-discriminator_squad\n\nThis model is a fine-tuned version of google/electra-base-discriminator on the squadV1 dataset.\n- \"eval_exact_match\": 85.33585619678335\n- \"eval_f1\": 91.97363450387108\n- \"eval_samples\": 10784## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ -0.0835874006152153, 0.16365587711334229, -0.0033156995195895433, 0.08424162119626999, 0.1519879400730133, 0.02310379408299923, 0.08472850173711777, 0.14175328612327576, -0.06940536946058273, 0.10275336354970932, 0.10577236860990524, 0.09833233058452606, 0.056144993752241135, 0.13220533728599548, -0.035997096449136734, -0.22591786086559296, 0.01799006201326847, 0.01830470561981201, -0.048897746950387955, 0.1069173514842987, 0.12482352554798126, -0.08601023256778717, 0.09527648985385895, 0.0499536469578743, -0.13962307572364807, 0.020282424986362457, -0.023444632068276405, -0.07005700469017029, 0.09036248177289963, 0.016037795692682266, 0.05184099078178406, -0.008404701016843319, 0.06134965270757675, -0.20758946239948273, -0.005826943553984165, 0.048156216740608215, 0.019633065909147263, 0.09612857550382614, 0.029185211285948753, 0.017869982868433, 0.03803318738937378, -0.1363634616136551, 0.07691270112991333, 0.039235278964042664, -0.09434117376804352, -0.18300627171993256, -0.09099894016981125, 0.07880807667970657, 0.07808844745159149, 0.10150053352117538, 0.004370281007140875, 0.20279045403003693, -0.0027417524252086878, 0.07542680948972702, 0.2183637022972107, -0.29640376567840576, -0.05449662357568741, 0.016458986327052116, 0.08033528178930283, 0.03312639892101288, -0.07675066590309143, 0.0030051975045353174, 0.03847716748714447, 0.03327497839927673, 0.10278691351413727, -0.017901325598359108, -0.0330287404358387, -0.037304047495126724, -0.11098533868789673, -0.08190947026014328, 0.24615567922592163, 0.08562818914651871, -0.061791449785232544, -0.12317126989364624, -0.0843118280172348, -0.10644848644733429, -0.007231434807181358, -0.06656579673290253, 0.0398634597659111, -0.04025992751121521, -0.040011823177337646, -0.0757417231798172, -0.06689883023500443, -0.041504696011543274, -0.004959845915436745, 0.16023020446300507, 0.046403463929891586, 0.03286978229880333, -0.010231423191726208, 0.07931855320930481, -0.015675758942961693, -0.14878681302070618, -0.05561850219964981, 0.002680036239326, -0.0407547764480114, -0.04659860208630562, -0.03645869344472885, -0.014477808028459549, 0.02181839384138584, 0.20525085926055908, -0.053536079823970795, 0.06580149382352829, 0.045412126928567886, -0.010668721981346607, -0.010445579886436462, 0.17326760292053223, -0.036911770701408386, -0.025340791791677475, 0.03423368185758591, 0.08876843005418777, 0.046744778752326965, -0.007032797206193209, -0.08180009573698044, -0.038841914385557175, 0.11450077593326569, 0.05466698482632637, -0.00030587209039367735, 0.04604211077094078, -0.06811432540416718, -0.05166644603013992, 0.08019660413265228, -0.11203572899103165, 0.02965349145233631, -0.03904178366065025, -0.0887121632695198, -0.07101833075284958, 0.02070135436952114, 0.014554549939930439, -0.0620744489133358, 0.0036749292630702257, -0.09794405847787857, -0.015113004483282566, -0.04012703523039818, -0.04254565015435219, 0.0010696968529373407, -0.08383146673440933, 0.01787489838898182, -0.09031608700752258, -0.18132951855659485, -0.03312619403004646, 0.02178090438246727, -0.06010624021291733, -0.10324819386005402, 0.005169346928596497, -0.04997796565294266, 0.022495001554489136, -0.021019255742430687, 0.05230911821126938, -0.039988283067941666, 0.0561244934797287, 0.07745806872844696, 0.025317562744021416, 0.02961082197725773, 0.048917170614004135, -0.10078231990337372, 0.04726855829358101, -0.11097875237464905, 0.09826172888278961, -0.0713781863451004, 0.029611703008413315, -0.14167377352714539, -0.10925491899251938, 0.006011639721691608, 
-0.03955410420894623, 0.0735292136669159, 0.12512248754501343, -0.07815015316009521, -0.02599678561091423, 0.11168382316827774, -0.06393514573574066, -0.13698521256446838, 0.0861966609954834, -0.014598642475903034, 0.003233372000977397, 0.04889023303985596, 0.13482657074928284, 0.08210853487253189, -0.10388743877410889, -0.05369902774691582, -0.0003580374177545309, 0.005187686532735825, 0.016662877053022385, 0.0843706727027893, -0.017606481909751892, 0.0508943572640419, 0.016247399151325226, -0.051212187856435776, -0.009546498768031597, -0.07974431663751602, -0.06589215993881226, -0.08154463768005371, -0.05381638929247856, 0.04275662451982498, 0.022585352882742882, 0.033143073320388794, -0.08414937555789948, -0.1190466582775116, 0.0648937001824379, 0.1185876876115799, -0.03987552598118782, 0.007413392886519432, -0.09500838816165924, 0.08715581148862839, -0.06968826055526733, -0.02227991446852684, -0.20366109907627106, -0.12227029353380203, 0.08259550482034683, -0.10830441117286682, 0.024633076041936874, -0.01487218588590622, 0.04007146507501602, 0.050027694553136826, -0.023362549021840096, -0.04861088842153549, -0.09640514850616455, -0.022003700956702232, -0.08882346749305725, -0.10620678961277008, -0.06664568185806274, -0.008678167127072811, 0.134824737906456, -0.1601318120956421, 0.015055273659527302, 0.026777146384119987, 0.15398071706295013, 0.006047837436199188, -0.056005191057920456, 0.023073768243193626, -0.006657967809587717, -0.015760716050863266, -0.10523492097854614, 0.03705891594290733, 0.016048457473516464, -0.07898711413145065, -0.051127124577760696, -0.1169201135635376, 0.09473239630460739, 0.07981963455677032, 0.03536154702305794, -0.05459776520729065, 0.011928782798349857, -0.06382787227630615, -0.041423652321100235, -0.058093126863241196, -0.03144402429461479, 0.21234671771526337, -0.003949671983718872, 0.14235369861125946, -0.07716108113527298, -0.06293486058712006, 0.027272386476397514, -0.010532700456678867, -0.03841748088598251, 0.049053966999053955, -0.00011165614705532789, -0.1803622990846634, 0.1039297953248024, 0.06900826841592789, 0.0015413976507261395, 0.09968427568674088, -0.04760489985346794, -0.05957740545272827, -0.05684711039066315, 0.007479154039174318, -0.01183188147842884, 0.06034838780760765, -0.11695875972509384, 0.013397661969065666, 0.06256704777479172, 0.01676338165998459, 0.021911580115556717, -0.1378854662179947, 0.019605964422225952, 0.02985825017094612, -0.0514817014336586, 0.014594998210668564, 0.022219570353627205, 0.006081480532884598, 0.06459498405456543, 0.048847172409296036, -0.004592261742800474, 0.05035223066806793, -0.0055799041874706745, -0.07119088619947433, 0.16353750228881836, -0.10315646976232529, -0.20147529244422913, -0.16636639833450317, 0.06016319990158081, -0.09768353402614594, 0.0034931579139083624, 0.03987225890159607, -0.03631709888577461, -0.07347479462623596, -0.06260176002979279, -0.020358776673674583, -0.04368354007601738, -0.009343069046735764, 0.0904405266046524, -0.0021769963204860687, 0.11407028138637543, -0.13643041253089905, -0.01356614287942648, -0.00037757668178528547, -0.04718928411602974, -0.022654345259070396, 0.03736016899347305, 0.1118616834282875, 0.04320943355560303, -0.02658567950129509, 0.025898121297359467, -0.013990971259772778, 0.3064269721508026, -0.07530782371759415, -0.01720028556883335, 0.18310284614562988, 0.03185843676328659, 0.08234778046607971, 0.09937336295843124, 0.01356852613389492, -0.09288208931684494, 0.0229670200496912, 0.03721415996551514, -0.0069707646034657955, -0.2501371502876282, 
-0.03168854862451553, -0.023209543898701668, -0.09746988117694855, 0.1157156378030777, 0.06144176051020622, 0.0702255368232727, 0.08598791807889938, -0.04493211954832077, 0.06067937985062599, -0.0371595099568367, 0.11210514605045319, 0.2055482119321823, 0.05893625691533089, 0.09592613577842712, -0.023217175155878067, -0.044676974415779114, 0.05010802298784256, -0.00022357323905453086, 0.23387238383293152, 0.008056615479290485, 0.2094821035861969, 0.012735252268612385, 0.142069011926651, -0.011051308363676071, 0.027957333251833916, 0.02829146571457386, 0.01655544713139534, 0.007060416042804718, -0.043662942945957184, -0.05730484053492546, 0.024325519800186157, 0.011683857999742031, 0.07116366177797318, -0.06722716242074966, 0.0560913011431694, 0.01874902844429016, 0.306845098733902, 0.024504927918314934, -0.327548086643219, -0.09210933744907379, 0.007085014134645462, -0.04063359647989273, -0.09578996896743774, -0.014451230876147747, 0.07026361674070358, -0.15408311784267426, 0.08274323493242264, -0.05903994292020798, 0.08954901993274689, -0.07560811936855316, 0.0036827425938099623, 0.06992173939943314, 0.09873872995376587, 0.012321694754064083, 0.09172854572534561, -0.16241610050201416, 0.1887284815311432, 0.02491815760731697, 0.07315455377101898, -0.07021977007389069, 0.06635573506355286, -0.010446835309267044, 0.06599092483520508, 0.11504709720611572, -0.001490833587013185, -0.028830362483859062, -0.17030897736549377, -0.10373331606388092, 0.01255452074110508, 0.10323825478553772, -0.08658584952354431, 0.0823725163936615, -0.05825089290738106, 0.0011342857033014297, 0.011337011121213436, -0.0349058173596859, -0.12331972271203995, -0.11725283414125443, 0.04166826233267784, 0.032818760722875595, 0.03360139578580856, -0.0673827975988388, -0.09370117634534836, -0.03002382628619671, 0.17105621099472046, -0.007679530419409275, -0.09168686717748642, -0.13559824228286743, 0.08216864615678787, 0.1417551040649414, -0.0860619992017746, 0.05905858054757118, -0.019151104614138603, 0.1335211843252182, 0.05672717094421387, -0.06821729987859726, 0.06324609369039536, -0.051093246787786484, -0.16721764206886292, -0.026214992627501488, 0.15845267474651337, -0.023359335958957672, 0.03237702697515488, 0.009981959126889706, 0.049970995634794235, -0.01621953211724758, -0.08149715512990952, 0.027422644197940826, -0.003969177138060331, 0.10455760359764099, 0.06400144845247269, -0.004468949511647224, 0.058150455355644226, -0.0524279810488224, -0.004555927589535713, 0.12893731892108917, 0.22549892961978912, -0.0749966949224472, 0.021306853741407394, 0.028747864067554474, -0.05559094622731209, -0.12123855203390121, -0.0035261940211057663, 0.12162340432405472, 0.007731043733656406, 0.06708476692438126, -0.15818506479263306, 0.07716036587953568, 0.10079974681138992, -0.03385206684470177, 0.053661104291677475, -0.26964789628982544, -0.12849432229995728, 0.03935166448354721, 0.0971466675400734, 0.03321608901023865, -0.13960053026676178, -0.07989641278982162, -0.026591328904032707, -0.16520418226718903, 0.07631805539131165, 0.019104918465018272, 0.11788635700941086, -0.03668231889605522, 0.059901509433984756, 0.030597049742937088, -0.028299102559685707, 0.1615254133939743, 0.032596755772829056, 0.06681783497333527, -0.06838350743055344, 0.01709824427962303, 0.11503196507692337, -0.058477964252233505, 0.10574766993522644, -0.019121386110782623, 0.09446673840284348, -0.1569983959197998, -0.028191598132252693, -0.047759413719177246, 0.07128283381462097, -0.041886430233716965, -0.03418099880218506, -0.04121563956141472, 
0.002438740339130163, 0.04490109160542488, -0.020505985245108604, 0.14872008562088013, 0.0662679597735405, 0.05702870339155197, 0.14065751433372498, 0.08150038123130798, -0.016200270503759384, -0.19241946935653687, -0.019047891721129417, -0.009212372824549675, 0.05168396979570389, -0.1055007055401802, 0.040727946907281876, 0.10830486565828323, 0.03429267928004265, 0.12611117959022522, 0.017068877816200256, -0.049579136073589325, 0.01583212986588478, 0.02131623402237892, -0.11332272738218307, -0.19214199483394623, -0.04960703104734421, -0.07139534503221512, -0.13452523946762085, 0.027558112516999245, 0.11965972930192947, -0.02359379641711712, -0.031569402664899826, -0.014706525951623917, -0.004451550077646971, 0.01844405196607113, 0.1515154391527176, 0.04045141860842705, 0.07462964206933975, -0.09100561589002609, 0.11170637607574463, 0.0851399376988411, -0.04324033111333847, 0.06069035828113556, 0.03577662259340286, -0.09301778674125671, -0.017782391980290413, 0.011196689680218697, 0.10416076332330704, -0.049540795385837555, -0.0415787473320961, -0.09797213971614838, -0.06613276898860931, 0.04911798611283302, 0.03467171639204025, 0.07347448170185089, -0.004692393355071545, -0.01803356036543846, -0.003634154098108411, -0.11053846776485443, 0.0984368771314621, 0.02897082082927227, 0.07371580600738525, -0.19507159292697906, 0.00933837704360485, 0.01896691508591175, 0.06339545547962189, -0.014288049191236496, -0.006452007219195366, -0.058758657425642014, -0.03369244188070297, -0.08751834183931351, 0.013801402412354946, -0.041207876056432724, 0.005533222574740648, -0.025232413783669472, -0.07737649232149124, -0.062029894441366196, 0.07461215555667877, -0.0513032041490078, -0.0916399210691452, 0.029197651892900467, 0.06012299656867981, -0.12180328369140625, -0.021551433950662613, 0.03821241110563278, -0.0994001179933548, 0.1172296404838562, 0.06380792707204819, 0.026743484660983086, -0.024450700730085373, 0.005497962236404419, 0.005387585144490004, 0.04467465355992317, 0.030410893261432648, 0.04335767775774002, -0.12736324965953827, 0.012126925401389599, -0.01393747329711914, 0.020215200260281563, -0.0054909889586269855, 0.05873030051589012, -0.14782069623470306, -0.06698823720216751, -0.04908006638288498, -0.03322305157780647, -0.06698904186487198, 0.02207512967288494, 0.12221177667379379, 0.0009353803470730782, 0.18990427255630493, -0.0473158024251461, 0.032210420817136765, -0.21187888085842133, -0.012731778435409069, -0.012854713015258312, -0.07113970071077347, -0.08317385613918304, -0.0384368896484375, 0.059396181255578995, -0.05781612545251846, 0.1181453987956047, -0.01614665426313877, 0.13897718489170074, 0.04384732246398926, 0.043926820158958435, 0.004181770607829094, -0.008076141588389874, 0.14662662148475647, 0.05924435332417488, -0.01946171559393406, 0.1303855925798416, -0.00231460714712739, 0.07436911761760712, -0.0025950518902391195, 0.06309417635202408, 0.11997663974761963, -0.028766613453626633, 0.07623423635959625, 0.076929472386837, -0.07179998606443405, -0.19125612080097198, 0.0656699389219284, -0.011864806525409222, 0.11612311005592346, -0.007067133206874132, 0.08741859346628189, 0.08897572755813599, -0.14767113327980042, 0.04454995319247246, -0.03604523092508316, -0.08514309674501419, -0.08762427419424057, -0.09438144415616989, -0.09362481534481049, -0.14050966501235962, 0.02909744530916214, -0.11938510090112686, 0.02104419283568859, 0.08272134512662888, -0.031032297760248184, -0.03615597262978554, 0.15148437023162842, -0.018802499398589134, -0.029460454359650612, 
0.035801105201244354, -0.008189301937818527, -0.018519196659326553, -0.028323782607913017, -0.064721018075943, 0.04290136322379112, 0.03151080757379532, 0.08881670981645584, -0.021694492548704147, 0.00415834691375494, 0.023347537964582443, -0.03406483307480812, -0.09074795246124268, -0.0011828585993498564, 0.04390783607959747, -0.005284218583256006, 0.0019601909443736076, 0.028400542214512825, 0.011037226766347885, -0.03809522092342377, 0.23210258781909943, -0.03834492713212967, -0.029935792088508606, -0.12633034586906433, 0.08927237242460251, 0.06510990858078003, -0.008052081800997257, 0.08432020992040634, -0.1041824147105217, -0.017361802980303764, 0.1285923421382904, 0.09321576356887817, -0.011272688396275043, -0.024307731539011, 0.0010570385493338108, -0.023208921775221825, -0.024126341566443443, 0.05844836309552193, 0.10612893849611282, -0.02578258141875267, -0.04891091212630272, 0.008287021890282631, -0.017136745154857635, -0.015749281272292137, -0.0853036642074585, 0.056159473955631256, 0.008859416469931602, 0.02546640671789646, -0.020762575790286064, 0.0485336035490036, 0.02176825702190399, -0.12330611050128937, 0.04820670187473297, -0.18076665699481964, -0.16700732707977295, -0.004167374689131975, 0.052518121898174286, 0.002696459414437413, 0.06529746204614639, 0.0208195298910141, -0.020794274285435677, 0.128919318318367, -0.02115701511502266, -0.11037494987249374, -0.08631580322980881, 0.08610624074935913, -0.12235628813505173, 0.20790405571460724, -0.006491129286587238, 0.1011597141623497, 0.1154521033167839, 0.0026351024862378836, -0.16287323832511902, 0.03221280127763748, 0.06432473659515381, -0.026096394285559654, 0.058412373065948486, 0.13169075548648834, 0.0017587970942258835, 0.0584028996527195, 0.03972197696566582, -0.11355522274971008, -0.054524049162864685, -0.025252148509025574, 0.04071500524878502, -0.08116737753152847, -0.01177140511572361, -0.07878382503986359, 0.1516629457473755, 0.1800089329481125, -0.055050816386938095, -0.03065846674144268, -0.05769086629152298, 0.035884786397218704, 0.07670719921588898, 0.07493439316749573, -0.02155027538537979, -0.22344429790973663, 0.02749982476234436, 0.03543349727988243, 0.015838496387004852, -0.17850543558597565, -0.11692951619625092, 0.026240432634949684, -0.05937591567635536, -0.03453051671385765, 0.0800737589597702, 0.037447985261678696, 0.02315417490899563, -0.03336932882666588, -0.07628554850816727, -0.07507238537073135, 0.1512862890958786, -0.16449573636054993, -0.059224847704172134 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# google_electra-small-discriminator_squad

This model is a fine-tuned version of [google/electra-small-discriminator](https://huggingface.co/google/electra-small-discriminator) on the **squadV1** dataset.
- "eval_exact_match": 76.95364238410596
- "eval_f1": 84.98869246841396
- "eval_samples": 10784

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

### Framework versions

- Transformers 4.14.1
- Pytorch 1.9.0
- Datasets 1.16.1
- Tokenizers 0.10.3
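A minimal inference sketch (not part of the original card): the fine-tuned checkpoint can be queried through the `question-answering` pipeline of the same `transformers` library. The Hub id matches this repository; the question/context pair is an arbitrary illustration.

```python
from transformers import pipeline

# Load the fine-tuned ELECTRA-small QA checkpoint from the Hub
# (the tokenizer is pulled from the same repository by default).
qa = pipeline(
    "question-answering",
    model="Palak/google_electra-small-discriminator_squad",
)

# Ask a SQuAD-style extractive question against a short context.
result = qa(
    question="Which dataset was the model fine-tuned on?",
    context="This checkpoint was produced by fine-tuning ELECTRA-small on the SQuAD v1 dataset.",
)
print(result["answer"], round(result["score"], 3))
```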
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "google_electra-small-discriminator_squad", "results": []}]}
question-answering
Palak/google_electra-small-discriminator_squad
[ "transformers", "pytorch", "electra", "question-answering", "generated_from_trainer", "dataset:squad", "license:apache-2.0", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #electra #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
# google_electra-small-discriminator_squad This model is a fine-tuned version of google/electra-small-discriminator on the squadV1 dataset. - "eval_exact_match": 76.95364238410596 - "eval_f1": 84.98869246841396 - "eval_samples": 10784 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 3e-05 - train_batch_size: 16 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 ### Training results ### Framework versions - Transformers 4.14.1 - Pytorch 1.9.0 - Datasets 1.16.1 - Tokenizers 0.10.3
[ "# google_electra-small-discriminator_squad\n\nThis model is a fine-tuned version of google/electra-small-discriminator on the squadV1 dataset.\n\n- \"eval_exact_match\": 76.95364238410596\n- \"eval_f1\": 84.98869246841396\n- \"eval_samples\": 10784", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #electra #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n", "# google_electra-small-discriminator_squad\n\nThis model is a fine-tuned version of google/electra-small-discriminator on the squadV1 dataset.\n\n- \"eval_exact_match\": 76.95364238410596\n- \"eval_f1\": 84.98869246841396\n- \"eval_samples\": 10784", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ 51, 91, 6, 12, 8, 3, 90, 4, 31 ]
[ "passage: TAGS\n#transformers #pytorch #electra #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# google_electra-small-discriminator_squad\n\nThis model is a fine-tuned version of google/electra-small-discriminator on the squadV1 dataset.\n\n- \"eval_exact_match\": 76.95364238410596\n- \"eval_f1\": 84.98869246841396\n- \"eval_samples\": 10784## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ -0.08905800431966782, 0.1570902168750763, -0.0028707312885671854, 0.0874367505311966, 0.16004687547683716, 0.03226250782608986, 0.0830308049917221, 0.1500205397605896, -0.0614442303776741, 0.11217807978391647, 0.09631934762001038, 0.06708376854658127, 0.06304025650024414, 0.12310000509023666, -0.026264606043696404, -0.23839698731899261, 0.0036468207836151123, 0.015759475529193878, -0.08141275495290756, 0.0949467346072197, 0.12244308739900589, -0.08656692504882812, 0.09878026694059372, 0.03290070593357086, -0.14313019812107086, 0.025353997945785522, -0.016385341063141823, -0.07082325220108032, 0.07956086844205856, 0.023632816970348358, 0.047463659197092056, -0.01352517120540142, 0.08002680540084839, -0.199302539229393, -0.00798456184566021, 0.0498834028840065, 0.03235519677400589, 0.09861480444669724, 0.04740506410598755, 0.041718292981386185, 0.061747997999191284, -0.15280728042125702, 0.06115240976214409, 0.04703991860151291, -0.08636713773012161, -0.1702113300561905, -0.08918113261461258, 0.0816233828663826, 0.07175078243017197, 0.1047842875123024, 0.007761801593005657, 0.18385058641433716, -0.02215263433754444, 0.0658208578824997, 0.19839562475681305, -0.27356576919555664, -0.06290506571531296, 0.04206879064440727, 0.0786377489566803, 0.02718372642993927, -0.084629587829113, 0.00280544301494956, 0.04239249229431152, 0.03839869052171707, 0.0687340497970581, -0.020505433902144432, -0.059910327196121216, -0.022129664197564125, -0.10132298618555069, -0.07222135365009308, 0.2483268529176712, 0.08614663034677505, -0.05292792618274689, -0.12226056307554245, -0.08116810023784637, -0.09439551085233688, -0.009019764140248299, -0.05889369919896126, 0.04222200810909271, -0.03863261640071869, -0.05385056138038635, -0.07758262753486633, -0.0718565434217453, -0.05265900492668152, -0.0063146562315523624, 0.14026875793933868, 0.036235637962818146, 0.03726666793227196, -0.011447488330304623, 0.0746910348534584, 0.002240804024040699, -0.14566335082054138, -0.04818767309188843, 0.0007694557425566018, -0.05204373598098755, -0.04384568706154823, -0.03853878751397133, 0.012089528143405914, 0.03296350687742233, 0.16452838480472565, -0.08001235127449036, 0.06346628069877625, 0.057455290108919144, -0.008968544192612171, -0.015614357776939869, 0.1624578833580017, -0.05562363564968109, -0.023802561685442924, 0.026546338573098183, 0.09315969794988632, 0.0363946333527565, 0.006163863465189934, -0.07911188155412674, -0.06551451981067657, 0.09581002593040466, 0.050300564616918564, 0.006443007383495569, 0.04374051094055176, -0.052875082939863205, -0.050599683076143265, 0.04907640442252159, -0.11145246028900146, 0.035245053470134735, -0.03046354465186596, -0.09522747993469238, -0.050513532012701035, 0.024156175553798676, 0.002399974735453725, -0.059006307274103165, 0.010508685372769833, -0.09940820932388306, -0.010780137032270432, -0.04491807147860527, -0.029850512742996216, 0.0111971041187644, -0.06021665036678314, 0.009565912187099457, -0.08302827179431915, -0.19040577113628387, -0.03765653446316719, 0.016414768993854523, -0.06261758506298065, -0.10922854393720627, 0.012562687508761883, -0.05906904116272926, 0.017118282616138458, -0.017564069479703903, 0.07970497012138367, -0.03317679837346077, 0.0644005760550499, 0.07833791524171829, 0.02742520533502102, 0.02224438264966011, 0.04603792354464531, -0.09352044016122818, 0.03877940773963928, -0.11081977933645248, 0.10113800317049026, -0.07700930535793304, 0.024658415466547012, -0.13148289918899536, -0.11268623173236847, -0.01635035127401352, -0.026770438998937607, 
0.06897897273302078, 0.14240506291389465, -0.09456001222133636, -0.029706314206123352, 0.11205790936946869, -0.0625845193862915, -0.12311694771051407, 0.08553995192050934, -0.006718337535858154, -0.00666003068909049, 0.046967558562755585, 0.11921775341033936, 0.10614904761314392, -0.08432459831237793, -0.0680854395031929, 0.019500695168972015, 0.014617915265262127, 0.005678488872945309, 0.08332191407680511, -0.02460356429219246, 0.033466044813394547, 0.025219108909368515, -0.05774928256869316, 0.0018537761643528938, -0.08728235214948654, -0.06359530240297318, -0.09422457963228226, -0.05459413677453995, 0.03264936804771423, 0.03923898562788963, 0.03668684512376785, -0.07726114243268967, -0.10833224654197693, 0.07022376358509064, 0.1277714967727661, -0.03015335462987423, -0.0057013374753296375, -0.0835653692483902, 0.0692782923579216, -0.06345073878765106, -0.021646348759531975, -0.20561598241329193, -0.1245885044336319, 0.07974895089864731, -0.09927764534950256, 0.02249889448285103, 0.01172960177063942, 0.05112581327557564, 0.038923658430576324, -0.03598133847117424, -0.037071388214826584, -0.09397957473993301, -0.03331274539232254, -0.08129656314849854, -0.10557747632265091, -0.060676880180835724, -0.001236963551491499, 0.12555274367332458, -0.18108844757080078, 0.008328805677592754, 0.015286569483578205, 0.16040880978107452, 0.0010756616247817874, -0.05843668803572655, 0.026891618967056274, -0.005602017976343632, -0.013238686136901379, -0.1101711168885231, 0.03640206530690193, 0.014016811735928059, -0.06661443412303925, -0.057739682495594025, -0.1046096459031105, 0.07924668490886688, 0.07848528027534485, 0.033041052520275116, -0.055698320269584656, 0.016061754897236824, -0.07471489161252975, -0.04432389885187149, -0.0687992200255394, -0.0366823635995388, 0.18967413902282715, -0.008815707638859749, 0.13120096921920776, -0.07530742138624191, -0.057734061032533646, 0.02828104794025421, 0.001428597024641931, -0.027919186279177666, 0.048845645040273666, 0.02962290495634079, -0.15474526584148407, 0.10651399195194244, 0.04508686438202858, -0.007262760773301125, 0.11559522151947021, -0.06676540523767471, -0.07081106305122375, -0.05017875134944916, 0.02616950497031212, -0.017441026866436005, 0.07265014946460724, -0.11184558272361755, 0.022815847769379616, 0.06127085164189339, 0.015677761286497116, 0.032784853130578995, -0.13451573252677917, 0.015522359870374203, 0.013117874972522259, -0.0614793598651886, 0.022521568462252617, 0.024362819269299507, 0.022732365876436234, 0.07337391376495361, 0.0434986874461174, 0.007871964015066624, 0.04406168684363365, -0.008318337611854076, -0.07537122070789337, 0.16931046545505524, -0.0957556813955307, -0.19911299645900726, -0.17878220975399017, 0.09634031355381012, -0.09411376714706421, 0.005127758253365755, 0.017492227256298065, -0.04456092417240143, -0.0625654011964798, -0.06010158360004425, -0.00016530744323972613, -0.049830563366413116, -0.0025558455381542444, 0.08621612191200256, 0.02195941098034382, 0.12679670751094818, -0.13051532208919525, -0.002012986224144697, -0.001838618889451027, -0.0720624104142189, -0.03184693306684494, 0.04321448877453804, 0.09621234238147736, 0.0429096557199955, -0.025580866262316704, 0.02181067503988743, -0.014118296094238758, 0.2821952700614929, -0.0704876109957695, -0.012296538800001144, 0.18352381885051727, 0.05121704936027527, 0.0730665996670723, 0.10573094338178635, 0.016710201278328896, -0.0820976197719574, 0.021581079810857773, 0.041114676743745804, 0.003940591588616371, -0.2523835003376007, -0.0395299457013607, 
-0.027563564479351044, -0.0759449154138565, 0.1165502518415451, 0.0646262913942337, 0.034072451293468475, 0.09174303710460663, -0.05380155146121979, 0.07744859904050827, -0.05580712482333183, 0.09470337629318237, 0.20566651225090027, 0.054767899215221405, 0.09164562076330185, -0.030998026952147484, -0.04919842258095741, 0.06514597684144974, -0.017285538837313652, 0.24796782433986664, 0.0017228276701644063, 0.17884822189807892, 0.005216550547629595, 0.13253693282604218, -0.012225808575749397, 0.031686585396528244, 0.04055111110210419, 0.012651056051254272, 0.011505316011607647, -0.04960324242711067, -0.05396081134676933, 0.01515798270702362, 0.022894524037837982, 0.08521002531051636, -0.07392822951078415, 0.07032697647809982, 0.009169462136924267, 0.3016006052494049, 0.028054730966687202, -0.3156804144382477, -0.09466055035591125, 0.013624480925500393, -0.04060608893632889, -0.10033638030290604, -0.006076067220419645, 0.11299111694097519, -0.15324431657791138, 0.07517658174037933, -0.05793284252285957, 0.08322232216596603, -0.06002095341682434, 0.003038903698325157, 0.0776175707578659, 0.10608089715242386, 0.023183736950159073, 0.09821911156177521, -0.16308841109275818, 0.18635599315166473, 0.022862650454044342, 0.07326234877109528, -0.06339508295059204, 0.0788472443819046, 0.0000458825052191969, 0.057473424822092056, 0.11444041132926941, 0.0025005810894072056, -0.05423639714717865, -0.16701781749725342, -0.09994608908891678, 0.028195828199386597, 0.10960560292005539, -0.08303026854991913, 0.0827188640832901, -0.06315294653177261, 0.003309995401650667, 0.01596265286207199, -0.050004325807094574, -0.13936132192611694, -0.12174743413925171, 0.03804473578929901, 0.03379238769412041, 0.01322586927562952, -0.07367100566625595, -0.09201094508171082, -0.025721345096826553, 0.16448412835597992, -0.016476180404424667, -0.08322907984256744, -0.1394127607345581, 0.11068544536828995, 0.16346365213394165, -0.0835026204586029, 0.054857026785612106, -0.01719931699335575, 0.12826234102249146, 0.06615640968084335, -0.07490000873804092, 0.05179690197110176, -0.05635537952184677, -0.1753631979227066, -0.025730349123477936, 0.16274210810661316, -0.03809106722474098, 0.03331013768911362, 0.0015733871841803193, 0.045289430767297745, -0.014177950099110603, -0.09178944677114487, 0.02070271410048008, 0.012937472201883793, 0.09942876547574997, 0.08216622471809387, -0.03286154195666313, 0.050767019391059875, -0.023118482902646065, -0.006165628787130117, 0.12245523184537888, 0.20765937864780426, -0.08303584158420563, 0.009549906477332115, 0.014405937865376472, -0.0538700595498085, -0.14302748441696167, 0.013447313569486141, 0.1299894154071808, 0.010915478691458702, 0.07433494925498962, -0.16989094018936157, 0.09571908414363861, 0.10212648659944534, -0.03330869600176811, 0.06326571851968765, -0.2635900378227234, -0.11593714356422424, 0.029211994260549545, 0.08092210441827774, 0.0345010831952095, -0.15117529034614563, -0.08767645806074142, -0.02058500237762928, -0.1388293355703354, 0.0766313374042511, 0.01879500225186348, 0.12303809076547623, -0.04253246262669563, 0.07141207158565521, 0.03520114719867706, -0.03262537345290184, 0.16110210120677948, 0.026582762598991394, 0.0599091500043869, -0.06706966459751129, 0.008485550060868263, 0.10513603687286377, -0.05511815473437309, 0.10034111887216568, -0.020093830302357674, 0.10683315992355347, -0.1757088601589203, -0.02127484232187271, -0.05946626886725426, 0.08476845175027847, -0.045221827924251556, -0.030243845656514168, -0.04401686042547226, 0.010891803540289402, 
0.029168477281928062, -0.019162744283676147, 0.10761065781116486, 0.06516730040311813, 0.058058641850948334, 0.12598982453346252, 0.09219507873058319, 0.008128094486892223, -0.18316124379634857, -0.015279467217624187, -0.006646470166742802, 0.04881199821829796, -0.12292371690273285, 0.03216332942247391, 0.10485058277845383, 0.03850700706243515, 0.1141376793384552, 0.02308274619281292, -0.037081699818372726, 0.0037826208863407373, 0.03480532392859459, -0.11226391792297363, -0.2058437317609787, -0.06242221221327782, -0.07012857496738434, -0.15642739832401276, 0.01442707423120737, 0.10167733579874039, -0.031695060431957245, -0.01796572469174862, -0.011486794799566269, -0.003164703957736492, 0.0185745470225811, 0.15588434040546417, 0.0409061424434185, 0.0756620392203331, -0.08947835117578506, 0.11593902856111526, 0.08414861559867859, -0.04308256506919861, 0.050306420773267746, 0.05239296332001686, -0.08716771751642227, -0.014462864026427269, 0.014325658790767193, 0.10384412854909897, -0.02992943301796913, -0.04110290855169296, -0.09182281792163849, -0.07816994935274124, 0.04669764265418053, 0.012087270617485046, 0.06935431808233261, 0.0021244012750685215, -0.00991787575185299, 0.003013844136148691, -0.11467115581035614, 0.09993817657232285, 0.03365186229348183, 0.074623242020607, -0.19911237061023712, 0.030576474964618683, 0.005091915838420391, 0.06804697215557098, -0.013582340441644192, 0.010319581255316734, -0.06347811967134476, -0.044683367013931274, -0.08612346649169922, 0.015657078474760056, -0.039607319980859756, -0.005018335301429033, -0.019667722284793854, -0.07396397739648819, -0.06533501297235489, 0.06952347606420517, -0.045927807688713074, -0.09316667914390564, 0.020979521796107292, 0.0728553906083107, -0.10706007480621338, -0.01271765772253275, 0.0355406254529953, -0.09763754159212112, 0.11399834603071213, 0.06511760503053665, 0.03790384158492088, -0.01953013800084591, -0.007609630469232798, -0.014298727735877037, 0.045069094747304916, 0.037273671478033066, 0.03985166549682617, -0.11279947310686111, 0.024056432768702507, -0.01796387881040573, 0.048986878246068954, -0.00865156203508377, 0.04014226049184799, -0.1522827297449112, -0.05511351674795151, -0.06340721994638443, -0.023940950632095337, -0.06919945031404495, 0.01294753048568964, 0.12250880151987076, 0.006341094151139259, 0.18402816355228424, -0.05901006981730461, 0.030427386984229088, -0.23728469014167786, -0.0054644751362502575, -0.023172762244939804, -0.053627654910087585, -0.09320452809333801, -0.026247825473546982, 0.07122432440519333, -0.04477526992559433, 0.11608821898698807, -0.015362988226115704, 0.12913517653942108, 0.0432191863656044, 0.02531229704618454, 0.016713909804821014, -0.009521803818643093, 0.17287534475326538, 0.07454705983400345, -0.0177801214158535, 0.12567166984081268, -0.010703491978347301, 0.07108522206544876, -0.007680648937821388, 0.07243404537439346, 0.1338643878698349, -0.06782973557710648, 0.08089635521173477, 0.09860492497682571, -0.08416140824556351, -0.16213487088680267, 0.08550289273262024, -0.02014302648603916, 0.10413270443677902, -0.02793869748711586, 0.09063664078712463, 0.0823083147406578, -0.14972619712352753, 0.04399025812745094, -0.04792860895395279, -0.08332157135009766, -0.09710767865180969, -0.06246141716837883, -0.09727881103754044, -0.14604081213474274, 0.029729247093200684, -0.1200271025300026, 0.016873544082045555, 0.08967555314302444, -0.029516447335481644, -0.04079631716012955, 0.14729219675064087, -0.024247048422694206, -0.02609933912754059, 0.042167969048023224, 
-0.0005526829045265913, -0.002857248531654477, -0.037066712975502014, -0.07051366567611694, 0.03772168979048729, 0.03495577722787857, 0.09267736226320267, -0.025298558175563812, -0.008100965991616249, 0.02167443186044693, -0.028050260618329048, -0.08077888935804367, -0.003190438263118267, 0.03470226377248764, -0.006992940325289965, 0.00516436668112874, 0.02790145017206669, 0.009862235747277737, -0.04404212161898613, 0.24138806760311127, -0.05422332510352135, -0.04942046478390694, -0.12669266760349274, 0.08257806301116943, 0.04570427164435387, -0.010752161033451557, 0.07975492626428604, -0.11357776075601578, -0.03189926967024803, 0.1417611688375473, 0.09553330391645432, -0.013060244731605053, -0.02381538599729538, 0.010694283992052078, -0.02701115794479847, -0.03801007941365242, 0.06741753965616226, 0.09604863822460175, -0.0043255481868982315, -0.044295813888311386, -0.00581959867849946, -0.0000467774661956355, -0.012204447761178017, -0.09194245934486389, 0.06365332007408142, 0.0032619130797684193, 0.027555078268051147, -0.01766945794224739, 0.04845064878463745, 0.014063509181141853, -0.12260046601295471, 0.056285560131073, -0.18215669691562653, -0.1639646291732788, -0.013811922632157803, 0.05316492170095444, 0.004734956659376621, 0.0656774714589119, 0.00699315220117569, -0.01616520993411541, 0.11517781019210815, -0.02973901480436325, -0.12811951339244843, -0.07693887501955032, 0.07598578184843063, -0.0984736979007721, 0.20398712158203125, -0.003467842936515808, 0.0998690128326416, 0.11166427284479141, -0.002812296384945512, -0.17780867218971252, 0.016473643481731415, 0.06452900171279907, -0.04138296842575073, 0.058295201510190964, 0.13745102286338806, -0.005067604128271341, 0.054632022976875305, 0.03453279286623001, -0.11121828109025955, -0.06043363735079765, -0.00868860725313425, 0.03931364789605141, -0.10233639925718307, -0.011723971925675869, -0.07764236629009247, 0.15087218582630157, 0.18733283877372742, -0.05397713556885719, -0.034454528242349625, -0.0647127702832222, 0.03886042907834053, 0.07058855891227722, 0.09357713162899017, -0.015248031355440617, -0.2221139520406723, 0.02240949310362339, 0.03117501176893711, 0.018117908388376236, -0.18035542964935303, -0.09986438602209091, 0.033955272287130356, -0.05920729413628578, -0.03290052339434624, 0.09190298616886139, 0.04230248183012009, 0.023843148723244667, -0.02495470456779003, -0.08832785487174988, -0.06873063743114471, 0.15472222864627838, -0.16907106339931488, -0.06246500834822655 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# microsoft_deberta-base_squad

This model is a fine-tuned version of [microsoft/deberta-base](https://huggingface.co/microsoft/deberta-base) on the **squadV1** dataset.
- "eval_exact_match": 86.30085146641439
- "eval_f1": 92.68502275661561
- "eval_samples": 10788

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 12
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

### Framework versions

- Transformers 4.14.1
- Pytorch 1.9.0
- Datasets 1.16.1
- Tokenizers 0.10.3
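As an illustration only (the card does not include the training command), the hyperparameters listed above map onto the `Trainer` API roughly as follows; the `output_dir` name and the per-device reading of the batch sizes are assumptions, not taken from the card.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters expressed as TrainingArguments
# (Transformers 4.14-era API). The Adam betas/epsilon and the linear
# schedule are the Trainer defaults, so no extra arguments are needed.
training_args = TrainingArguments(
    output_dir="microsoft_deberta-base_squad",  # assumed output directory
    learning_rate=3e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=32,
    num_train_epochs=3.0,
    lr_scheduler_type="linear",
    seed=42,
)
```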
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "microsoft_deberta-base_squad", "results": []}]}
question-answering
Palak/microsoft_deberta-base_squad
[ "transformers", "pytorch", "deberta", "question-answering", "generated_from_trainer", "dataset:squad", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #deberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
# microsoft_deberta-base_squad This model is a fine-tuned version of microsoft/deberta-base on the squadV1 dataset. - "eval_exact_match": 86.30085146641439 - "eval_f1": 92.68502275661561 - "eval_samples": 10788 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 3e-05 - train_batch_size: 12 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 ### Training results ### Framework versions - Transformers 4.14.1 - Pytorch 1.9.0 - Datasets 1.16.1 - Tokenizers 0.10.3
[ "# microsoft_deberta-base_squad\n\nThis model is a fine-tuned version of microsoft/deberta-base on the squadV1 dataset.\n- \"eval_exact_match\": 86.30085146641439\n- \"eval_f1\": 92.68502275661561\n- \"eval_samples\": 10788", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 12\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #deberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n", "# microsoft_deberta-base_squad\n\nThis model is a fine-tuned version of microsoft/deberta-base on the squadV1 dataset.\n- \"eval_exact_match\": 86.30085146641439\n- \"eval_f1\": 92.68502275661561\n- \"eval_samples\": 10788", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 12\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ 49, 84, 6, 12, 8, 3, 90, 4, 31 ]
[ "passage: TAGS\n#transformers #pytorch #deberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# microsoft_deberta-base_squad\n\nThis model is a fine-tuned version of microsoft/deberta-base on the squadV1 dataset.\n- \"eval_exact_match\": 86.30085146641439\n- \"eval_f1\": 92.68502275661561\n- \"eval_samples\": 10788## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 12\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ -0.10159613937139511, 0.17546029388904572, -0.0019918354228138924, 0.08877098560333252, 0.14476875960826874, 0.0350707471370697, 0.07224192470312119, 0.13224095106124878, -0.06290436536073685, 0.08595596253871918, 0.11232856661081314, 0.07457195222377777, 0.06551479548215866, 0.12035626918077469, -0.026067260652780533, -0.2083369940519333, 0.024721315130591393, 0.016314053907990456, 0.004838690161705017, 0.09943535923957825, 0.12371595948934555, -0.08815503120422363, 0.09096326678991318, 0.014257706701755524, -0.11349577456712723, 0.014596737921237946, -0.041956834495067596, -0.03797651827335358, 0.0865391343832016, 0.03527110815048218, 0.059082839637994766, -0.014036430045962334, 0.08787690848112106, -0.21924574673175812, -0.009460299275815487, 0.04342905431985855, 0.03042779304087162, 0.0773032084107399, 0.014767798595130444, 0.04174700006842613, 0.07536435127258301, -0.13509251177310944, 0.09601029008626938, 0.016950907185673714, -0.09055110067129135, -0.15249669551849365, -0.08865395933389664, 0.051073942333459854, 0.08447694033384323, 0.12343644350767136, -0.0031387750059366226, 0.19310641288757324, -0.024413490667939186, 0.06801962852478027, 0.1567564308643341, -0.25500741600990295, -0.04524251073598862, 0.030154965817928314, 0.07343244552612305, 0.05327843874692917, -0.090411476790905, -0.013710686936974525, 0.03360176458954811, 0.03755167871713638, 0.07776863127946854, -0.008312016725540161, -0.012897428125143051, -0.03046354465186596, -0.10795343667268753, -0.08499736338853836, 0.21425233781337738, 0.08135966956615448, -0.05487208813428879, -0.1378025859594345, -0.04876202717423439, -0.09482484310865402, 0.004782191477715969, -0.06711656600236893, 0.023145323619246483, -0.058271799236536026, -0.03629579767584801, -0.060639284551143646, -0.0814802348613739, -0.04353238269686699, 0.010533769614994526, 0.09487268328666687, 0.02460341900587082, 0.02735309675335884, -0.0027233967557549477, 0.08668252825737, -0.03907172381877899, -0.13971081376075745, -0.04470178857445717, -0.009906389750540257, -0.05739396810531616, -0.03932676464319229, -0.050133924931287766, -0.04885600134730339, -0.006943614687770605, 0.16028091311454773, -0.019636016339063644, 0.05249147117137909, 0.062499452382326126, -0.015070033259689808, 0.003163326298817992, 0.16884870827198029, -0.054015811532735825, -0.08804603666067123, 0.008845021016895771, 0.11767411977052689, 0.048116352409124374, -0.02290121465921402, -0.10880374908447266, -0.03960152342915535, 0.11614610254764557, 0.03878691419959068, 0.007078761234879494, 0.03173017501831055, -0.08058930933475494, -0.05589347332715988, 0.09804437309503555, -0.10712570697069168, 0.029845891520380974, -0.03884674236178398, -0.08745099604129791, -0.10574598610401154, 0.030025508254766464, 0.017780324444174767, -0.04466744512319565, 0.043273959308862686, -0.090863898396492, -0.03336509317159653, -0.07233945280313492, -0.06318652629852295, 0.00941924937069416, -0.061507485806941986, 0.03299643099308014, -0.07616474479436874, -0.19140352308750153, -0.04797692224383354, 0.032319650053977966, -0.0649762824177742, -0.0846816822886467, -0.02189014106988907, -0.040754642337560654, 0.015939056873321533, -0.026277249678969383, 0.10648350417613983, -0.051158368587493896, 0.067035011947155, 0.058787960559129715, 0.013856038451194763, -0.03434156998991966, 0.05817952752113342, -0.09216772019863129, 0.02602490782737732, -0.07937625795602798, 0.08884946256875992, -0.09547882527112961, 0.02626989595592022, -0.11777123808860779, -0.1074066087603569, 0.01934484764933586, 
-0.034663423895835876, 0.0881519690155983, 0.10428006201982498, -0.1117267906665802, -0.00004870980410487391, 0.07851710170507431, -0.06443120539188385, -0.12741638720035553, 0.08398650586605072, -0.02873682975769043, 0.025187399238348007, 0.05439717322587967, 0.13847219944000244, 0.12117134034633636, -0.14045700430870056, -0.024799631908535957, 0.023401640355587006, 0.047129031270742416, -0.0018115341663360596, 0.08845556527376175, -0.014014720916748047, 0.061776723712682724, 0.024356160312891006, -0.07666349411010742, -0.016026336699724197, -0.07127149403095245, -0.07960327714681625, -0.04370404779911041, -0.08286388218402863, 0.025207161903381348, 0.046466585248708725, 0.012483063153922558, -0.07369914650917053, -0.12177800387144089, 0.09548422694206238, 0.12020071595907211, -0.025053657591342926, -0.011263620108366013, -0.08714558184146881, 0.05485276132822037, -0.06389079988002777, -0.041599761694669724, -0.1962956339120865, -0.12591607868671417, 0.05685120448470116, -0.04913155734539032, 0.019703486934304237, 0.03764583170413971, 0.04571337252855301, 0.057606447488069534, -0.04659595713019371, -0.028012072667479515, -0.12023589015007019, 0.010167370550334454, -0.10504788905382156, -0.10894704610109329, -0.08090445399284363, -0.023681819438934326, 0.1813361942768097, -0.17970310151576996, -0.0037795272655785084, -0.0037721525877714157, 0.12865011394023895, 0.014836144633591175, -0.07010754197835922, 0.01733386144042015, 0.029438400641083717, -0.0022438373416662216, -0.0882832258939743, 0.04174035042524338, 0.02025589533150196, -0.1056756004691124, -0.08913889527320862, -0.13269688189029694, 0.07910177111625671, 0.07913625985383987, 0.08170045912265778, -0.06124350056052208, 0.0033071485813707113, -0.05126660689711571, -0.0459669791162014, -0.05500396341085434, -0.05615222081542015, 0.19876991212368011, 0.009997944347560406, 0.10833535343408585, -0.06337927281856537, -0.05491828918457031, 0.009278135374188423, 0.012594729661941528, -0.04256657138466835, 0.06573022902011871, 0.003796348813921213, -0.19048559665679932, 0.09975578635931015, 0.06327944248914719, -0.006707917433232069, 0.10457015037536621, -0.030910231173038483, -0.09104496240615845, -0.05287100374698639, -0.003021279349923134, -0.010258101858198643, 0.13178594410419464, -0.1190982311964035, 0.019439011812210083, 0.06356720626354218, 0.009590712375938892, 0.022182203829288483, -0.14345796406269073, -0.0074138944037258625, 0.038488030433654785, -0.03343898057937622, -0.013293306343257427, -0.014485751278698444, 0.002709093503654003, 0.06515450775623322, 0.05249641090631485, -0.019907645881175995, 0.03432596102356911, -0.02629462629556656, -0.07358744740486145, 0.15723612904548645, -0.09834697097539902, -0.2235659509897232, -0.15142543613910675, 0.06426352262496948, -0.08395417034626007, 0.0006279933732002974, 0.025469565764069557, -0.017790554091334343, -0.06697223335504532, -0.0876353308558464, -0.02392594702541828, -0.06277376413345337, -0.034763749688863754, 0.0558602400124073, 0.0004961221129633486, 0.10654789209365845, -0.1396683007478714, -0.004892480093985796, -0.010983358137309551, -0.025717345997691154, -0.017165161669254303, 0.041954901069402695, 0.12902198731899261, 0.03879823908209801, -0.012332749553024769, 0.009778625331819057, -0.027283912524580956, 0.2999227046966553, -0.08372402936220169, -0.034692246466875076, 0.1519079953432083, -0.0016120042419061065, 0.06262283027172089, 0.1177668645977974, 0.011077969335019588, -0.10491620749235153, 0.02753470093011856, 0.03588002175092697, -0.006986534688621759, 
-0.20496517419815063, -0.043774768710136414, -0.02991425432264805, -0.0942377895116806, 0.10025910288095474, 0.039539482444524765, 0.013741855509579182, 0.06327594816684723, -0.022865144535899162, 0.03306259214878082, -0.0375538170337677, 0.08860180526971817, 0.1485140472650528, 0.031408343464136124, 0.10001645237207413, -0.02139667607843876, -0.02181675098836422, 0.05764767900109291, 0.013248835690319538, 0.2037448287010193, -0.017020033672451973, 0.16487571597099304, 0.012977855280041695, 0.1639610230922699, -0.043959926813840866, 0.028161825612187386, 0.009057820774614811, 0.016674889251589775, -0.009826714172959328, -0.04968785122036934, -0.08979260921478271, 0.03687522932887077, 0.03313557803630829, 0.056507013738155365, -0.07642946392297745, 0.0201313067227602, -0.008057286962866783, 0.22949382662773132, 0.07139232754707336, -0.3328181803226471, -0.10713136941194534, 0.014507721178233624, -0.01651236042380333, -0.08822724223136902, -0.028781414031982422, 0.09265515208244324, -0.1428779661655426, 0.07506345957517624, -0.03845365345478058, 0.09124114364385605, -0.08187174797058105, 0.008390568196773529, 0.05065252259373665, 0.07166317105293274, 0.017153814435005188, 0.1087474524974823, -0.20346906781196594, 0.18143068253993988, 0.02087736502289772, 0.09177389740943909, -0.07875464111566544, 0.04777006432414055, -0.02198764868080616, 0.04613633081316948, 0.11161848902702332, 0.0030693793669342995, 0.011338216252624989, -0.17854230105876923, -0.09893618524074554, 0.01666060835123062, 0.06687869876623154, -0.06053771823644638, 0.0788242369890213, -0.03402642533183098, 0.013746777549386024, 0.033889040350914, 0.018779054284095764, -0.11759518831968307, -0.14593404531478882, 0.05444870889186859, 0.006792284548282623, -0.0030921027064323425, -0.06904489547014236, -0.09125785529613495, 0.00857242289930582, 0.17171232402324677, 0.04036171734333038, -0.07467249780893326, -0.13003641366958618, 0.10115933418273926, 0.1335008144378662, -0.07201256603002548, 0.02551736868917942, 0.0010245234007015824, 0.11588854342699051, 0.04027881845831871, -0.051171112805604935, 0.0638648271560669, -0.04340984299778938, -0.1428910195827484, -0.051479458808898926, 0.1392933577299118, 0.009843140840530396, 0.051160719245672226, 0.012923616915941238, 0.03179213032126427, -0.017985226586461067, -0.07224446535110474, 0.028988638892769814, 0.03880554065108299, 0.09891501814126968, 0.04149831086397171, -0.008449125103652477, 0.03646751120686531, -0.04871674254536629, -0.010854220017790794, 0.13121795654296875, 0.21863417327404022, -0.08078382164239883, 0.033655233681201935, 0.04482371732592583, -0.054103653877973557, -0.14962157607078552, 0.009411589242517948, 0.12206131219863892, 0.03212571516633034, 0.08607117831707001, -0.13526815176010132, 0.05218711122870445, 0.09856722503900528, -0.03407856076955795, 0.0651320219039917, -0.2766907215118408, -0.12481997907161713, 0.07056980580091476, 0.11561047285795212, 0.02548161894083023, -0.1317833811044693, -0.06951312720775604, -0.04172048345208168, -0.19719307124614716, 0.09722261875867844, 0.018176114186644554, 0.10762660205364227, -0.008539832197129726, 0.0877348855137825, 0.04002365097403526, -0.038864679634571075, 0.17354866862297058, 0.02208695188164711, 0.04929129034280777, -0.08656616508960724, 0.029200175777077675, 0.1063707023859024, -0.0628262609243393, 0.1108556017279625, -0.025677472352981567, 0.07683419436216354, -0.2046104520559311, -0.03266717866063118, -0.05330689251422882, 0.0588456429541111, -0.060047026723623276, -0.06150415539741516, 
-0.027086343616247177, 0.041720353066921234, 0.043147630989551544, -0.03725948929786682, 0.09677432477474213, 0.02953796274960041, 0.048728153109550476, 0.14413374662399292, 0.08493780344724655, -0.003786628833040595, -0.15243695676326752, -0.015020877122879028, -0.015693115070462227, 0.05678670108318329, -0.06602973490953445, 0.014463501051068306, 0.1245274767279625, 0.026588577777147293, 0.13541558384895325, -0.004018374253064394, -0.06369217485189438, 0.013246312737464905, 0.028483200818300247, -0.10182193666696548, -0.19600605964660645, -0.03222506120800972, -0.03919961303472519, -0.15682366490364075, -0.0016207234002649784, 0.11679227650165558, -0.03475913777947426, -0.026268983259797096, -0.016905652359128, 0.015207197517156601, -0.0028714737854897976, 0.16255350410938263, 0.03872623294591904, 0.06927341967821121, -0.0768778994679451, 0.10887651890516281, 0.09441002458333969, -0.07818777859210968, 0.055273350328207016, 0.002200309420004487, -0.09465746581554413, -0.026257945224642754, 0.006548996549099684, 0.10672646015882492, -0.043114520609378815, -0.03495310619473457, -0.07249657064676285, -0.059818148612976074, 0.03960437327623367, -0.004090128466486931, 0.0633108988404274, -0.006974333897233009, -0.0126182297244668, 0.004521333612501621, -0.1068076565861702, 0.10865359008312225, 0.02544519305229187, 0.07206577807664871, -0.1580519825220108, 0.008431926369667053, 0.006447085179388523, 0.04115302488207817, -0.011966769583523273, 0.01519065722823143, -0.06064366549253464, -0.020076114684343338, -0.12068656831979752, 0.0026539780665189028, -0.032016608864068985, 0.00853186845779419, -0.027713026851415634, -0.09231521934270859, -0.03400692716240883, 0.06032887473702431, -0.05061507970094681, -0.0863686352968216, 0.026052193716168404, 0.050758734345436096, -0.12191271781921387, -0.03854606673121452, 0.027048448100686073, -0.08183807134628296, 0.10398715734481812, 0.05240914598107338, 0.030510788783431053, -0.015324952080845833, 0.0077355168759822845, 0.013927940279245377, 0.02057754434645176, 0.04308408871293068, 0.06586059927940369, -0.11261230707168579, -0.016120189800858498, -0.008503753691911697, 0.015993716195225716, 0.006551697850227356, 0.09311341494321823, -0.14269298315048218, -0.07567746937274933, -0.019012141972780228, -0.04509721323847771, -0.056282397359609604, 0.03654489293694496, 0.10748381167650223, 0.022946452721953392, 0.18186452984809875, -0.04426984116435051, 0.04420353099703789, -0.19824649393558502, -0.028265228495001793, 0.0003253138274885714, -0.036477066576480865, -0.06682026386260986, -0.0404447503387928, 0.08314710855484009, -0.05178464949131012, 0.11613672226667404, -0.017873002216219902, 0.15059177577495575, 0.03578883036971092, 0.020309049636125565, 0.045340646058321, -0.010938369669020176, 0.16923075914382935, 0.04918158799409866, -0.011449497193098068, 0.10772255808115005, -0.006245912052690983, 0.043191175907850266, 0.04725153371691704, 0.10434551537036896, 0.14633236825466156, 0.024783527478575706, 0.06286022812128067, 0.07723959535360336, -0.06469308584928513, -0.19249893724918365, 0.033479299396276474, 0.009894351474940777, 0.12192677706480026, -0.028927182778716087, 0.12238826602697372, 0.06487633287906647, -0.17000669240951538, 0.056492093950510025, -0.07835622876882553, -0.10153493285179138, -0.05689427629113197, -0.08804269134998322, -0.07914125919342041, -0.09839095175266266, 0.022122599184513092, -0.12466434389352798, 0.006326214876025915, 0.10295815020799637, -0.02626696601510048, -0.03791605308651924, 0.15360671281814575, 
-0.02216651663184166, -0.015850497409701347, 0.03626379743218422, 0.008285896852612495, -0.005945098586380482, -0.05042048171162605, -0.03349697217345238, 0.05966586992144585, 0.05597734451293945, 0.0944947749376297, -0.03710924834012985, -0.009443972259759903, 0.013121765106916428, 0.011507263407111168, -0.08433689922094345, 0.0013367198407649994, 0.042301878333091736, 0.020649585872888565, 0.060681551694869995, 0.035863880068063736, 0.03506399318575859, -0.04293400049209595, 0.23912714421749115, -0.047648582607507706, -0.06326068937778473, -0.1148318350315094, 0.13681650161743164, 0.03433498367667198, 0.0003931604151148349, 0.08103304356336594, -0.1198989674448967, 0.0029927100986242294, 0.11851752549409866, 0.12299502640962601, -0.0709678903222084, -0.029376745223999023, 0.01182536594569683, -0.012146265245974064, -0.04159603640437126, 0.06828952580690384, 0.09184343367815018, 0.02483198046684265, -0.05435723438858986, 0.015132397413253784, -0.01598108746111393, -0.023420941084623337, -0.05347414314746857, 0.05019865557551384, 0.02545064501464367, 0.022967936471104622, -0.03865895792841911, 0.059953369200229645, 0.030469195917248726, -0.1924617886543274, 0.03722401335835457, -0.18728044629096985, -0.1903773546218872, -0.006789908744394779, 0.08654893934726715, 0.00014827972336206585, 0.06717449426651001, 0.005980112124234438, 0.02657443657517433, 0.13105139136314392, -0.021603386849164963, -0.0813056156039238, -0.09324415028095245, 0.08872979134321213, -0.051743414252996445, 0.20753659307956696, -0.008950757794082165, 0.09418092668056488, 0.10883309692144394, 0.02430126816034317, -0.1563246101140976, 0.05912008509039879, 0.08690709620714188, -0.015039877034723759, 0.06655263900756836, 0.1280868947505951, -0.01665276288986206, 0.1033625677227974, 0.04751401022076607, -0.13569578528404236, -0.03734579309821129, -0.015390392392873764, 0.01598447374999523, -0.06226610764861107, -0.002662939252331853, -0.0790439173579216, 0.16945230960845947, 0.18099644780158997, -0.058232828974723816, -0.031167976558208466, -0.06673706322908401, 0.027309145778417587, 0.05409485846757889, 0.09507325291633606, -0.031935058534145355, -0.19419889152050018, 0.017368415370583534, 0.03863906115293503, 0.026609767228364944, -0.2253689169883728, -0.09646131098270416, 0.046017978340387344, -0.053203828632831573, -0.009687718003988266, 0.09981027245521545, 0.020395789295434952, 0.010648285038769245, -0.04655129462480545, -0.08419219404459, -0.06812684237957001, 0.13529807329177856, -0.14726144075393677, -0.06998463720083237 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# microsoft-deberta-large

This model is a fine-tuned version of [microsoft/deberta-large](https://huggingface.co/microsoft/deberta-large) on the **squadV1** dataset.

- "eval_exact_match": 87.89025543992432
- "eval_f1": 93.8755152147345
- "eval_samples": 10788

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 12
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3

### Framework versions

- Transformers 4.14.1
- Pytorch 1.9.0
- Datasets 1.16.1
- Tokenizers 0.10.3
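For reference, the exact-match and F1 figures above are the standard SQuAD v1 metrics. A hedged sketch of computing them with the Datasets 1.x `load_metric` helper (newer stacks use the `evaluate` package instead) is shown below; the prediction/reference pair is a made-up placeholder, not model output.

```python
from datasets import load_metric  # evaluate.load("squad") in newer versions

squad_metric = load_metric("squad")

# Placeholder example: one prediction compared against one gold answer.
predictions = [
    {"id": "example-0", "prediction_text": "SQuAD v1"},
]
references = [
    {"id": "example-0", "answers": {"text": ["SQuAD v1"], "answer_start": [42]}},
]

print(squad_metric.compute(predictions=predictions, references=references))
# e.g. {'exact_match': 100.0, 'f1': 100.0}
```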
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "microsoft-deberta-large", "results": []}]}
question-answering
Palak/microsoft_deberta-large_squad
[ "transformers", "pytorch", "deberta", "question-answering", "generated_from_trainer", "dataset:squad", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #deberta #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
# microsoft-deberta-large This model is a fine-tuned version of microsoft_deberta-large on the squadV1 dataset. - "eval_exact_match": 87.89025543992432 - "eval_f1": 93.8755152147345 - "eval_samples": 10788 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 3e-05 - train_batch_size: 12 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Framework versions - Transformers 4.14.1 - Pytorch 1.9.0 - Datasets 1.16.1 - Tokenizers 0.10.3
[ "# microsoft-deberta-large\n\nThis model is a fine-tuned version of microsoft_deberta-large on the squadV1 dataset.\n\n- \"eval_exact_match\": 87.89025543992432\n- \"eval_f1\": 93.8755152147345\n- \"eval_samples\": 10788", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 12\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #deberta #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n", "# microsoft-deberta-large\n\nThis model is a fine-tuned version of microsoft_deberta-large on the squadV1 dataset.\n\n- \"eval_exact_match\": 87.89025543992432\n- \"eval_f1\": 93.8755152147345\n- \"eval_samples\": 10788", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 12\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ 44, 83, 6, 12, 8, 3, 90, 31 ]
[ "passage: TAGS\n#transformers #pytorch #deberta #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# microsoft-deberta-large\n\nThis model is a fine-tuned version of microsoft_deberta-large on the squadV1 dataset.\n\n- \"eval_exact_match\": 87.89025543992432\n- \"eval_f1\": 93.8755152147345\n- \"eval_samples\": 10788## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 12\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ -0.08983360975980759, 0.16909977793693542, -0.00310511514544487, 0.10033491998910904, 0.1285335123538971, 0.018628111109137535, 0.09676925092935562, 0.13370127975940704, -0.056290607899427414, 0.0922006145119667, 0.10577643662691116, 0.05658635124564171, 0.06482987850904465, 0.13245227932929993, -0.03663606196641922, -0.19782403111457825, 0.022308869287371635, 0.011164803989231586, 0.0028899884782731533, 0.08552311360836029, 0.124809630215168, -0.10024707019329071, 0.0912744328379631, -0.007816599681973457, -0.12536175549030304, 0.02485520951449871, -0.036728620529174805, -0.03704242408275604, 0.08739864826202393, 0.03154883533716202, 0.07954411953687668, -0.012200575321912766, 0.0845862552523613, -0.23014138638973236, -0.00976791325956583, 0.04602425917983055, 0.03000456839799881, 0.07170754671096802, 0.015947682783007622, 0.048076603561639786, 0.042667292058467865, -0.14625030755996704, 0.08918027579784393, 0.01626843772828579, -0.10748089849948883, -0.1671472042798996, -0.08979790657758713, 0.04778873175382614, 0.08620955795049667, 0.112161785364151, -0.006552441976964474, 0.17649364471435547, -0.019709648564457893, 0.06765419989824295, 0.1912146508693695, -0.23887909948825836, -0.03852522745728493, 0.05737633630633354, 0.0698486715555191, 0.05313534289598465, -0.08057426661252975, -0.021534578874707222, 0.04855939745903015, 0.03166721388697624, 0.07810986042022705, -0.01093901228159666, -0.02569235861301422, -0.02215571515262127, -0.117230124771595, -0.07495806366205215, 0.23064716160297394, 0.07304292172193527, -0.037573713809251785, -0.1271110475063324, -0.06021791324019432, -0.07290840148925781, -0.0006800352712161839, -0.06196044385433197, 0.02758009545505047, -0.054349955171346664, -0.04260937124490738, -0.052455175668001175, -0.07913550734519958, -0.03544899821281433, -0.006139492150396109, 0.06532971560955048, 0.030240429565310478, 0.020658547058701515, -0.03697671741247177, 0.0765654593706131, -0.047335490584373474, -0.14347918331623077, -0.042026907205581665, -0.009049459360539913, -0.04386056959629059, -0.04580134525895119, -0.06128496676683426, -0.0323476567864418, 0.006358655169606209, 0.179488405585289, -0.02778558060526848, 0.05347561091184616, 0.050628017634153366, -0.011503378860652447, 0.013680324889719486, 0.16393199563026428, -0.04474363848567009, -0.08122965693473816, 0.011257548816502094, 0.10230779647827148, 0.04720299690961838, -0.023751050233840942, -0.11381199210882187, -0.03274780884385109, 0.09509040415287018, 0.04056251049041748, 0.007056877017021179, 0.024119559675455093, -0.06851447373628616, -0.04992954805493355, 0.07151671499013901, -0.11807310581207275, 0.032269030809402466, -0.03202010318636894, -0.07942578196525574, -0.07655630260705948, 0.011013008654117584, 0.019940482452511787, -0.034874867647886276, 0.031708404421806335, -0.08644694089889526, -0.0339922197163105, -0.07861083000898361, -0.07287545502185822, 0.02919723466038704, -0.08228939026594162, 0.0369766466319561, -0.08130404353141785, -0.19293561577796936, -0.036663688719272614, 0.034718770533800125, -0.05361150950193405, -0.06144016236066818, -0.0321732833981514, -0.045747630298137665, 0.009136714041233063, -0.0250133965164423, 0.085761159658432, -0.04401839151978493, 0.060169536620378494, 0.06024160981178284, 0.03796875476837158, -0.026268672198057175, 0.04749434068799019, -0.09142868965864182, 0.031780533492565155, -0.0940789133310318, 0.08473377674818039, -0.09062568098306656, 0.01800581067800522, -0.11204169690608978, -0.09270352870225906, 0.029232267290353775, -0.027500275522470474, 
0.0928974524140358, 0.10404366999864578, -0.1472189724445343, -0.0027539089787751436, 0.1166103184223175, -0.0586804561316967, -0.13236765563488007, 0.09616012871265411, -0.04656790941953659, 0.035939112305641174, 0.06018358841538429, 0.16512209177017212, 0.09675650298595428, -0.14656652510166168, -0.03961791470646858, 0.017346760258078575, 0.06053931266069412, 0.001893856329843402, 0.07036519050598145, -0.01042430941015482, 0.0795532837510109, 0.0159336905926466, -0.049460940062999725, -0.021028194576501846, -0.06395767629146576, -0.08094634115695953, -0.04939490929245949, -0.08111385256052017, 0.01179405115544796, 0.04050719365477562, 0.009314587339758873, -0.0835500955581665, -0.09914819151163101, 0.11017514020204544, 0.10301872342824936, -0.03522203490138054, -0.000024789213057374582, -0.08899612724781036, 0.048760343343019485, -0.06301796436309814, -0.03304581716656685, -0.18356406688690186, -0.1315380483865738, 0.05514891818165779, -0.057417675852775574, 0.019958017393946648, 0.03552968427538872, 0.04861462488770485, 0.045067302882671356, -0.04750689119100571, -0.01047307439148426, -0.1113143190741539, 0.009567786008119583, -0.11220260709524155, -0.1126602366566658, -0.06370071321725845, -0.031054554507136345, 0.14586205780506134, -0.16890451312065125, 0.008890556171536446, 0.005094601307064295, 0.13510358333587646, 0.021151624619960785, -0.06524085253477097, 0.0029957632068544626, 0.031485363841056824, -0.014295500703155994, -0.08574008196592331, 0.032134197652339935, 0.002825608942657709, -0.08889927715063095, -0.08802078664302826, -0.14613689482212067, 0.07559774070978165, 0.09092223644256592, 0.05815380439162254, -0.058772098273038864, -0.003398904111236334, -0.045889850705862045, -0.03710204362869263, -0.0675162747502327, -0.059611089527606964, 0.17820784449577332, 0.006108536850661039, 0.10252785682678223, -0.08198269456624985, -0.0632903203368187, 0.016129789873957634, -0.006749642547219992, -0.035079989582300186, 0.06776419281959534, 0.01735437475144863, -0.1499783992767334, 0.1062098890542984, 0.09915389120578766, -0.0017010100418701768, 0.10461796820163727, -0.03619380295276642, -0.09484445303678513, -0.05400910973548889, -0.01534088421612978, -0.009604143910109997, 0.13317033648490906, -0.11121203750371933, 0.014970543794333935, 0.056458089500665665, 0.025569535791873932, 0.024517541751265526, -0.14710980653762817, -0.011206523515284061, 0.04104228317737579, -0.03314723074436188, -0.034420762211084366, -0.013575016520917416, 0.010474324226379395, 0.06760960072278976, 0.04661795496940613, -0.014093196019530296, 0.030369983986020088, -0.02876153402030468, -0.07960820198059082, 0.16131730377674103, -0.09003161638975143, -0.20759010314941406, -0.14149297773838043, 0.0290971789509058, -0.06319156289100647, 0.004102771170437336, 0.03441222757101059, -0.04440249130129814, -0.06530509889125824, -0.10131307691335678, -0.00577795784920454, -0.04663636535406113, -0.03465068340301514, 0.07138462364673615, 0.01497923769056797, 0.0977291688323021, -0.1348869949579239, -0.007003489416092634, -0.007191800512373447, -0.012146241031587124, -0.018307112157344818, 0.030945811420679092, 0.1285676807165146, 0.04951401799917221, -0.010256515815854073, 0.01770394667983055, -0.0355062298476696, 0.30442094802856445, -0.08090516179800034, -0.01621416211128235, 0.1564730852842331, -0.008323266170918941, 0.06330938637256622, 0.12242081761360168, 0.017618371173739433, -0.11996497958898544, 0.03231457620859146, 0.0429527647793293, -0.01714143715798855, -0.2204580307006836, -0.03193443641066551, 
-0.03289048373699188, -0.05333619564771652, 0.10162938386201859, 0.02129458263516426, 0.0007137933862395585, 0.054862912744283676, -0.017161382362246513, 0.05450144410133362, -0.035827815532684326, 0.0939420536160469, 0.16385239362716675, 0.020335126668214798, 0.09414506703615189, -0.03265947848558426, -0.029338445514440536, 0.05883540213108063, 0.0237234178930521, 0.2126656472682953, 0.006391590926796198, 0.1502705067396164, 0.010158703662455082, 0.15033191442489624, -0.05237696319818497, 0.028847649693489075, 0.003886974649503827, 0.012082220986485481, -0.017006216570734978, -0.048612479120492935, -0.05587292090058327, 0.04626152664422989, 0.013325315900146961, 0.033974673599004745, -0.0723261907696724, 0.03259868547320366, 0.008431784808635712, 0.2084226906299591, 0.08104031533002853, -0.3112160861492157, -0.09676950424909592, 0.0329863578081131, -0.016521379351615906, -0.07213034480810165, -0.010163482278585434, 0.10976079106330872, -0.11994803696870804, 0.066502645611763, -0.026100503280758858, 0.10006784647703171, -0.0689430758357048, 0.015675557777285576, 0.018526118248701096, 0.07992620766162872, 0.01688476838171482, 0.10674933344125748, -0.21847335994243622, 0.19183173775672913, 0.02796749770641327, 0.08340488374233246, -0.08317625522613525, 0.03728223592042923, -0.028778385370969772, 0.021023740991950035, 0.11259780824184418, 0.0070052072405815125, -0.022493012249469757, -0.1668355017900467, -0.09657754004001617, 0.02024570293724537, 0.09057929366827011, -0.047701146453619, 0.08427680283784866, -0.02610885724425316, 0.01669936254620552, 0.04276067018508911, 0.01523877214640379, -0.11295440047979355, -0.15068718791007996, 0.04672717675566673, 0.020702283829450607, -0.034119948744773865, -0.060770317912101746, -0.0820019394159317, -0.024472780525684357, 0.16109246015548706, 0.012981114909052849, -0.0636654943227768, -0.12163065373897552, 0.0877307653427124, 0.13568897545337677, -0.07040776312351227, 0.025222424417734146, 0.01300556119531393, 0.11765720695257187, 0.020391719415783882, -0.060486115515232086, 0.08643043786287308, -0.05294102802872658, -0.13931047916412354, -0.06373101472854614, 0.12540759146213531, 0.021518399938941002, 0.05098720267415047, 0.008016490377485752, 0.03515361249446869, -0.014868601225316525, -0.07177038490772247, 0.039017096161842346, 0.0490388348698616, 0.08600007742643356, 0.03320785611867905, -0.016630956903100014, 0.04625515639781952, -0.05355968698859215, -0.023395167663693428, 0.13236203789710999, 0.22909368574619293, -0.08267059922218323, 0.02965114265680313, 0.05073792487382889, -0.059238534420728683, -0.1558154970407486, 0.028617821633815765, 0.09936731308698654, 0.013795445673167706, 0.06877493113279343, -0.1450517177581787, 0.06741984933614731, 0.10052569955587387, -0.02317062020301819, 0.06563293933868408, -0.28910666704177856, -0.1312902420759201, 0.0630773976445198, 0.08911588788032532, 0.036880236119031906, -0.11820986866950989, -0.058417465537786484, -0.040963832288980484, -0.1985587328672409, 0.09626612812280655, -0.005881245713680983, 0.10206330567598343, 0.00038070825394243, 0.09655463695526123, 0.03208843618631363, -0.04672684520483017, 0.1611429750919342, 0.04443821683526039, 0.06662587076425552, -0.08004376292228699, 0.013963930308818817, 0.10746166110038757, -0.0722249299287796, 0.1045609563589096, -0.01619911566376686, 0.07188117504119873, -0.1859143227338791, -0.0359354205429554, -0.04169295355677605, 0.05113333463668823, -0.05760149657726288, -0.05759324133396149, -0.04421506077051163, 0.033320050686597824, 0.058712005615234375, 
-0.029730739071965218, 0.10774024575948715, 0.030968274921178818, 0.040160614997148514, 0.14357341825962067, 0.0847732350230217, -0.017803646624088287, -0.12319689989089966, 0.0022680112160742283, -0.02345309965312481, 0.05262180045247078, -0.09765017032623291, 0.02695332281291485, 0.13017496466636658, 0.024860939010977745, 0.12939751148223877, 0.005953313782811165, -0.06272454559803009, 0.012485897168517113, 0.016465680673718452, -0.11544288694858551, -0.18581421673297882, -0.01543492916971445, -0.01965618133544922, -0.1620555967092514, 0.006959240417927504, 0.11283008754253387, -0.03362736850976944, -0.02003096416592598, -0.021658655256032944, 0.020071592181921005, 0.003787579480558634, 0.1626632660627365, 0.04468320310115814, 0.06465379148721695, -0.07765687257051468, 0.12152470648288727, 0.08983863145112991, -0.10221590846776962, 0.0715307965874672, 0.015619625337421894, -0.08386096358299255, -0.022607998922467232, 0.013740613125264645, 0.11922542750835419, -0.06680502742528915, -0.030640730634331703, -0.09105833619832993, -0.06353016942739487, 0.04870101064443588, 0.06436537206172943, 0.06323859095573425, -0.011684170924127102, -0.009751853533089161, 0.011665785685181618, -0.11632265895605087, 0.11068391054868698, 0.036685626953840256, 0.08236029744148254, -0.1410212516784668, 0.021186262369155884, 0.0008012441685423255, 0.03162410110235214, -0.01252797618508339, 0.024237332865595818, -0.07800891250371933, -0.016606999561190605, -0.12524299323558807, -0.0021675226744264364, -0.026085466146469116, 0.005492423195391893, -0.031914301216602325, -0.08107401430606842, -0.0358739010989666, 0.05146143212914467, -0.055673159658908844, -0.07346602529287338, 0.017404604703187943, 0.04962354898452759, -0.13088932633399963, -0.0407605916261673, 0.0317935049533844, -0.08348246663808823, 0.09899132698774338, 0.045211415737867355, 0.025615449994802475, -0.019099848344922066, -0.004913308657705784, 0.009338336996734142, 0.014886254444718361, 0.055317893624305725, 0.07720272988080978, -0.11190184205770493, -0.016848541796207428, -0.008815671317279339, 0.02930149808526039, 0.009490142576396465, 0.0852198451757431, -0.1335759162902832, -0.039102017879486084, -0.021680014207959175, -0.06487826257944107, -0.05181092396378517, 0.021410392597317696, 0.10866666585206985, 0.011475743725895882, 0.1708267480134964, -0.049643080681562424, 0.053235575556755066, -0.195473775267601, -0.024775350466370583, 0.010421677492558956, -0.0423826240003109, -0.06153488904237747, -0.03852350637316704, 0.0795375257730484, -0.059953391551971436, 0.1532774269580841, -0.024801062420010567, 0.1222698912024498, 0.04486639052629471, -0.008463192731142044, 0.02575414627790451, -0.010410242713987827, 0.16818521916866302, 0.045478783547878265, -0.02234026789665222, 0.0912160575389862, -0.0039795092307031155, 0.06614569574594498, 0.050826579332351685, 0.11845028400421143, 0.15530481934547424, 0.01754109375178814, 0.06137775629758835, 0.07881540805101395, -0.06611921638250351, -0.17815113067626953, 0.04001443833112717, 0.00750842597335577, 0.10801181942224503, -0.03271126747131348, 0.11589338630437851, 0.07125148922204971, -0.17047399282455444, 0.05102373659610748, -0.08015383034944534, -0.09421460330486298, -0.07384572923183441, -0.06263075768947601, -0.089137502014637, -0.0895242765545845, 0.022237302735447884, -0.13484837114810944, 0.01451105810701847, 0.10055377334356308, -0.02117956057190895, -0.029220888391137123, 0.15674851834774017, -0.017141809687018394, -0.0050190347246825695, 0.029187140986323357, 0.008548942394554615, 
-0.007104542572051287, -0.038010530173778534, -0.03852180764079094, 0.05241977795958519, 0.04396190866827965, 0.08271463960409164, -0.02275487594306469, 0.007870643399655819, -0.0008699470781721175, 0.0019090912537649274, -0.07891829311847687, -0.009926876053214073, 0.042492225766181946, 0.018660061061382294, 0.052150145173072815, 0.042562466114759445, 0.02984444983303547, -0.04049980640411377, 0.21247802674770355, -0.06112859770655632, -0.07015925645828247, -0.11572397500276566, 0.15234673023223877, 0.0388258621096611, 0.009513488970696926, 0.0600062720477581, -0.12189480662345886, -0.009196283295750618, 0.1451418399810791, 0.12274781614542007, -0.06877818703651428, -0.031796298921108246, 0.006823369767516851, -0.0042193238623440266, -0.03733043000102043, 0.06732478737831116, 0.10205300897359848, 0.02993706613779068, -0.04754823446273804, -0.009664508514106274, -0.016562288627028465, -0.018495984375476837, -0.06720654666423798, 0.06084403023123741, 0.02182452566921711, 0.018577901646494865, -0.03877974674105644, 0.057786401361227036, 0.016286542639136314, -0.18272288143634796, 0.036496490240097046, -0.20245644450187683, -0.17558352649211884, 0.006578824948519468, 0.09823661297559738, -0.005123719107359648, 0.06335869431495667, -0.005728801246732473, 0.025559520348906517, 0.12343337386846542, -0.013973175548017025, -0.08081797510385513, -0.08077487349510193, 0.08296601474285126, -0.07703111320734024, 0.20549972355365753, -0.008067364804446697, 0.10022126138210297, 0.11432644724845886, 0.019789252430200577, -0.1482621282339096, 0.050972532480955124, 0.09351669996976852, -0.007092977873980999, 0.032963644713163376, 0.12616030871868134, -0.024836527183651924, 0.12198112159967422, 0.06596353650093079, -0.1273246854543686, -0.03631357103586197, 0.021325230598449707, 0.012176924385130405, -0.07192463427782059, -0.00944171566516161, -0.08180490881204605, 0.1658214032649994, 0.1751585751771927, -0.05899908021092415, -0.02534092590212822, -0.057429876178503036, 0.027222150936722755, 0.057224757969379425, 0.07371282577514648, -0.04072178900241852, -0.1923530399799347, 0.020663103088736534, 0.05206664279103279, 0.024585848674178123, -0.22322803735733032, -0.08677606284618378, 0.04788055643439293, -0.04424423351883888, -0.023761186748743057, 0.09970967471599579, 0.04447820410132408, 0.019251231104135513, -0.04794960096478462, -0.08696147799491882, -0.06281213462352753, 0.13544173538684845, -0.14298062026500702, -0.07153546810150146 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# xlm-roberta-base_squad

This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the squad dataset.
- "eval_exact_match": 82.69631031220435
- "eval_f1": 89.4562841806503
- "eval_samples": 10918

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

### Framework versions

- Transformers 4.14.1
- Pytorch 1.9.0
- Datasets 1.16.1
- Tokenizers 0.10.3
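As a quick usage illustration for the card above, here is a minimal sketch of querying this checkpoint with the standard `transformers` question-answering pipeline; the model id is taken from this record (`Palak/xlm-roberta-base_squad`), its availability on the Hub is assumed, and the question/context strings are purely illustrative:

```python
from transformers import pipeline

# Extractive QA with the fine-tuned checkpoint named in this record.
# Assumption: the checkpoint is downloadable from the Hugging Face Hub.
qa = pipeline("question-answering", model="Palak/xlm-roberta-base_squad")

# Illustrative SQuAD-style query: the model returns an answer span from the context.
result = qa(
    question="Which dataset was the model fine-tuned on?",
    context="xlm-roberta-base was fine-tuned on the SQuAD v1.1 question answering dataset.",
)
print(result["answer"], round(result["score"], 3))
```

The returned dict also carries `start`/`end` character offsets into the context, which is what span-level evaluation against the SQuAD annotations typically consumes.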
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "xlm-roberta-base_squad", "results": []}]}
question-answering
Palak/xlm-roberta-base_squad
[ "transformers", "pytorch", "xlm-roberta", "question-answering", "generated_from_trainer", "dataset:squad", "license:mit", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #xlm-roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us
# xlm-roberta-base_squad

This model is a fine-tuned version of xlm-roberta-base on the squad dataset.
- "eval_exact_match": 82.69631031220435
- "eval_f1": 89.4562841806503
- "eval_samples": 10918

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

### Framework versions

- Transformers 4.14.1
- Pytorch 1.9.0
- Datasets 1.16.1
- Tokenizers 0.10.3
[ "# xlm-roberta-base_squad\n\nThis model is a fine-tuned version of xlm-roberta-base on the squad dataset.\n- \"eval_exact_match\": 82.69631031220435\n- \"eval_f1\": 89.4562841806503\n- \"eval_samples\": 10918", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 32\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #xlm-roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n", "# xlm-roberta-base_squad\n\nThis model is a fine-tuned version of xlm-roberta-base on the squad dataset.\n- \"eval_exact_match\": 82.69631031220435\n- \"eval_f1\": 89.4562841806503\n- \"eval_samples\": 10918", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 32\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ 51, 81, 6, 12, 8, 3, 90, 4, 31 ]
[ "passage: TAGS\n#transformers #pytorch #xlm-roberta #question-answering #generated_from_trainer #dataset-squad #license-mit #endpoints_compatible #region-us \n# xlm-roberta-base_squad\n\nThis model is a fine-tuned version of xlm-roberta-base on the squad dataset.\n- \"eval_exact_match\": 82.69631031220435\n- \"eval_f1\": 89.4562841806503\n- \"eval_samples\": 10918## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 32\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ -0.10479459911584854, 0.17437772452831268, -0.0023308966774493456, 0.08952696621417999, 0.138206347823143, 0.030553385615348816, 0.08078014105558395, 0.15244494378566742, -0.061350516974925995, 0.08192287385463715, 0.10958252847194672, 0.07629124075174332, 0.06174332648515701, 0.13154523074626923, -0.035628266632556915, -0.1888715922832489, -0.0008295715088024735, 0.01625140942633152, 0.006529924459755421, 0.10923091322183609, 0.11807827651500702, -0.08945456147193909, 0.08979246765375137, 0.011188860982656479, -0.12284155935049057, 0.0171803031116724, -0.026366116479039192, -0.04363963380455971, 0.0874171182513237, 0.022995134815573692, 0.06944859772920609, -0.021020391955971718, 0.08444603532552719, -0.21471160650253296, -0.0027799480594694614, 0.047702688723802567, 0.023207491263747215, 0.09297030419111252, 0.007935694418847561, 0.003158576786518097, 0.07606370002031326, -0.13840802013874054, 0.08506666123867035, 0.015676002949476242, -0.0912150889635086, -0.16544994711875916, -0.09867580235004425, 0.0630452111363411, 0.08086152374744415, 0.11053895205259323, 0.0016271050553768873, 0.20123893022537231, -0.02745196223258972, 0.06977186352014542, 0.17354562878608704, -0.2761240601539612, -0.0530557744204998, 0.019291294738650322, 0.05822763219475746, 0.05887364596128464, -0.10340529680252075, -0.004142418038100004, 0.03343496471643448, 0.043741628527641296, 0.09654845297336578, -0.022228211164474487, -0.04476148635149002, -0.012780777178704739, -0.11002703756093979, -0.0731382742524147, 0.21727177500724792, 0.07426724582910538, -0.06508485972881317, -0.12368694692850113, -0.03338676318526268, -0.09055446833372116, -0.006014841143041849, -0.03563142195343971, 0.010771487839519978, -0.05902833119034767, -0.06173868477344513, -0.06020282581448555, -0.08760752528905869, -0.03958085924386978, 0.00948577094823122, 0.11814196407794952, 0.020650401711463928, 0.03364643454551697, -0.006420640274882317, 0.08399323374032974, -0.04870973899960518, -0.13885824382305145, -0.027300536632537842, -0.013106215745210648, -0.06250336766242981, -0.047319766134023666, -0.03819675371050835, -0.05941414460539818, -0.007124945055693388, 0.16476255655288696, -0.02641487866640091, 0.045926570892333984, 0.046455834060907364, -0.005646038800477982, 0.005276681389659643, 0.13624981045722961, -0.04136962816119194, -0.08579535037279129, 0.008019961416721344, 0.12077966332435608, 0.028858447447419167, -0.022886035963892937, -0.0908261314034462, -0.02670419216156006, 0.08934499323368073, 0.03683788329362869, 0.00752404285594821, 0.022054964676499367, -0.05404539033770561, -0.05110159143805504, 0.07606708258390427, -0.11076685041189194, 0.03204365074634552, -0.0358656570315361, -0.09130216389894485, -0.09642284363508224, 0.026565302163362503, 0.014630934223532677, -0.03478062152862549, 0.04979439079761505, -0.09147369116544724, -0.019305096939206123, -0.07136212289333344, -0.06939057260751724, 0.0043940977193415165, -0.05331670120358467, 0.020479781553149223, -0.07529392093420029, -0.19053387641906738, -0.0393996499478817, 0.046885959804058075, -0.06448535621166229, -0.0753827691078186, -0.012597808614373207, -0.045456722378730774, 0.021939244121313095, -0.020461799576878548, 0.12906761467456818, -0.048558928072452545, 0.08053164929151535, 0.037881895899772644, 0.011769985780119896, -0.02350895293056965, 0.04904000833630562, -0.09321694821119308, 0.02869059331715107, -0.07060469686985016, 0.07405230402946472, -0.09071298688650131, 0.019594088196754456, -0.12839920818805695, -0.1046157032251358, 0.006893053185194731, 
-0.036803025752305984, 0.08454595506191254, 0.09933236986398697, -0.1052146926522255, -0.006891011260449886, 0.1059454083442688, -0.049407199025154114, -0.12845921516418457, 0.10011027753353119, -0.04866815358400345, 0.02800452522933483, 0.05232294276356697, 0.15309873223304749, 0.13684794306755066, -0.1379421502351761, -0.043182820081710815, 0.02464868314564228, 0.05382672697305679, -0.026298554614186287, 0.09881307184696198, -0.0006069238879717886, 0.03354364261031151, 0.012987678870558739, -0.06928534805774689, -0.002802195493131876, -0.07630135118961334, -0.09391575306653976, -0.04031604155898094, -0.08413895964622498, 0.029804755002260208, 0.04294423386454582, 0.018297232687473297, -0.07957878708839417, -0.1216568723320961, 0.05247003212571144, 0.12315413355827332, -0.02966972626745701, -0.0072094895876944065, -0.07722847908735275, 0.07689466327428818, -0.08514153212308884, -0.04312308132648468, -0.19563215970993042, -0.10340040922164917, 0.048218272626399994, -0.01599770225584507, 0.034589577466249466, 0.024490535259246826, 0.05662829428911209, 0.05904191732406616, -0.046328648924827576, -0.020156998187303543, -0.1000296026468277, -0.009155269712209702, -0.11001690477132797, -0.12394116818904877, -0.07632438838481903, -0.04044007137417793, 0.18871450424194336, -0.1964857429265976, -0.006904918234795332, -0.030416540801525116, 0.10711002349853516, 0.018312232568860054, -0.06799193471670151, 0.011291523464024067, 0.03080080635845661, -0.0064875283278524876, -0.07822299003601074, 0.04568867012858391, 0.017974788323044777, -0.09301971644163132, -0.07652336359024048, -0.12699368596076965, 0.06907407194375992, 0.08042355626821518, 0.07857726514339447, -0.08333352953195572, -0.02535359188914299, -0.05919323489069939, -0.03339807689189911, -0.06342034786939621, -0.03423023223876953, 0.19250236451625824, 0.016400326043367386, 0.12268242239952087, -0.05865911766886711, -0.05200135335326195, 0.01380718033760786, 0.0020918874070048332, -0.024462785571813583, 0.07340151816606522, 0.029614189639687538, -0.18060924112796783, 0.09563814103603363, 0.06824221462011337, -0.017224637791514397, 0.11062554270029068, -0.027658428996801376, -0.09186030179262161, -0.05815272778272629, 0.011637020856142044, -0.008814042434096336, 0.1382046490907669, -0.07030081748962402, 0.012504097074270248, 0.07092158496379852, 0.0026942857075482607, 0.018343761563301086, -0.15404780209064484, -0.018338346853852272, 0.038777630776166916, -0.02809857577085495, -0.021143879741430283, -0.013151293620467186, 0.0015709841391071677, 0.07576334476470947, 0.044844914227724075, -0.029883893206715584, 0.024367880076169968, -0.019743049517273903, -0.068141408264637, 0.16276715695858002, -0.08543473482131958, -0.18241767585277557, -0.15055985748767853, 0.024618087336421013, -0.06929361820220947, -0.011455664411187172, 0.019005367532372475, -0.019105419516563416, -0.05801244080066681, -0.08950542658567429, -0.052186526358127594, -0.061257679015398026, -0.02620433084666729, 0.0460762083530426, -0.004459809046238661, 0.08689135313034058, -0.12621212005615234, -0.003650123719125986, -0.019608212634921074, -0.030091626569628716, -0.010114844888448715, 0.05016988515853882, 0.12365926802158356, 0.04988735541701317, -0.02002432383596897, 0.0235110055655241, -0.03408118709921837, 0.2845809757709503, -0.0779176726937294, -0.03085104562342167, 0.13919539749622345, 0.013384697027504444, 0.06952279061079025, 0.1088317483663559, 0.024961093440651894, -0.08324652910232544, 0.019378745928406715, 0.04308459907770157, -0.025014685466885567, 
-0.2150408923625946, -0.03844337910413742, -0.044262561947107315, -0.09258553385734558, 0.11703210324048996, 0.03931858763098717, 0.02041028067469597, 0.08067132532596588, -0.01731552556157112, 0.04186118021607399, -0.058774396777153015, 0.09365587681531906, 0.10170214623212814, 0.04458468034863472, 0.1038503348827362, -0.02657594159245491, -0.03193271905183792, 0.05786710977554321, 0.029967954382300377, 0.2295333743095398, -0.03127380087971687, 0.14747320115566254, 0.024425750598311424, 0.17546311020851135, -0.05215653032064438, 0.03318626806139946, -0.011482271365821362, 0.009055876173079014, -0.015754377469420433, -0.05467573180794716, -0.07183516770601273, 0.036104172468185425, 0.03111199289560318, 0.05835318937897682, -0.07141134142875671, 0.012320119887590408, -0.002867580857127905, 0.23002249002456665, 0.06412148475646973, -0.3194808065891266, -0.09139136970043182, 0.016202179715037346, -0.008788034319877625, -0.07454439252614975, -0.02440565638244152, 0.10278436541557312, -0.13638116419315338, 0.06310562044382095, -0.05926915630698204, 0.08952277898788452, -0.04865430295467377, -0.0067352354526519775, 0.03731860592961311, 0.08056338131427765, 0.00186909397598356, 0.10566315799951553, -0.17269107699394226, 0.20971259474754333, 0.03250003606081009, 0.0987888053059578, -0.08262214809656143, 0.04319773614406586, -0.0091019282117486, 0.031156137585639954, 0.12439081817865372, 0.0005193891702219844, -0.004060680512338877, -0.1830592006444931, -0.09518566727638245, 0.024801483377814293, 0.06939893960952759, -0.06158261001110077, 0.09578685462474823, -0.03649725764989853, 0.009178084321320057, 0.029286103323101997, 0.01502315979450941, -0.09747031331062317, -0.14467644691467285, 0.045214638113975525, 0.02733287401497364, -0.05233990028500557, -0.06053784117102623, -0.08867041021585464, -0.007196313235908747, 0.17058734595775604, 0.05737167224287987, -0.063776396214962, -0.1325330287218094, 0.09121432155370712, 0.132109597325325, -0.0792604386806488, 0.003157308092340827, -0.00021497736452147365, 0.10897678881883621, 0.023927276954054832, -0.054494451731443405, 0.04791278392076492, -0.05056949332356453, -0.11811728030443192, -0.04914060980081558, 0.15258514881134033, 0.013357797637581825, 0.05879291892051697, 0.013380121439695358, 0.0361567921936512, -0.03091900609433651, -0.08030563592910767, 0.03943326324224472, -0.005987665615975857, 0.11199682205915451, 0.06028914824128151, -0.01990731805562973, 0.0215898510068655, -0.06044397130608559, -0.0011509429896250367, 0.14714325964450836, 0.23122912645339966, -0.08047975599765778, 0.051199786365032196, 0.047566045075654984, -0.05618881806731224, -0.14298015832901, 0.011132769286632538, 0.108412005007267, 0.025735802948474884, 0.09641919285058975, -0.14690659940242767, 0.05728917568922043, 0.07669737935066223, -0.02897396683692932, 0.06859638541936874, -0.2792356312274933, -0.1089690700173378, 0.05887223780155182, 0.11661569029092789, 0.04471822828054428, -0.13469284772872925, -0.06022720783948898, -0.01773417927324772, -0.1840345710515976, 0.12034597992897034, -0.03382940962910652, 0.11452198028564453, -0.00899039302021265, 0.10663776844739914, 0.040907081216573715, -0.048195015639066696, 0.1613728106021881, 0.021496152505278587, 0.05890541523694992, -0.06972528249025345, 0.0009081306052394211, 0.11154873669147491, -0.05995937064290047, 0.1046508252620697, -0.05098697543144226, 0.06659048050642014, -0.2248968482017517, -0.023395277559757233, -0.0681115984916687, 0.055817291140556335, -0.0458819717168808, -0.06415922939777374, 
-0.036626022309064865, 0.05265221744775772, 0.05638203024864197, -0.026728112250566483, 0.10445630550384521, 0.037580907344818115, 0.08812899142503738, 0.13475705683231354, 0.07426010817289352, 0.007141917943954468, -0.14842814207077026, -0.016723476350307465, -0.02054501511156559, 0.05383557453751564, -0.10270513594150543, 0.01022084429860115, 0.11840838938951492, 0.04209768772125244, 0.14124032855033875, 0.005553957540541887, -0.060367465019226074, 0.018890107050538063, 0.035707272589206696, -0.11183758080005646, -0.15938489139080048, -0.024765603244304657, -0.02861064299941063, -0.17184439301490784, -0.0160412285476923, 0.12176641076803207, -0.0409044474363327, -0.02547643706202507, -0.02086670696735382, 0.016914483159780502, -0.012948626652359962, 0.16846390068531036, 0.05207596719264984, 0.06466002762317657, -0.07471353560686111, 0.07612413167953491, 0.0928022637963295, -0.04661474749445915, 0.05775479972362518, 0.010433378629386425, -0.0826495960354805, -0.028107525780797005, 0.012422347441315651, 0.1202419176697731, -0.046188052743673325, -0.024585308507084846, -0.07738476991653442, -0.04450668394565582, 0.03678937256336212, 0.005023575387895107, 0.051868729293346405, -0.03085574321448803, -0.024949558079242706, -0.0026071227621287107, -0.12789835035800934, 0.1088268905878067, 0.026255076751112938, 0.065107561647892, -0.13259568810462952, 0.04881247133016586, -0.002781919203698635, 0.053358640521764755, -0.009629875421524048, 0.00935649685561657, -0.05644036829471588, -0.01304212212562561, -0.10164812952280045, 0.0036900630220770836, -0.0321258082985878, 0.0065969498828053474, -0.03112747147679329, -0.08196436613798141, -0.052975960075855255, 0.05123380571603775, -0.0647595077753067, -0.07344215363264084, 0.033238641917705536, 0.04700566828250885, -0.13717558979988098, -0.03997499868273735, 0.028094787150621414, -0.0722178965806961, 0.08035697788000107, 0.057233069092035294, 0.030140837654471397, -0.01156751811504364, -0.024365929886698723, -0.0018385754665359855, 0.008600151166319847, 0.039666980504989624, 0.06321157515048981, -0.11245718598365784, -0.005494985729455948, -0.01886051520705223, 0.028693392872810364, 0.018886808305978775, 0.07568405568599701, -0.1385485976934433, -0.059642594307661057, -0.03094705194234848, -0.04425911605358124, -0.059818752110004425, 0.05427153408527374, 0.09048847109079361, 0.03662007302045822, 0.17419566214084625, -0.04409391060471535, 0.04866115376353264, -0.20917460322380066, -0.031243445351719856, -0.007481458596885204, -0.04195769131183624, -0.06338568776845932, -0.0399610660970211, 0.07551068812608719, -0.05439632013440132, 0.11124156415462494, 0.00994529202580452, 0.12779627740383148, 0.03348545730113983, 0.010983830317854881, 0.03612741082906723, -0.029042955487966537, 0.16162537038326263, 0.05538839474320412, -0.020595720037817955, 0.1041983962059021, -0.008166809566318989, 0.05064614862203598, 0.05244811624288559, 0.11867153644561768, 0.14191554486751556, 0.057006120681762695, 0.04310624301433563, 0.0629795715212822, -0.07827486842870712, -0.1642398238182068, 0.0348726361989975, 0.0080258222296834, 0.10880140215158463, -0.026596426963806152, 0.13728582859039307, 0.07598292082548141, -0.17168793082237244, 0.06093413755297661, -0.06614233553409576, -0.09939347952604294, -0.06676725298166275, -0.09082668274641037, -0.07376059889793396, -0.07625912874937057, 0.02667386271059513, -0.12219852954149246, 0.020841296762228012, 0.10686913877725601, -0.02249661460518837, -0.018982917070388794, 0.14502331614494324, -0.03792982175946236, 
-0.00577244907617569, 0.038731299340724945, 0.002782098948955536, 0.0017053409246727824, -0.048156749457120895, -0.027359578758478165, 0.05731913074851036, 0.02455107495188713, 0.10212862491607666, -0.04853200167417526, 0.011748936027288437, 0.026825612410902977, 0.0028642690740525723, -0.0910184234380722, -0.004953326657414436, 0.033115826547145844, 0.053573574870824814, 0.058842044323682785, 0.04002983495593071, 0.03751925751566887, -0.04557356610894203, 0.25711020827293396, -0.04528864845633507, -0.04925590381026268, -0.12620724737644196, 0.12358224391937256, 0.029510196298360825, -0.018069151788949966, 0.06770601123571396, -0.11016833782196045, 0.014580456539988518, 0.13599489629268646, 0.14200705289840698, -0.0759558230638504, -0.024402085691690445, 0.005062539130449295, -0.008500325493514538, -0.03655751422047615, 0.07883115112781525, 0.08297861367464066, 0.026996225118637085, -0.06945851445198059, 0.01207997277379036, -0.02743816375732422, -0.01726868376135826, -0.0593208447098732, 0.0546719953417778, 0.02563469484448433, 0.020573830232024193, -0.03138568252325058, 0.06201951950788498, 0.04043005406856537, -0.19831395149230957, 0.030657047405838966, -0.18929007649421692, -0.1952703297138214, -0.00768949743360281, 0.07791660726070404, 0.008250320330262184, 0.06461495906114578, -0.010910608805716038, 0.018886858597397804, 0.11309577524662018, -0.01864137314260006, -0.07290718704462051, -0.09694182872772217, 0.10633667558431625, -0.0675656720995903, 0.2164604663848877, -0.009403887204825878, 0.09082676470279694, 0.11108756065368652, 0.0015281320083886385, -0.14058980345726013, 0.046895477920770645, 0.0920598953962326, -0.01825701631605625, 0.06058810278773308, 0.14747868478298187, -0.026768147945404053, 0.10673195868730545, 0.045015934854745865, -0.10333618521690369, -0.036479853093624115, -0.031467292457818985, 0.0185961052775383, -0.06497985124588013, -0.013415439054369926, -0.07543216645717621, 0.17640461027622223, 0.18223972618579865, -0.0577593594789505, -0.02534375712275505, -0.07213529944419861, 0.026792120188474655, 0.07213761657476425, 0.08733297139406204, -0.0482979416847229, -0.17889928817749023, 0.006101495120674372, 0.047309692949056625, 0.03142130374908447, -0.27351540327072144, -0.0899609625339508, 0.0479351244866848, -0.055663831532001495, -0.016584988683462143, 0.09622549265623093, 0.03742944449186325, 0.02232319302856922, -0.04811583831906319, -0.09313704073429108, -0.0662337988615036, 0.12200135737657547, -0.14187899231910706, -0.056449104100465775 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# eval

This model is a fine-tuned version of [xlm-roberta-large](https://huggingface.co/xlm-roberta-large) on the squad dataset.

- eval_exact_match": 85.96026490066225
- "eval_f1": 92.25000664341768
- "eval_samples": 10918

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 12
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 0.67

### Framework versions

- Transformers 4.14.1
- Pytorch 1.9.0
- Datasets 1.16.1
- Tokenizers 0.10.3
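For readers who want to approximate this run, the sketch below shows how the hyperparameters listed above might be expressed with `transformers.TrainingArguments`; it is not the author's actual training script, and the output directory plus the model and tokenized SQuAD datasets you would pass to a `Trainer` are placeholders:

```python
from transformers import TrainingArguments

# Hyperparameters copied from the card above; all other settings stay at their defaults.
args = TrainingArguments(
    output_dir="xlm-roberta-large_squad",  # placeholder output path
    learning_rate=3e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=0.67,
)
# `args` would then be handed to Trainer(model=..., args=args, train_dataset=..., eval_dataset=...)
# together with a tokenized SQuAD train/validation split and a suitable data collator.
```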
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "xlm-roberta-base_squad", "results": []}]}
question-answering
Palak/xlm-roberta-large_squad
[ "transformers", "pytorch", "xlm-roberta", "question-answering", "generated_from_trainer", "dataset:squad", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #xlm-roberta #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
# eval

This model is a fine-tuned version of xlm-roberta-large on the squad dataset.

- eval_exact_match": 85.96026490066225
- "eval_f1": 92.25000664341768
- "eval_samples": 10918

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 12
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 0.67

### Framework versions

- Transformers 4.14.1
- Pytorch 1.9.0
- Datasets 1.16.1
- Tokenizers 0.10.3
[ "# eval\n\nThis model is a fine-tuned version of xlm-roberta-large on the squad dataset.\n\n- eval_exact_match\": 85.96026490066225\n- \"eval_f1\": 92.25000664341768\n- \"eval_samples\": 10918", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 12\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 0.67", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #xlm-roberta #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n", "# eval\n\nThis model is a fine-tuned version of xlm-roberta-large on the squad dataset.\n\n- eval_exact_match\": 85.96026490066225\n- \"eval_f1\": 92.25000664341768\n- \"eval_samples\": 10918", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 12\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 0.67", "### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ 46, 72, 6, 12, 8, 3, 92, 31 ]
[ "passage: TAGS\n#transformers #pytorch #xlm-roberta #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# eval\n\nThis model is a fine-tuned version of xlm-roberta-large on the squad dataset.\n\n- eval_exact_match\": 85.96026490066225\n- \"eval_f1\": 92.25000664341768\n- \"eval_samples\": 10918## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 12\n- eval_batch_size: 32\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 0.67### Framework versions\n\n- Transformers 4.14.1\n- Pytorch 1.9.0\n- Datasets 1.16.1\n- Tokenizers 0.10.3" ]
[ -0.09965424239635468, 0.14388611912727356, -0.0037921529728919268, 0.09797785431146622, 0.12629343569278717, 0.021869201213121414, 0.09455336630344391, 0.15310388803482056, -0.07383833825588226, 0.0841752141714096, 0.07840606570243835, 0.059349268674850464, 0.05647196248173714, 0.11661789566278458, -0.036075957119464874, -0.20056015253067017, -0.007974841631948948, 0.018992245197296143, -0.037223298102617264, 0.09450800716876984, 0.10926640033721924, -0.11344214528799057, 0.08791867643594742, 0.002565639093518257, -0.12215451151132584, 0.05187268182635307, -0.013765688985586166, -0.04931831359863281, 0.09464855492115021, 0.013712745159864426, 0.0878768041729927, -0.019669461995363235, 0.11318271607160568, -0.23135778307914734, -0.003227734938263893, 0.059979550540447235, 0.030768632888793945, 0.08908314257860184, 0.03838132321834564, -0.015112873166799545, 0.09806209057569504, -0.15649662911891937, 0.08171562850475311, 0.02188829332590103, -0.09034498035907745, -0.17309369146823883, -0.09643572568893433, 0.07495661079883575, 0.08940301090478897, 0.10332237184047699, 0.0104463966563344, 0.1709396094083786, -0.054188072681427, 0.0909658819437027, 0.24783267080783844, -0.25691384077072144, -0.049219220876693726, 0.025572583079338074, 0.051887206733226776, 0.05311036482453346, -0.09698663651943207, -0.003824786050245166, 0.053948741406202316, 0.02894730307161808, 0.09888283908367157, -0.023020513355731964, -0.09012347459793091, 0.012856896966695786, -0.11969941109418869, -0.057551946491003036, 0.19719378650188446, 0.05026568844914436, -0.03900526463985443, -0.09872118383646011, -0.0610077828168869, -0.10881651192903519, 0.009692584164440632, -0.05629380792379379, 0.029935110360383987, -0.07021239399909973, -0.07008720934391022, -0.06118902191519737, -0.07316434383392334, -0.06688529253005981, -0.013260999694466591, 0.11485647410154343, 0.04891733080148697, 0.03962467610836029, -0.03217139467597008, 0.09837762266397476, -0.04458027705550194, -0.12564750015735626, -0.03516918048262596, 0.002491361228749156, -0.07149320840835571, -0.042324986308813095, -0.027784908190369606, -0.03574640303850174, -0.013950335793197155, 0.18225418031215668, -0.08384653180837631, 0.03987986594438553, 0.049748580902814865, 0.0008729240507818758, -0.011616374365985394, 0.16304932534694672, -0.02848806232213974, -0.03566326946020126, -0.009992574341595173, 0.08123549818992615, 0.010471755638718605, -0.0058405338786542416, -0.07654466480016708, -0.051189299672842026, 0.06591237336397171, 0.05829794332385063, -0.02409488521516323, 0.009461729787290096, -0.03127824515104294, -0.03064202517271042, 0.013295999728143215, -0.13339217007160187, 0.05292220786213875, -0.028778016567230225, -0.10741052031517029, -0.03858854994177818, 0.025956276804208755, -0.0019926868844777346, -0.05368639528751373, 0.045323941856622696, -0.08042419701814651, 0.002851393073797226, -0.08660784363746643, -0.08590424805879593, 0.014428620226681232, -0.09924875944852829, 0.00739662628620863, -0.06716229766607285, -0.20201456546783447, -0.03553827106952667, 0.029473425820469856, -0.05641037970781326, -0.029821792617440224, -0.029991699382662773, -0.04671071097254753, 0.002935258438810706, -0.009609502740204334, 0.09297198802232742, -0.03979527950286865, 0.07105068117380142, 0.04545227065682411, 0.0482347197830677, 0.03533457964658737, 0.04280856251716614, -0.09486449509859085, 0.024943552911281586, -0.0917709469795227, 0.0609840452671051, -0.08114074170589447, 0.0010810729581862688, -0.11056828498840332, -0.1114802286028862, 0.0032114076893776655, 
-0.025319837033748627, 0.07906594127416611, 0.10683052241802216, -0.13584290444850922, -0.025818699970841408, 0.15066871047019958, -0.06179849058389664, -0.09898416697978973, 0.09609916061162949, -0.05037353187799454, -0.009001946076750755, 0.05685258284211159, 0.15467530488967896, 0.10137581825256348, -0.15318448841571808, -0.04548294469714165, 0.02724464237689972, 0.03789437562227249, -0.005722668953239918, 0.07883083820343018, -0.006289975252002478, 0.06214095652103424, 0.01007096003741026, -0.062396228313446045, -0.019471747800707817, -0.09306848049163818, -0.08564659208059311, -0.06743599474430084, -0.0750582292675972, 0.030756628140807152, 0.03684920817613602, 0.022002017125487328, -0.09079615026712418, -0.12607041001319885, 0.05966904014348984, 0.13354159891605377, -0.042933665215969086, 0.010010605677962303, -0.07819204032421112, 0.05859350040555, -0.05737368017435074, -0.04287794604897499, -0.18522094190120697, -0.10229161381721497, 0.04108797013759613, -0.04124944284558296, 0.029959091916680336, 0.04608917981386185, 0.06496375054121017, 0.06272494792938232, -0.04271458461880684, -0.027403954416513443, -0.11780208349227905, -0.025681713595986366, -0.10642474889755249, -0.13742266595363617, -0.06852985918521881, -0.030437983572483063, 0.09047702699899673, -0.1925864964723587, 0.014843655750155449, -0.005440332926809788, 0.13690797984600067, 0.027036061510443687, -0.050463009625673294, -0.015180543065071106, 0.040691766887903214, -0.02480349875986576, -0.08289923518896103, 0.029418347403407097, -0.018998602405190468, -0.06454098224639893, -0.07444349676370621, -0.1261911243200302, 0.06893736869096756, 0.07452914118766785, 0.0490429624915123, -0.0771879106760025, -0.018757596611976624, -0.06754487752914429, -0.030546212568879128, -0.08655118942260742, -0.016728652641177177, 0.20256587862968445, 0.0067832437343895435, 0.1331598311662674, -0.07095049321651459, -0.07453250139951706, -0.0009747206931933761, -0.004952389746904373, -0.008764748461544514, 0.09760241210460663, 0.06113884970545769, -0.06104227527976036, 0.09481977671384811, 0.06934016942977905, -0.0310853011906147, 0.11983827501535416, -0.04246605187654495, -0.10667268931865692, -0.04800058901309967, -0.002277299528941512, -0.018292129039764404, 0.12431330978870392, -0.07254377752542496, -0.0032849377021193504, 0.05337785184383392, 0.014483748935163021, 0.03473936393857002, -0.17844808101654053, -0.012499233707785606, 0.04462028667330742, -0.04510389640927315, -0.013427275232970715, -0.008825142867863178, 0.02895238809287548, 0.0890633687376976, 0.019443310797214508, -0.0196803268045187, 0.004053918179124594, -0.015145744197070599, -0.08115297555923462, 0.16785629093647003, -0.08496104925870895, -0.1639440357685089, -0.15424615144729614, 0.06340578198432922, -0.05817091464996338, -0.019678890705108643, 0.024085957556962967, -0.07376952469348907, -0.05970996990799904, -0.087957002222538, 0.0030347800347954035, -0.04813123866915703, -0.023142997175455093, 0.07004017382860184, 0.021912284195423126, 0.07676547020673752, -0.152236208319664, -0.00415817042812705, -0.017824549227952957, -0.08132471889257431, -0.013040976598858833, 0.06211650371551514, 0.13521547615528107, 0.0838424414396286, -0.04366124048829079, 0.039319802075624466, -0.034070372581481934, 0.23495425283908844, -0.0738496407866478, -0.02529272809624672, 0.13504748046398163, 0.030192403122782707, 0.06325017660856247, 0.07795799523591995, 0.026615163311362267, -0.07882299274206161, 0.020107779651880264, 0.0676758661866188, -0.024425098672509193, -0.26494547724723816, 
-0.033147916197776794, -0.017914440482854843, -0.047960106283426285, 0.0979495570063591, 0.041597556322813034, 0.030743153765797615, 0.059453852474689484, -0.014472209848463535, 0.031307440251111984, -0.052626073360443115, 0.09182821214199066, 0.1003810316324234, 0.028504759073257446, 0.09283097833395004, -0.0392264798283577, -0.046240974217653275, 0.05837620049715042, 0.016169549897313118, 0.26217377185821533, -0.02533559314906597, 0.1232130229473114, 0.019228238612413406, 0.14791657030582428, -0.05327644944190979, 0.04150877147912979, 0.0026665315963327885, 0.004196523688733578, 0.006302088964730501, -0.054861076176166534, -0.026440991088747978, 0.04607589542865753, -0.017400743439793587, 0.07515385001897812, -0.10430856049060822, 0.06376156955957413, 0.029876038432121277, 0.25072741508483887, 0.0379486046731472, -0.29054775834083557, -0.07995440810918808, 0.01843317784368992, -0.03457148000597954, -0.04942145198583603, 0.007620209362357855, 0.1344442516565323, -0.13131749629974365, 0.03850691020488739, -0.03725169971585274, 0.10265693813562393, -0.036135267466306686, -0.004771423526108265, 0.02612871676683426, 0.11227945983409882, 0.008458433672785759, 0.10837801545858383, -0.17797207832336426, 0.22141191363334656, 0.030916545540094376, 0.10094038397073746, -0.07165670394897461, 0.04256156086921692, -0.013519872911274433, 0.0382901206612587, 0.13403311371803284, 0.0007338913856074214, -0.06416631489992142, -0.1934312880039215, -0.07604842633008957, 0.023645978420972824, 0.11261126399040222, -0.054328061640262604, 0.10539958626031876, -0.04144659265875816, 0.001568174222484231, 0.03327275067567825, -0.050552595406770706, -0.11251942068338394, -0.11618310213088989, 0.019772255793213844, 0.016081983223557472, -0.058838699012994766, -0.06682343780994415, -0.09468962997198105, -0.044790059328079224, 0.15561144053936005, -0.008021168410778046, -0.05079035833477974, -0.1265728622674942, 0.07392153143882751, 0.13819506764411926, -0.08462495356798172, 0.0055466266348958015, 0.02715955674648285, 0.12369931489229202, 0.017774956300854683, -0.07125792652368546, 0.040514253079891205, -0.05958675965666771, -0.14142492413520813, -0.043361663818359375, 0.15174651145935059, 0.034089360386133194, 0.05789555236697197, -0.0014997035032138228, 0.034407466650009155, -0.015225935727357864, -0.08982853591442108, 0.044271524995565414, 0.04971318319439888, 0.0932309702038765, 0.07118535041809082, -0.06454915553331375, 0.04922373220324516, -0.05863822624087334, -0.005991159472614527, 0.1424945592880249, 0.21337097883224487, -0.09781961143016815, 0.0554191954433918, 0.024535883218050003, -0.08845929801464081, -0.15416593849658966, 0.05166522040963173, 0.0847577154636383, 0.025052107870578766, 0.07042744010686874, -0.1635049730539322, 0.0828346312046051, 0.10498782992362976, -0.006105172447860241, 0.05193527787923813, -0.32559362053871155, -0.11382332444190979, 0.06058480218052864, 0.0915961042046547, 0.018292786553502083, -0.13329079747200012, -0.0379018560051918, -0.012040426954627037, -0.17730581760406494, 0.0770472064614296, -0.058514248579740524, 0.10069090873003006, -0.02897053211927414, 0.10515894740819931, 0.03647968918085098, -0.04041265696287155, 0.14296026527881622, 0.052478574216365814, 0.09789137542247772, -0.05097837746143341, -0.007982837967574596, 0.09299387782812119, -0.07061460614204407, 0.10418017208576202, -0.0293642058968544, 0.07600443065166473, -0.20140022039413452, -0.027560871094465256, -0.05297238752245903, 0.07119609415531158, -0.04105238988995552, -0.056474193930625916, 
-0.06099753454327583, 0.027246534824371338, 0.05178820714354515, -0.026151753962039948, 0.10125096887350082, 0.04034101217985153, 0.10240015387535095, 0.11661697924137115, 0.07566270232200623, -0.004113596864044666, -0.1489168107509613, 0.008230257779359818, -0.003200976410880685, 0.053992561995983124, -0.11250920593738556, 0.03565431013703346, 0.156956747174263, 0.052627041935920715, 0.14214837551116943, 0.021373510360717773, -0.041561443358659744, 0.00921147782355547, 0.020601684227585793, -0.11829373240470886, -0.16059036552906036, -0.031785499304533005, -0.06595008820295334, -0.13452333211898804, 0.028730522841215134, 0.10066543519496918, -0.05138632282614708, -0.01961594633758068, -0.029651610180735588, 0.014911443926393986, -0.0096273273229599, 0.17448624968528748, 0.057894378900527954, 0.06554841250181198, -0.07201789319515228, 0.1095733791589737, 0.06655177474021912, -0.07111744582653046, 0.06483203917741776, 0.06337901204824448, -0.059435173869132996, -0.019103003665804863, 0.03545767441391945, 0.1618821769952774, -0.07395005226135254, -0.0379558764398098, -0.12078553438186646, -0.06450565159320831, 0.055847201496362686, 0.0937371701002121, 0.05396322160959244, -0.03199293091893196, -0.018598737195134163, 0.021379465237259865, -0.15003608167171478, 0.10797489434480667, 0.06184094026684761, 0.05680501461029053, -0.14179272949695587, 0.12375126779079437, 0.004818765912204981, 0.06232025474309921, -0.011438877321779728, 0.01427198015153408, -0.06775602698326111, -0.0020600431598722935, -0.12445643544197083, -0.011671490967273712, -0.02596629038453102, 0.002323115710169077, -0.02134610339999199, -0.075033999979496, -0.05149320513010025, 0.06162744387984276, -0.06702933460474014, -0.053992725908756256, 0.01967627927660942, 0.047730136662721634, -0.14915840327739716, -0.028142206370830536, 0.03057301603257656, -0.07277419418096542, 0.06734886765480042, 0.06029433012008667, 0.034264590591192245, 0.002681342652067542, -0.05915210396051407, -0.0023896016646176577, 0.016929635778069496, 0.044209957122802734, 0.06437436491250992, -0.104976125061512, 0.0071821375750005245, -0.02295822650194168, 0.04645664617419243, 0.00779250543564558, 0.012359539978206158, -0.13391798734664917, -0.0341862291097641, -0.04389910027384758, -0.06424351036548615, -0.06619534641504288, 0.05435336381196976, 0.11576972156763077, 0.030683767050504684, 0.1487141251564026, -0.06112152710556984, 0.07546496391296387, -0.21878324449062347, -0.019988905638456345, 0.005473883356899023, -0.02962924726307392, -0.06413266807794571, -0.033393923193216324, 0.07442989945411682, -0.06789400428533554, 0.13098211586475372, 0.022850723937153816, 0.10965994745492935, 0.0446278341114521, -0.03837451711297035, -0.00808197446167469, -0.016386471688747406, 0.1528468132019043, 0.04830338805913925, -0.02144460752606392, 0.08673936128616333, -0.009758107364177704, 0.08221827447414398, 0.03268237039446831, 0.14948877692222595, 0.15338006615638733, 0.012410483323037624, 0.05832146108150482, 0.08245296031236649, -0.11059843748807907, -0.1569247990846634, 0.04149690642952919, 0.0005256379954516888, 0.08783949166536331, -0.04149820655584335, 0.1439346969127655, 0.11075364798307419, -0.173762246966362, 0.06853623688220978, -0.03325509652495384, -0.09938432276248932, -0.11539488285779953, -0.03736788406968117, -0.08697066456079483, -0.10810129344463348, 0.039248693734407425, -0.13388510048389435, 0.03991005942225456, 0.08418755233287811, -0.005699945613741875, -0.011006536893546581, 0.15350259840488434, -0.031864095479249954, 0.009703544899821281, 
0.034343697130680084, 0.014668220654129982, 0.013623116537928581, -0.05097686126828194, -0.026862911880016327, 0.06075800210237503, -0.011215072125196457, 0.08577772229909897, -0.03608234226703644, 0.012269451282918453, 0.010665188543498516, -0.011581113561987877, -0.06696520000696182, -0.008005780167877674, 0.020844152197241783, 0.03794883191585541, 0.03954874351620674, 0.05585853382945061, 0.021306484937667847, -0.05379035323858261, 0.2556677460670471, -0.07583636045455933, -0.04889342561364174, -0.11928801238536835, 0.16488872468471527, 0.06229611486196518, -0.00015633909788448364, 0.07043275982141495, -0.11945576965808868, -0.015036621131002903, 0.14981594681739807, 0.13407109677791595, -0.016677526757121086, -0.018191512674093246, -0.012429669499397278, -0.007729979697614908, -0.03585236519575119, 0.049306996166706085, 0.10166996717453003, 0.049932315945625305, -0.060833971947431564, -0.007424760609865189, -0.015208213590085506, -0.009012526832520962, -0.06524688750505447, 0.06290074437856674, 0.01550149917602539, 0.02139478363096714, -0.027808254584670067, 0.04606457054615021, 0.029535632580518723, -0.1738574355840683, 0.06504476070404053, -0.1886756271123886, -0.16433891654014587, 0.00728149339556694, 0.08426076173782349, -0.004385834559798241, 0.06807973980903625, -0.015086838975548744, -0.006214050110429525, 0.14794234931468964, -0.0070631252601742744, -0.11032143980264664, -0.10728010535240173, 0.09422757476568222, -0.09365864843130112, 0.20781975984573364, -0.008546741679310799, 0.0989469587802887, 0.10090561211109161, -0.004908276721835136, -0.12883639335632324, 0.04256521910429001, 0.0811791643500328, -0.04501672461628914, 0.016442641615867615, 0.1496782749891281, -0.030670704320073128, 0.11629660427570343, 0.04952828958630562, -0.12539933621883392, -0.03919875621795654, -0.0039061899296939373, -0.002495277440175414, -0.08298265933990479, -0.006082562729716301, -0.065446637570858, 0.15248623490333557, 0.20324935019016266, -0.03238614276051521, -0.0012322327820584178, -0.07160021364688873, 0.03486165776848793, 0.06752431392669678, 0.07838846743106842, -0.011925132013857365, -0.17504140734672546, 0.02708878554403782, 0.05083385854959488, 0.017316283658146858, -0.24502715468406677, -0.09998246282339096, 0.05028045177459717, -0.06419192254543304, -0.03340332955121994, 0.08516477793455124, 0.0768619179725647, 0.04486144706606865, -0.03714514151215553, -0.11925759166479111, -0.041341137140989304, 0.13923411071300507, -0.14777354896068573, -0.041933607310056686 ]
null
null
transformers
#Harry Potter AI bot
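The card for this record is only a title, so as a hedged illustration here is the standard DialoGPT-style single-turn chat loop applied to the `Paradocx/Dialogpt-mid-hpai` checkpoint named below in this record; the prompt is invented for the example and nothing in the snippet comes from the original card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Checkpoint id taken from this record; availability on the Hub is assumed.
model_id = "Paradocx/Dialogpt-mid-hpai"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Encode one user turn, ending with the EOS token as DialoGPT-style models expect.
user_input = "Hello, which Hogwarts house are you in?"  # illustrative prompt
input_ids = tokenizer.encode(user_input + tokenizer.eos_token, return_tensors="pt")

# Generate the bot's continuation and decode only the newly produced tokens.
reply_ids = model.generate(input_ids, max_length=100, pad_token_id=tokenizer.eos_token_id)
reply = tokenizer.decode(reply_ids[:, input_ids.shape[-1]:][0], skip_special_tokens=True)
print(reply)
```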
{"tags": ["conversational"]}
text-generation
Paradocx/Dialogpt-mid-hpai
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Harry Potter AI bot
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.009697278961539268, 0.03208012506365776, -0.007204889785498381, 0.004809224978089333, 0.16726240515708923, 0.014898733235895634, 0.09765533357858658, 0.13672804832458496, -0.007841327227652073, -0.031050153076648712, 0.14490588009357452, 0.20411323010921478, -0.006439372431486845, 0.0661218985915184, -0.07572533935308456, -0.2683109939098358, 0.05759621039032936, 0.046649303287267685, 0.016515716910362244, 0.1200079694390297, 0.08573378622531891, -0.05473608896136284, 0.08714032918214798, -0.014583407901227474, -0.150366872549057, 0.017733458429574966, 0.043394338339567184, -0.12260226160287857, 0.11910516023635864, 0.05462685227394104, 0.07063519209623337, 0.014929565601050854, -0.07541623711585999, -0.1631229966878891, 0.03031250834465027, 0.01425902172923088, -0.0594632662832737, 0.04757995903491974, 0.059961482882499695, -0.10165371745824814, 0.10819483548402786, 0.09530027210712433, -0.013078106567263603, 0.06798283755779266, -0.16849711537361145, -0.020869607105851173, -0.01446688175201416, 0.009899779222905636, 0.05550243332982063, 0.09964893013238907, -0.03413357585668564, 0.10497362166643143, -0.09214533120393753, 0.11017382889986038, 0.10932035744190216, -0.32057443261146545, -0.005767723545432091, 0.09167823940515518, 0.039358653128147125, 0.07352814823389053, -0.04467793554067612, 0.06258884817361832, 0.018015462905168533, 0.017986174672842026, -0.014015024527907372, -0.07283061742782593, -0.11612214148044586, 0.04717336222529411, -0.08668071031570435, -0.059868961572647095, 0.2244078367948532, -0.05464440956711769, 0.06881742179393768, -0.05281897634267807, -0.10522868484258652, -0.04308144748210907, -0.029833965003490448, 0.00475557055324316, -0.07660607248544693, 0.08692064881324768, 0.00869679357856512, -0.09547875821590424, -0.1376667022705078, -0.02496783249080181, -0.1776352822780609, 0.16140350699424744, 0.02465328387916088, 0.05232657864689827, -0.2027255892753601, 0.09623090922832489, 0.017906051129102707, -0.08045592904090881, 0.022091427817940712, -0.10046248883008957, 0.029131146147847176, 0.013760408386588097, -0.04754498973488808, -0.061387211084365845, 0.0843690037727356, 0.11199145019054413, -0.01731434464454651, 0.025486016646027565, -0.039331406354904175, 0.08100687712430954, 0.03553595021367073, 0.09077847748994827, 0.007288969587534666, -0.028338588774204254, 0.025842782109975815, -0.13719046115875244, -0.003647835226729512, -0.07116208970546722, -0.16572439670562744, -0.021088803187012672, 0.02994808368384838, 0.08289173990488052, 0.015449047088623047, 0.11682453751564026, -0.03272046521306038, -0.025152435526251793, 0.03602350503206253, -0.047656361013650894, -0.012649794109165668, 0.016648368909955025, 0.013163427822291851, 0.12399329990148544, -0.0022096503525972366, 0.03235051408410072, -0.13653022050857544, 0.031423524022102356, -0.06793295592069626, -0.003740974934771657, -0.03486552834510803, -0.040637075901031494, 0.009043924510478973, -0.06862333416938782, 0.003486064961180091, -0.15030112862586975, -0.15063877403736115, 0.007587034720927477, -0.007836631499230862, -0.04107699543237686, -0.06370922178030014, -0.06952770054340363, -0.013550350442528725, 0.04251532256603241, -0.07093454152345657, -0.011352915316820145, -0.06403283774852753, 0.11004766076803207, -0.03197755664587021, 0.07921615242958069, -0.11953279376029968, 0.08390819281339645, -0.11260783672332764, -0.02386913076043129, -0.060801517218351364, 0.09317506104707718, -0.0006014376995153725, 0.09549830108880997, -0.006563255097717047, -0.017931854352355003, -0.07981178909540176, 
0.06445012241601944, -0.042872510850429535, 0.21701598167419434, -0.0615808479487896, -0.11181682348251343, 0.28781595826148987, -0.052628401666879654, -0.1370542049407959, 0.11647392809391022, 0.008682746440172195, 0.05777018144726753, 0.10703510791063309, 0.19733482599258423, -0.015276194550096989, 0.004040541127324104, 0.09471915662288666, 0.11263324320316315, -0.11276852339506149, -0.033160366117954254, 0.013019153848290443, -0.04081077128648758, -0.10867965966463089, 0.04689536616206169, 0.09810488671064377, 0.07090286910533905, -0.04786505550146103, -0.03377414867281914, -0.01366397924721241, 0.0052589005790650845, 0.08885077387094498, -0.007157256826758385, 0.10962837189435959, -0.05819983780384064, -0.03796621412038803, -0.029282379895448685, -0.012126247398555279, -0.03951939567923546, 0.03137664496898651, -0.043376367539167404, 0.10821941494941711, -0.011204327456653118, 0.06364280730485916, -0.16185984015464783, -0.07691477984189987, -0.017002692446112633, 0.1581239402294159, 0.024538565427064896, 0.09859629720449448, 0.0552486926317215, -0.040398042649030685, -0.0012767292791977525, 0.012792680412530899, 0.15581141412258148, -0.022091681137681007, -0.065607450902462, -0.052166227251291275, 0.08642971515655518, -0.05641226842999458, 0.04504093527793884, -0.05937713757157326, 0.012367865070700645, 0.05064384639263153, 0.10342344641685486, -0.00018274025933351368, 0.03323284164071083, -0.008164864964783192, 0.002145637758076191, -0.058205123990774155, 0.007405933458358049, 0.10799351334571838, 0.00036868182360194623, -0.07365862280130386, 0.22074243426322937, -0.17796069383621216, 0.1765957772731781, 0.1893044263124466, -0.299345999956131, 0.017949223518371582, -0.10759581625461578, -0.04561871662735939, 0.014407722279429436, 0.05567655712366104, -0.0454222597181797, 0.1703362911939621, -0.009871348738670349, 0.18874616920948029, -0.04946064203977585, -0.04464937001466751, -0.0200483538210392, -0.05118836089968681, -0.0024189651012420654, 0.07781197130680084, 0.10685696452856064, -0.13992026448249817, 0.1964332014322281, 0.1621224284172058, 0.048237916082143784, 0.19945049285888672, 0.015346456319093704, -0.011589210480451584, 0.0909530371427536, 0.005220826715230942, -0.058739423751831055, -0.07409929484128952, -0.2594851851463318, -0.030033592134714127, 0.07992640137672424, 0.0422382652759552, 0.1212305948138237, -0.11349532753229141, -0.038956157863140106, -0.01763172075152397, -0.023146281018853188, 0.021672505885362625, 0.0914369598031044, 0.06075398623943329, 0.13201528787612915, -0.001710098935291171, -0.007300339173525572, 0.10524573177099228, 0.01783694699406624, -0.09354141354560852, 0.18308524787425995, -0.13652534782886505, -0.37097251415252686, -0.13911493122577667, -0.18057456612586975, -0.05449081212282181, 0.05712554603815079, 0.11679314076900482, -0.12011238187551498, -0.018752124160528183, 0.01578843593597412, 0.10931742936372757, -0.08449502289295197, 0.0021454424131661654, -0.06880278885364532, 0.0321490578353405, -0.10310184955596924, -0.09194442629814148, -0.055416494607925415, -0.031392451375722885, -0.08001253753900528, 0.1423761546611786, -0.10777941346168518, 0.04476889222860336, 0.20262959599494934, 0.04653622955083847, 0.05625178664922714, -0.044105201959609985, 0.19377262890338898, -0.11264272034168243, -0.01661740615963936, 0.19215328991413116, -0.048360925167798996, 0.07476246356964111, 0.1232115849852562, -0.006348740309476852, -0.08765771239995956, 0.03011748194694519, -0.02085109055042267, -0.07988511025905609, -0.23219464719295502, 
-0.13938382267951965, -0.12429051846265793, 0.09477275609970093, 0.028005298227071762, 0.056365787982940674, 0.17219258844852448, 0.06577219814062119, -0.038416244089603424, 0.006410336587578058, 0.02959546446800232, 0.08237514644861221, 0.23417828977108002, -0.06035616248846054, 0.1364797055721283, -0.03420931473374367, -0.14982740581035614, 0.08169995993375778, 0.0713929831981659, 0.10213395953178406, 0.06678459793329239, 0.0804823637008667, 0.0149586396291852, 0.06188136339187622, 0.1311223804950714, 0.08191446959972382, 0.019586285576224327, -0.02480296604335308, -0.03388110175728798, -0.025523077696561813, -0.05937909707427025, 0.040128443390131, 0.06589099019765854, -0.16763372719287872, -0.039227183908224106, -0.09338314831256866, 0.09657008945941925, 0.0873042419552803, 0.06609832495450974, -0.1842060089111328, -0.008006223477423191, 0.08488986641168594, -0.03854905813932419, -0.13727426528930664, 0.09535189718008041, 0.01523482333868742, -0.15144726634025574, 0.03139317408204079, -0.04061909019947052, 0.12188644707202911, -0.07804752141237259, 0.09809603542089462, -0.08108244836330414, -0.07448557764291763, 0.02123199962079525, 0.1261177361011505, -0.30527687072753906, 0.20240111649036407, -0.0024993624538183212, -0.06486981362104416, -0.1243603527545929, -0.0032166161108762026, 0.002410882618278265, 0.07357452809810638, 0.10519039630889893, -0.007196315098553896, 0.001897757756523788, -0.06300821900367737, -0.01829923689365387, 0.032471053302288055, 0.13080233335494995, -0.0401318334043026, -0.021158374845981598, -0.050194524228572845, -0.001653497340157628, -0.03173094615340233, -0.06934895366430283, 0.02002747356891632, -0.19509181380271912, 0.08751901984214783, 0.04166261479258537, 0.09648149460554123, 0.029994789510965347, 0.004265148192644119, -0.09651939570903778, 0.24698667228221893, -0.07148019969463348, -0.10072879493236542, -0.10919588059186935, -0.046813901513814926, 0.03569883480668068, -0.05628936365246773, 0.04309194162487984, -0.0788632407784462, 0.028997479006648064, -0.06352769583463669, -0.19235502183437347, 0.12410202622413635, -0.09027006477117538, -0.04412810131907463, -0.02371402643620968, 0.2110891044139862, -0.05598580464720726, 0.010335659608244896, 0.02930437959730625, 0.01208863127976656, -0.11645778268575668, -0.09678568691015244, 0.031018631532788277, -0.007351789623498917, 0.050603240728378296, 0.041841957718133926, -0.05915454775094986, -0.017138581722974777, -0.052199993282556534, -0.022926922887563705, 0.3496883809566498, 0.14231905341148376, -0.043836336582899094, 0.19347235560417175, 0.12347975373268127, -0.07452994585037231, -0.3159443140029907, -0.1066238060593605, -0.10937739163637161, -0.04680149629712105, -0.07012093812227249, -0.2002030611038208, 0.06474938243627548, 0.00662544509395957, -0.013415241613984108, 0.12749312818050385, -0.2561831772327423, -0.07571036368608475, 0.15906259417533875, -0.017980827018618584, 0.3745945692062378, -0.1168576180934906, -0.10926306992769241, -0.03950892388820648, -0.14175476133823395, 0.16968177258968353, -0.01989765651524067, 0.11221715062856674, -0.009765521623194218, 0.14388824999332428, 0.05548359826207161, -0.023479344323277473, 0.08544106781482697, 0.004999885335564613, -0.03290518373250961, -0.10304180532693863, -0.05676887184381485, 0.007092386484146118, 0.02477436140179634, 0.018026655539870262, -0.041834570467472076, 0.02227151393890381, -0.11731979995965958, -0.04657655209302902, -0.08982590585947037, 0.04431166127324104, 0.03899754583835602, -0.07325074821710587, -0.002380647463724017, 
-0.07165111601352692, -0.012272949330508709, 0.022334342822432518, 0.20356793701648712, -0.08029330521821976, 0.16448934376239777, 0.09239562600851059, 0.12419285625219345, -0.14376309514045715, -0.00019283240544609725, -0.0762530043721199, -0.05611240118741989, 0.07737895101308823, -0.09433035552501678, 0.058893077075481415, 0.10901971161365509, -0.04567738622426987, 0.08828683942556381, 0.10377411544322968, 0.008936077356338501, 0.003213887568563223, 0.10916902124881744, -0.2667325437068939, -0.0296600554138422, -0.07532413303852081, 0.000883326749317348, 0.09092561900615692, 0.08562852442264557, 0.18840822577476501, 0.025361526757478714, -0.04293036088347435, -0.002770674182102084, 0.028597986325621605, -0.039021048694849014, 0.051667019724845886, 0.001123449532315135, 0.01947369985282421, -0.1530752182006836, 0.072522833943367, 0.01490565575659275, -0.15215420722961426, 0.021316176280379295, 0.16572684049606323, -0.11656328290700912, -0.1283872276544571, -0.06520111113786697, 0.08313824236392975, -0.11755692958831787, -0.01578943058848381, -0.03279297426342964, -0.13145680725574493, 0.07992171496152878, 0.12629036605358124, 0.05557859688997269, 0.0972496047616005, -0.06061713397502899, -0.020469192415475845, -0.018721895292401314, -0.014099318534135818, -0.012384648434817791, -0.007667020428925753, -0.055978111922740936, 0.0590752474963665, -0.026677248999476433, 0.1425808072090149, -0.09221141785383224, -0.1037059873342514, -0.16142144799232483, 0.0374140702188015, -0.11013076454401016, -0.08825794607400894, -0.08821134269237518, -0.050188567489385605, 0.002360827289521694, -0.019856395199894905, -0.04037635400891304, -0.05829505994915962, -0.12300454825162888, 0.0338277705013752, -0.040771447122097015, 0.024727050215005875, -0.07512269169092178, 0.015856385231018066, 0.08507686108350754, -0.03285100311040878, 0.15655414760112762, 0.1450488418340683, -0.1006515845656395, 0.10741901397705078, -0.14806775748729706, -0.09138492494821548, 0.11116421222686768, 0.015329592861235142, 0.0449691042304039, 0.09723787009716034, 0.013362943194806576, 0.0635865181684494, 0.032776717096567154, 0.05308786407113075, 0.027619892731308937, -0.11959987878799438, 0.06483134627342224, -0.03626115620136261, -0.14700546860694885, -0.049338050186634064, -0.05282869189977646, 0.01647452637553215, 0.013054544106125832, 0.09622690081596375, -0.05301849544048309, 0.10698331147432327, -0.04055701196193695, 0.0346808135509491, 0.017554637044668198, -0.1730053424835205, -0.03816922754049301, -0.08538098633289337, 0.03681723028421402, 0.014741539023816586, 0.25266793370246887, 0.030072299763560295, 0.012416383251547813, 0.032671261578798294, 0.08285367488861084, 0.03899408504366875, 0.010228337720036507, 0.17482228577136993, 0.1162426546216011, -0.06621865928173065, -0.10445023328065872, 0.0729617029428482, 0.016332454979419708, 0.01286179106682539, 0.13617953658103943, 0.008365051820874214, 0.005795429926365614, 0.08649782836437225, -0.016865963116288185, 0.009968153201043606, -0.10052056610584259, -0.13426925241947174, -0.022176474332809448, 0.05151832848787308, -0.04655967652797699, 0.11727844923734665, 0.1406494379043579, -0.01806013658642769, 0.03222079202532768, -0.021771740168333054, -0.05699979141354561, -0.1683429479598999, -0.1429590880870819, -0.06883849948644638, -0.13416796922683716, 0.00897989235818386, -0.11180389672517776, 0.05395037308335304, 0.06001098081469536, 0.06750501692295074, -0.06899319589138031, 0.10220931470394135, 0.04626858979463577, -0.11440542340278625, 0.06264589726924896, 
-0.0296088308095932, 0.09430401772260666, -0.02759445086121559, -0.019505485892295837, -0.09039592742919922, 0.014574515633285046, 0.011419114656746387, 0.06245238706469536, -0.04707273095846176, 0.007463190704584122, -0.14696238934993744, -0.08972041308879852, -0.0523175448179245, 0.0718572810292244, -0.050409089773893356, 0.14282815158367157, 0.00775480642914772, -0.0170906875282526, 0.039554283022880554, 0.22787313163280487, -0.07476283609867096, -0.04778539761900902, -0.05269690603017807, 0.20717895030975342, 0.02975541539490223, 0.1171872541308403, -0.022938819602131844, -0.006106364540755749, -0.0919521227478981, 0.3764844834804535, 0.30030161142349243, -0.09031439572572708, 0.011794124729931355, 0.02137952297925949, 0.04502861574292183, 0.1316293478012085, 0.1216534823179245, 0.10318691283464432, 0.3006802201271057, -0.07452366501092911, -0.04653361067175865, -0.012629742734134197, -0.023858042433857918, -0.09059546142816544, 0.1021224707365036, 0.04839762672781944, -0.06382183730602264, -0.03313443064689636, 0.0954432487487793, -0.25862133502960205, 0.1277991235256195, -0.12311873584985733, -0.17578600347042084, -0.06654827296733856, 0.009760108776390553, 0.10465722531080246, 0.015642458572983742, 0.0946015790104866, 0.007128213066607714, -0.11252258718013763, 0.06305865943431854, 0.03397420793771744, -0.22762253880500793, 0.0006893770187161863, 0.06642123311758041, -0.07006710022687912, -0.0024247700348496437, -0.026499588042497635, 0.05657242611050606, 0.0656052976846695, 0.054629553109407425, -0.00971333310008049, 0.03816632181406021, 0.0034184439573436975, -0.0585215799510479, 0.016623929142951965, 0.05121519789099693, 0.02472509816288948, -0.09763528406620026, 0.06927435845136642, -0.1574270874261856, 0.04766253009438515, -0.0030655991286039352, -0.04124255105853081, 0.006064958870410919, 0.008823691867291927, -0.06491616368293762, 0.05165379121899605, 0.07916834205389023, -0.0016257909592241049, -0.0062433634884655476, -0.057178743183612823, -0.02632102556526661, -0.027755750343203545, -0.09291748702526093, -0.10495562851428986, -0.14682936668395996, -0.11640441417694092, 0.09368976950645447, -0.01011267676949501, -0.1848134547472, 0.022154374048113823, -0.08606051653623581, 0.08319322764873505, -0.1670055389404297, 0.08040720224380493, 0.07041648775339127, 0.013038921169936657, -0.0031511052511632442, -0.02002427540719509, 0.054132770746946335, 0.086809903383255, -0.10407156497240067, -0.07400695979595184 ]
null
null
sentence-transformers
# {MODEL_NAME} This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search. <!--- Describe your model here --> ## Usage (Sentence-Transformers) Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed: ``` pip install -U sentence-transformers ``` Then you can use the model like this: ```python from sentence_transformers import SentenceTransformer sentences = ["This is an example sentence", "Each sentence is converted"] model = SentenceTransformer('{MODEL_NAME}') embeddings = model.encode(sentences) print(embeddings) ``` ## Usage (HuggingFace Transformers) Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings. ```python from transformers import AutoTokenizer, AutoModel import torch #Mean Pooling - Take attention mask into account for correct averaging def mean_pooling(model_output, attention_mask): token_embeddings = model_output[0] #First element of model_output contains all token embeddings input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float() return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9) # Sentences we want sentence embeddings for sentences = ['This is an example sentence', 'Each sentence is converted'] # Load model from HuggingFace Hub tokenizer = AutoTokenizer.from_pretrained('{MODEL_NAME}') model = AutoModel.from_pretrained('{MODEL_NAME}') # Tokenize sentences encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt') # Compute token embeddings with torch.no_grad(): model_output = model(**encoded_input) # Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask']) print("Sentence embeddings:") print(sentence_embeddings) ``` ## Evaluation Results <!--- Describe how your model was evaluated --> For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME}) ## Training The model was trained with the parameters: **DataLoader**: `torch.utils.data.dataloader.DataLoader` of length 365 with parameters: ``` {'batch_size': 32, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'} ``` **Loss**: `sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss` Parameters of the fit()-Method: ``` { "callback": null, "epochs": 4, "evaluation_steps": 1000, "evaluator": "sentence_transformers.evaluation.EmbeddingSimilarityEvaluator.EmbeddingSimilarityEvaluator", "max_grad_norm": 1, "optimizer_class": "<class 'transformers.optimization.AdamW'>", "optimizer_params": { "lr": 2e-05 }, "scheduler": "WarmupLinear", "steps_per_epoch": null, "warmup_steps": 146, "weight_decay": 0.01 } ``` ## Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: RobertaModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False}) ) ``` ## Citing & Authors <!--- Describe where people can find more information -->
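The card above only shows how to obtain embeddings; as a hedged illustration of the "semantic search" use case it mentions, the sketch below scores pairwise cosine similarity with this checkpoint. The model id is taken from this record; the Korean example sentences are made up for illustration only.

```python
from sentence_transformers import SentenceTransformer, util

# Sketch only: checkpoint id taken from this record; example sentences are illustrative.
model = SentenceTransformer("ParkMyungkyu/KLUE-STS-roberta-base")

sentences = [
    "오늘 날씨가 정말 좋다",        # "The weather is really nice today"
    "날씨가 맑고 화창하다",          # "The weather is clear and sunny"
    "주식 시장이 크게 하락했다",    # "The stock market dropped sharply"
]

# Encode to 768-dimensional vectors and compute pairwise cosine similarity
embeddings = model.encode(sentences, convert_to_tensor=True)
scores = util.cos_sim(embeddings, embeddings)
print(scores)
```

Sentence pairs about the weather should score noticeably higher than either paired with the stock-market sentence.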
{"tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "transformers"], "pipeline_tag": "sentence-similarity"}
sentence-similarity
ParkMyungkyu/KLUE-STS-roberta-base
[ "sentence-transformers", "pytorch", "roberta", "feature-extraction", "sentence-similarity", "transformers", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #sentence-transformers #pytorch #roberta #feature-extraction #sentence-similarity #transformers #endpoints_compatible #region-us
# {MODEL_NAME} This is a sentence-transformers model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search. ## Usage (Sentence-Transformers) Using this model becomes easy when you have sentence-transformers installed: Then you can use the model like this: ## Usage (HuggingFace Transformers) Without sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings. ## Evaluation Results For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL ## Training The model was trained with the parameters: DataLoader: 'URL.dataloader.DataLoader' of length 365 with parameters: Loss: 'sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss' Parameters of the fit()-Method: ## Full Model Architecture ## Citing & Authors
[ "# {MODEL_NAME}\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.", "## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:", "## Usage (HuggingFace Transformers)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.", "## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL", "## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 365 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss' \n\nParameters of the fit()-Method:", "## Full Model Architecture", "## Citing & Authors" ]
[ "TAGS\n#sentence-transformers #pytorch #roberta #feature-extraction #sentence-similarity #transformers #endpoints_compatible #region-us \n", "# {MODEL_NAME}\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.", "## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:", "## Usage (HuggingFace Transformers)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.", "## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL", "## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 365 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss' \n\nParameters of the fit()-Method:", "## Full Model Architecture", "## Citing & Authors" ]
[ 43, 50, 38, 64, 29, 77, 5, 6 ]
[ "passage: TAGS\n#sentence-transformers #pytorch #roberta #feature-extraction #sentence-similarity #transformers #endpoints_compatible #region-us \n# {MODEL_NAME}\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:## Usage (HuggingFace Transformers)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 365 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss' \n\nParameters of the fit()-Method:## Full Model Architecture## Citing & Authors" ]
[ -0.020478781312704086, 0.1332581639289856, -0.009033077396452427, 0.03876883536577225, 0.11538618057966232, 0.029350480064749718, 0.15922830998897552, 0.10473249107599258, -0.046078674495220184, 0.05874176323413849, 0.0014075746294111013, 0.12590856850147247, 0.00022787156922277063, -0.009897400625050068, 0.02939893864095211, -0.2804928123950958, 0.022669965401291847, -0.039452314376831055, -0.01728633977472782, 0.0713057890534401, 0.1078634113073349, -0.07377579808235168, 0.04697391390800476, -0.013474966399371624, -0.06590309739112854, 0.03134594485163689, -0.03285793215036392, -0.02690122462809086, 0.07048534601926804, 0.0563083216547966, 0.09191616624593735, 0.02545945905148983, 0.005811599548906088, -0.19841569662094116, 0.01648653857409954, 0.08683677017688751, -0.021835118532180786, 0.06068878248333931, 0.023426832631230354, -0.07184125483036041, 0.10284528136253357, -0.11546184867620468, 0.0695435032248497, 0.03775838762521744, -0.12417139112949371, -0.043150849640369415, -0.03443015366792679, 0.0012346504954621196, 0.07095298171043396, 0.10248973220586777, -0.04702864959836006, 0.1143416166305542, -0.051309384405612946, 0.09085498005151749, 0.1474190056324005, -0.2337334156036377, -0.04892343655228615, 0.019108546897768974, 0.022621627897024155, 0.03684863820672035, -0.11458907276391983, 0.015888044610619545, -0.0586642287671566, 0.03457804024219513, 0.06956934928894043, -0.03513389453291893, 0.007624363526701927, 0.002157697221264243, -0.09201264381408691, 0.003159075742587447, 0.13102903962135315, 0.000023230817532748915, -0.014356952160596848, -0.17403078079223633, -0.09588120877742767, 0.09537447988986969, -0.03536821901798248, -0.019057614728808403, 0.05221976339817047, 0.058802150189876556, -0.04579190909862518, -0.1264684945344925, -0.08393727242946625, -0.02313288487493992, -0.05766766145825386, 0.03846338018774986, 0.007177338004112244, -0.044689200818538666, -0.008497050032019615, 0.07147780805826187, 0.015598068945109844, -0.09850640594959259, -0.018781375139951706, -0.0373401902616024, -0.10472891479730606, 0.0020612147636711597, -0.0598430410027504, -0.11343514919281006, 0.03179517760872841, 0.1474652886390686, 0.06948675960302353, 0.01927058771252632, -0.0363088883459568, 0.06383952498435974, -0.009171508252620697, 0.1760093867778778, -0.03494410216808319, -0.04536222666501999, -0.012734892778098583, 0.0105118528008461, 0.009131145663559437, -0.01950730010867119, -0.04473813995718956, -0.030380796641111374, 0.0024710437282919884, 0.06401485949754715, 0.05714407190680504, 0.06626170873641968, -0.025729140266776085, -0.06636612117290497, 0.03129522129893303, -0.12157420068979263, 0.016816329210996628, 0.02363341860473156, -0.024824926629662514, 0.062150824815034866, 0.1181279867887497, -0.03623645752668381, -0.09081392735242844, -0.004145741928368807, -0.07012272626161575, -0.004534943029284477, -0.0634787529706955, -0.1278257668018341, -0.026930537074804306, -0.04008551314473152, -0.055235251784324646, -0.09564457833766937, -0.14182423055171967, -0.06213666871190071, 0.05138105899095535, -0.04064963757991791, -0.005654680076986551, -0.1274740993976593, -0.00044699854333885014, -0.018474267795681953, 0.012439657002687454, -0.05065855756402016, 0.011554169468581676, 0.0181584432721138, -0.0667552649974823, 0.06208335608243942, 0.04742613062262535, 0.05173679441213608, -0.10885899513959885, 0.0051263971254229546, -0.14586104452610016, 0.18386609852313995, -0.043579209595918655, 0.05218039080500603, -0.09182221442461014, 0.03715621680021286, 0.012959945015609264, 
0.0562250055372715, 0.0074004135094583035, 0.11753620207309723, -0.17933505773544312, -0.08751628547906876, 0.18221144378185272, -0.0556858591735363, -0.06979299336671829, 0.08755716681480408, -0.050577312707901, 0.12401536852121353, 0.12825345993041992, 0.1287030726671219, 0.12266883999109268, -0.057049207389354706, 0.01602608896791935, 0.03169020265340805, -0.048680733889341354, 0.11869707703590393, 0.021605364978313446, -0.061515096575021744, 0.1073688492178917, 0.011824356392025948, -0.050856590270996094, 0.011974192224442959, 0.004798033740371466, -0.051747143268585205, 0.008221483789384365, -0.044461917132139206, 0.04815997928380966, -0.02784905768930912, 0.02887577749788761, 0.019721858203411102, -0.10740634053945541, 0.12364847213029861, 0.05889907479286194, -0.09907062351703644, 0.04395561292767525, -0.057076744735240936, -0.013832787983119488, -0.012541607953608036, -0.0017126798629760742, -0.20056265592575073, -0.1333157867193222, 0.016670554876327515, 0.01726514846086502, 0.12288502603769302, 0.006802774965763092, 0.0561835914850235, 0.04222575202584267, -0.03721512854099274, -0.0018423143774271011, 0.02446982078254223, 0.0047590117901563644, -0.0696144551038742, -0.10605236142873764, -0.014978993684053421, -0.03707530349493027, 0.05847671627998352, -0.06321200728416443, 0.03376491740345955, -0.017817720770835876, 0.10084281116724014, 0.05383040010929108, -0.01939167082309723, 0.0039953007362782955, -0.019288774579763412, 0.014958816580474377, -0.04662095382809639, 0.06742314249277115, 0.03610021993517876, -0.12604692578315735, 0.09577792137861252, -0.15315364301204681, -0.14902235567569733, 0.07373161613941193, -0.04795509949326515, -0.05172036960721016, -0.04541695490479469, -0.024367311969399452, 0.010639618150889874, -0.05061632767319679, -0.04660636559128761, 0.2206914722919464, 0.08130144327878952, 0.11368385702371597, -0.0400259830057621, -0.02895944006741047, -0.06427465379238129, -0.030116930603981018, -0.02061132714152336, 0.09675069898366928, -0.03343767672777176, -0.11798378825187683, 0.07228802889585495, 0.05016587674617767, -0.07079552114009857, 0.13813894987106323, -0.020758429542183876, -0.05792894586920738, -0.04673265293240547, 0.003125326707959175, 0.028425853699445724, -0.014642775990068913, -0.06913711130619049, 0.0024804656859487295, 0.04794064536690712, 0.007039424031972885, 0.04002952575683594, -0.06654234975576401, 0.047094423323869705, 0.0480414517223835, -0.012916929088532925, 0.09713315218687057, 0.009754177182912827, 0.02564648538827896, 0.04699334502220154, 0.01095851045101881, 0.0477854460477829, -0.026339130476117134, -0.050326816737651825, -0.08614610880613327, 0.1409669667482376, -0.11768881231546402, -0.20585589110851288, -0.13260456919670105, 0.020453795790672302, -0.06263436377048492, 0.01888352818787098, 0.07990624755620956, -0.03981911763548851, -0.04688234627246857, -0.055453669279813766, 0.07594620436429977, 0.08188720047473907, -0.05699928477406502, -0.02024749107658863, 0.061506450176239014, 0.006290414370596409, -0.1277495175600052, -0.008335091173648834, -0.01260485127568245, -0.09045394510030746, -0.0017110413173213601, -0.05695541575551033, 0.04171296954154968, 0.08093537390232086, 0.05902770161628723, 0.008085059002041817, -0.014746014028787613, 0.1993635594844818, -0.0771087035536766, 0.06135517731308937, 0.12729693949222565, 0.001513418392278254, 0.07366560399532318, 0.08608441054821014, 0.019019152969121933, -0.05831519514322281, 0.03946715593338013, 0.08172053098678589, -0.01684449054300785, -0.16160881519317627, 
-0.09051049500703812, -0.0722440853714943, -0.03232411667704582, 0.11078350245952606, 0.03763202950358391, 0.010873221792280674, 0.04396972805261612, -0.02807798981666565, -0.0061540016904473305, 0.09869091957807541, 0.10450614243745804, 0.10867570340633392, -0.013160793110728264, 0.10573361068964005, -0.059372954070568085, -0.07110054045915604, 0.04047897830605507, -0.0006208699196577072, 0.17041516304016113, 0.006915077567100525, 0.1698000133037567, 0.07260008901357651, -0.015780532732605934, 0.0021715196780860424, 0.0798569917678833, -0.032433249056339264, 0.046525370329618454, -0.03179767355322838, -0.0952230766415596, -0.022043239325284958, 0.05042300000786781, 0.10325231403112411, -0.04584524780511856, -0.0256398543715477, 0.02995462343096733, 0.1459391564130783, 0.17841683328151703, 0.03251482918858528, -0.1919914335012436, -0.03081572614610195, 0.025066379457712173, -0.05454397574067116, -0.058200154453516006, -0.006327237468212843, 0.0227795597165823, -0.1133715957403183, 0.03930693492293358, -0.02248029038310051, 0.1092352420091629, -0.08771330863237381, 0.022639242932200432, -0.047041233628988266, 0.05180714279413223, 0.005973093211650848, 0.06623411178588867, -0.2159910947084427, 0.09503664076328278, 0.03068888746201992, 0.06760775297880173, -0.04044515639543533, 0.01885347068309784, 0.08220724761486053, 0.014563154429197311, 0.16812960803508759, -0.022324353456497192, -0.009579785168170929, 0.03810172528028488, -0.07411423325538635, 0.012856259942054749, 0.0646614283323288, -0.13330382108688354, 0.0881393551826477, -0.0554739311337471, -0.04185866937041283, -0.004652521573007107, 0.05566064640879631, -0.052823666483163834, -0.1876125931739807, -0.0023379921913146973, -0.00695264944806695, 0.0020368569530546665, -0.021679101511836052, -0.004013079218566418, 0.022766519337892532, 0.19922807812690735, -0.08573581278324127, -0.07179097086191177, -0.12593795359134674, -0.030482949689030647, 0.0976528525352478, -0.08496373891830444, 0.005574181210249662, -0.017147595062851906, 0.14037644863128662, -0.060233570635318756, -0.10232770442962646, 0.056385233998298645, -0.0273954588919878, -0.06847821921110153, -0.028641365468502045, 0.10366787761449814, 0.04244622215628624, 0.03726331889629364, 0.03605709597468376, 0.08144621551036835, -0.031016675755381584, -0.07963696122169495, -0.0566091425716877, 0.14493060111999512, -0.008668703958392143, 0.0858251303434372, -0.14433158934116364, -0.025790167972445488, -0.10705249011516571, 0.06155851110816002, 0.21393440663814545, 0.18621355295181274, -0.06976532936096191, 0.10813506692647934, 0.162381112575531, -0.11933091282844543, -0.23174937069416046, -0.07193318009376526, 0.013243448920547962, 0.03931714594364166, 0.009594014845788479, -0.16643692553043365, 0.06991111487150192, 0.02686365135014057, 0.008931834250688553, -0.1033678725361824, -0.22368505597114563, -0.13519254326820374, 0.1235455721616745, 0.0041892267763614655, -0.03986844792962074, -0.08610332012176514, -0.05366000905632973, -0.06849469244480133, -0.012144947424530983, 0.13194085657596588, -0.10291673988103867, 0.13226772844791412, 0.04223678261041641, -0.0035353514831513166, 0.050216495990753174, -0.004836047533899546, 0.10002189874649048, 0.044245705008506775, 0.045754481106996536, -0.02225634828209877, -0.05220654234290123, 0.11597548425197601, -0.08164474368095398, 0.10580968111753464, -0.02665862627327442, 0.047422848641872406, -0.07717437297105789, -0.03773478418588638, -0.0609714537858963, 0.02731272578239441, -0.042699143290519714, -0.05234608054161072, 
-0.012330534867942333, 0.04546232894062996, 0.12200729548931122, 0.00031917315209284425, 0.05029723793268204, -0.08331028372049332, 0.040442630648612976, 0.1588166356086731, 0.06536629796028137, 0.05393205210566521, -0.17169828712940216, 0.005087663885205984, 0.0007285379688255489, 0.05855457857251167, -0.08930336683988571, 0.07245447486639023, 0.07448884844779968, -0.0002456447691656649, 0.14527618885040283, 0.03585630655288696, -0.07975197583436966, -0.020658353343605995, 0.011684441938996315, -0.10575832426548004, -0.10153376311063766, -0.04567725211381912, -0.052402857691049576, -0.0855519101023674, -0.06037341430783272, 0.15384067595005035, -0.005537364631891251, -0.009677475318312645, 0.03659101948142052, 0.017967088147997856, -0.03018234297633171, 0.07016046345233917, 0.020353810861706734, 0.02465716563165188, -0.04705049842596054, 0.13471396267414093, 0.0769648551940918, -0.09108158946037292, 0.0220122542232275, 0.15042060613632202, -0.08266754448413849, -0.07833147048950195, -0.017199866473674774, 0.16681621968746185, -0.057614509016275406, 0.04360806569457054, -0.07268200069665909, -0.05852990224957466, 0.01242799498140812, 0.07517202198505402, 0.047536153346300125, 0.061909161508083344, -0.10218363255262375, 0.007500302046537399, -0.08117079734802246, 0.08152472227811813, 0.05995326489210129, 0.013903431594371796, -0.012864489108324051, 0.10185292363166809, -0.0064013353548944, -0.012384100817143917, -0.03271714225411415, -0.05508774146437645, -0.08695389330387115, 0.010980221442878246, -0.038153134286403656, 0.01399095356464386, -0.0639636367559433, -0.013268162496387959, 0.025467030704021454, 0.03877896070480347, 0.008259117603302002, -0.007599618285894394, -0.047242533415555954, -0.07491505891084671, -0.048850446939468384, 0.07020395249128342, -0.14953269064426422, -0.02309577353298664, 0.029272103682160378, -0.10714802145957947, 0.0769810751080513, 0.019141070544719696, -0.046797074377536774, 0.02338903397321701, -0.07104414701461792, -0.06385353952646255, 0.0005379089852795005, 0.027393320575356483, 0.0493384525179863, -0.08949225395917892, 0.01476831454783678, -0.052554503083229065, 0.019694436341524124, 0.006350871175527573, 0.07900464534759521, -0.0997181385755539, 0.044802404940128326, 0.0035424381494522095, -0.024612383916974068, -0.08919880539178848, 0.020225435495376587, 0.039430465549230576, 0.05156302452087402, 0.1182817593216896, -0.06661253422498703, 0.09463409334421158, -0.12196235358715057, -0.0009127148077823222, 0.019014962017536163, -0.06300847232341766, 0.09272180497646332, -0.12042878568172455, 0.06072935834527016, -0.05003400892019272, 0.07912809401750565, -0.015063975937664509, 0.043863825500011444, 0.058141518384218216, 0.017242031171917915, -0.0492943599820137, 0.04824364557862282, 0.05744302645325661, 0.03748943656682968, -0.0014668891672044992, -0.030929258093237877, 0.00494908494874835, 0.008577524684369564, -0.00420439662411809, 0.06315889954566956, 0.12538884580135345, 0.07376138120889664, 0.07997116446495056, 0.08597409725189209, 0.029778175055980682, -0.08864564448595047, 0.02546198107302189, 0.02033579908311367, 0.04396018013358116, -0.06245939061045647, 0.012259162031114101, 0.11235897988080978, -0.14497122168540955, 0.1400354951620102, 0.020721454173326492, -0.06524363160133362, -0.09078217297792435, -0.10524091124534607, -0.06305831670761108, -0.014250428415834904, -0.01702958717942238, -0.11590875685214996, -0.005894393194466829, 0.021560773253440857, 0.008056131191551685, 0.009187823161482811, 0.134817436337471, -0.07609553635120392, 
-0.08679983764886856, 0.09264756739139557, -0.01591316983103752, 0.04985646903514862, -0.004353643395006657, 0.025159353390336037, 0.01751955784857273, 0.10136385262012482, 0.02601163648068905, 0.0601855106651783, 0.05048935487866402, 0.01075337640941143, -0.08842365443706512, -0.08197487145662308, 0.00751308910548687, -0.007583632133901119, -0.05228067934513092, 0.08133375644683838, 0.032994162291288376, -0.07944237440824509, -0.007510073948651552, 0.24213778972625732, -0.10072991251945496, -0.12540999054908752, -0.17306184768676758, 0.1706574708223343, 0.03377421200275421, 0.03305695205926895, -0.008250831626355648, -0.07475488632917404, -0.025032922625541687, 0.16455017030239105, 0.21067577600479126, -0.07644816488027573, 0.028003482148051262, 0.0738770142197609, 0.019106676802039146, 0.028209680691361427, 0.017825737595558167, 0.04857596382498741, 0.16820673644542694, -0.056025657802820206, 0.08436660468578339, -0.0052733286283910275, -0.07928761839866638, -0.07251915335655212, 0.11602943390607834, 0.00750237051397562, 0.026728784665465355, -0.031675826758146286, 0.08878073841333389, -0.07594676315784454, -0.13635165989398956, 0.00012202366633573547, -0.11408255249261856, -0.11321862787008286, -0.04476335272192955, 0.021316107362508774, 0.015516444109380245, 0.08183682709932327, 0.0340186208486557, -0.03628718852996826, 0.1506616175174713, -0.0013915254967287183, -0.05848962813615799, -0.018731774762272835, 0.0332564152777195, -0.0751829743385315, 0.13493072986602783, -0.0017402453813701868, -0.0386592335999012, 0.10125748068094254, 0.01856750063598156, -0.04644755646586418, 0.06416552513837814, 0.0359429232776165, -0.07058636099100113, 0.09250479936599731, 0.08348456770181656, -0.03973685950040817, 0.08157806098461151, 0.060764580965042114, -0.1811094880104065, 0.07094341516494751, -0.003812205046415329, -0.06238507479429245, -0.05093371495604515, 0.037026166915893555, -0.08533453941345215, 0.08667867630720139, 0.1761733442544937, -0.003751832526177168, 0.00044657650869339705, -0.012239829637110233, -0.009122501127421856, 0.036974817514419556, 0.019799022004008293, -0.06452348083257675, -0.0686066523194313, -0.015338003635406494, 0.038782261312007904, 0.03476168215274811, -0.27239447832107544, -0.11866065114736557, 0.03686772659420967, -0.02227024734020233, -0.03173723816871643, 0.11948579549789429, 0.059632640331983566, 0.020484954118728638, -0.02511230669915676, -0.18121866881847382, 0.025990281254053116, 0.09716682881116867, -0.1349218636751175, -0.07658109068870544 ]
null
null
transformers
A fine-tuned model based on 'gumgo91/IUPAC_BERT' for blood-brain barrier permeability prediction from IUPAC strings. BiLSTM models, as well as these two models, are also available at 'https://github.com/mephisto121/BBBNLP' if you want to try them all and review the code. [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1jGYf3sq93yO4EbgVaEl3nlClrVatVaXS#scrollTo=AMEdQItmilAw)
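As a hedged illustration, the sketch below calls this checkpoint through the standard transformers text-classification pipeline. The model id and task come from this record; the example IUPAC name and the meaning of the returned labels are assumptions, so check the model's config for the actual label mapping.

```python
from transformers import pipeline

# Sketch only: model id and task come from this record; label semantics
# (e.g. permeable vs. non-permeable) depend on the fine-tuning setup.
bbb_clf = pipeline(
    "text-classification",
    model="Parsa/BBB_prediction_classification_IUPAC",
)

# Example IUPAC name (caffeine); any IUPAC string can be scored the same way.
print(bbb_clf("1,3,7-trimethylpurine-2,6-dione"))
```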
{}
text-classification
Parsa/BBB_prediction_classification_IUPAC
[ "transformers", "pytorch", "bert", "text-classification", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #bert #text-classification #autotrain_compatible #endpoints_compatible #region-us
A fine-tuned model based on 'gumgo91/IUPAC_BERT' for blood-brain barrier permeability prediction from IUPAC strings. BiLSTM models, as well as these two models, are also available at 'URL if you want to try them all and review the code. ![Open In Colab](URL
[]
[ "TAGS\n#transformers #pytorch #bert #text-classification #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 36 ]
[ "passage: TAGS\n#transformers #pytorch #bert #text-classification #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.026536712422966957, 0.04976736754179001, -0.007731540594249964, 0.02341027930378914, 0.20494870841503143, 0.04218224436044693, 0.07166644185781479, 0.1081078052520752, 0.06540437042713165, -0.032089509069919586, 0.10898502916097641, 0.22890737652778625, -0.03745893016457558, 0.11566723883152008, -0.11783017218112946, -0.2979154586791992, 0.06830790638923645, 0.06908033788204193, -0.00024103521718643606, 0.11134638637304306, 0.08377818018198013, -0.08843351900577545, 0.07041390985250473, -0.039391499012708664, -0.11928878724575043, 0.04120895266532898, 0.04201217740774155, -0.128968745470047, 0.10206019878387451, 0.04667878895998001, 0.1612440049648285, 0.020005566999316216, -0.06739236414432526, -0.154616579413414, 0.0336696095764637, -0.0003978070744778961, -0.07949702441692352, 0.041154924780130386, 0.0834934264421463, -0.11765240132808685, 0.009522730484604836, 0.03841863572597504, 0.022789550945162773, 0.05028093606233597, -0.14333219826221466, -0.07251771539449692, -0.002419156488031149, 0.029012316837906837, 0.05913800001144409, 0.0598759762942791, -0.002957597142085433, 0.1424214392900467, -0.13965049386024475, 0.12620261311531067, 0.09429286420345306, -0.2923010587692261, -0.0088807987049222, 0.08759160339832306, 0.027602141723036766, 0.05489844083786011, -0.0477239191532135, 0.050349123775959015, 0.02152593620121479, 0.006244409829378128, 0.0014351755380630493, -0.07273159921169281, -0.10605065524578094, 0.035220496356487274, -0.08316893875598907, -0.049130894243717194, 0.19201035797595978, -0.05444741249084473, 0.07102859020233154, -0.02369237318634987, -0.09560474008321762, -0.05289671570062637, -0.025107022374868393, 0.018850168213248253, -0.04439172148704529, 0.06905210763216019, 0.029569845646619797, 0.010320127010345459, -0.10565698146820068, 0.03122977539896965, -0.21760006248950958, 0.20976117253303528, 0.014575785025954247, 0.05031627416610718, -0.18153272569179535, 0.05320606753230095, 0.012799138203263283, -0.09720049053430557, 0.05345968157052994, -0.10138970613479614, 0.025928562507033348, -0.03812453895807266, -0.07192374020814896, -0.02739580348134041, 0.0816444382071495, 0.13516521453857422, 0.041108258068561554, 0.059053935110569, -0.041404832154512405, 0.08655412495136261, 0.03162030130624771, 0.13472779095172882, 0.033738039433956146, -0.03977132961153984, 0.03946247324347496, -0.1231473833322525, -0.00647841626778245, -0.07187561690807343, -0.15673033893108368, -0.028635773807764053, 0.0752328485250473, 0.08006087690591812, 0.004822878632694483, 0.09519599378108978, -0.060662589967250824, -0.03062121570110321, 0.08224024623632431, -0.07733647525310516, 0.02224470116198063, 0.026406224817037582, 0.022000659257173538, 0.101798415184021, -0.019659440964460373, -0.0015375112416222692, -0.08451379835605621, 0.14397169649600983, -0.05375861003994942, 0.007298978511244059, -0.033063821494579315, -0.08073414862155914, 0.030349934473633766, -0.1405506432056427, 0.02792905643582344, -0.17062804102897644, -0.09528942406177521, 0.012332682497799397, 0.019128363579511642, 0.001749284565448761, -0.023433325812220573, -0.03093680739402771, 0.006545908749103546, 0.0424383319914341, -0.06069215387105942, -0.06501442193984985, -0.07589498907327652, 0.09581612050533295, -0.030969982966780663, 0.07632152736186981, -0.12148407846689224, 0.0795503556728363, -0.09479566663503647, -0.02546360157430172, -0.12941965460777283, 0.0373355858027935, -0.04622753709554672, 0.1718803346157074, 0.013590065762400627, -0.04714406654238701, -0.05325024202466011, 0.06296809762716293, 
-0.0670921728014946, 0.1699289232492447, -0.06951840966939926, -0.11890474706888199, 0.21500444412231445, -0.08380883932113647, -0.13756290078163147, 0.08620618283748627, -0.014584669843316078, 0.0003590308187995106, 0.10495591163635254, 0.2001815140247345, 0.09155366569757462, 0.006830575410276651, 0.08820617198944092, 0.12077005207538605, -0.08545669168233871, -0.11240582168102264, -0.005832689814269543, -0.005655956454575062, -0.14152070879936218, 0.057374704629182816, 0.07428599894046783, 0.07163040339946747, -0.05171000212430954, -0.03735769912600517, -0.00994520727545023, -0.007230670657008886, 0.13671016693115234, 0.054894041270017624, 0.11879090964794159, -0.08347898721694946, 0.0004564806877169758, 0.008610726334154606, -0.01850513368844986, 0.016449708491563797, 0.02691977098584175, -0.06303919106721878, 0.11149293184280396, 0.01560661755502224, 0.030040156096220016, -0.22954900562763214, -0.07738739997148514, -0.004857209976762533, 0.1317768543958664, -0.016139987856149673, 0.11725185811519623, 0.05058206245303154, -0.058512113988399506, -0.013537165708839893, -0.022089142352342606, 0.18155772984027863, 0.021907884627580643, -0.06484844535589218, -0.06583412736654282, 0.06355377286672592, -0.07136932015419006, -0.0009439446148462594, -0.07689674198627472, 0.014060765504837036, 0.07748502492904663, 0.11273034662008286, 0.01009473018348217, 0.0715249702334404, -0.026004934683442116, 0.06679531186819077, -0.06401108205318451, 0.026931757107377052, 0.118758425116539, -0.012655431404709816, -0.0722615197300911, 0.15806308388710022, -0.14796574413776398, 0.29817652702331543, 0.2068978101015091, -0.3074952960014343, 0.003431171178817749, -0.04410141706466675, -0.0044579943642020226, 0.02857346273958683, 0.03858071565628052, 0.0010076105827465653, 0.0966758206486702, 0.001866061589680612, 0.20398299396038055, -0.02794223465025425, -0.041622765362262726, -0.013331537134945393, -0.048477016389369965, -0.030720924958586693, 0.0930258184671402, 0.06317867338657379, -0.2123168408870697, 0.1965867280960083, 0.2256951779127121, 0.020235497504472733, 0.16021281480789185, -0.007587776519358158, 0.039776433259248734, 0.09032315015792847, -0.04407169669866562, -0.028124243021011353, -0.07736045867204666, -0.19754260778427124, -0.048038698732852936, 0.07934695482254028, 0.03309439867734909, 0.06981316953897476, -0.11398086696863174, -0.028780505061149597, 0.005873502232134342, 0.02043827436864376, -0.030085694044828415, 0.07733964174985886, 0.08143315464258194, 0.11509016901254654, 0.004185348749160767, -0.07153625786304474, 0.11260948330163956, -0.000511682010255754, -0.0829218178987503, 0.18312713503837585, -0.15064290165901184, -0.35457029938697815, -0.15102937817573547, -0.20665821433067322, -0.024193694815039635, 0.05783972144126892, 0.10403360426425934, -0.11499883979558945, -0.043235164135694504, 0.04162687435746193, -0.002294909907504916, -0.062236279249191284, 0.043300777673721313, -0.06919633597135544, 0.06772983074188232, -0.056030891835689545, -0.06578446924686432, -0.07393626123666763, -0.03732382878661156, -0.014316183514893055, 0.15351881086826324, -0.13038954138755798, 0.07227112352848053, 0.17736823856830597, -0.010089537128806114, 0.06711666285991669, -0.038066890090703964, 0.17289850115776062, -0.08991120010614395, -0.028915125876665115, 0.17436328530311584, -0.08250410854816437, 0.0786806121468544, 0.1608263999223709, 0.022829625755548477, -0.06656418740749359, 0.029574787244200706, -0.039340466260910034, -0.08898156136274338, -0.21838922798633575, -0.148067444562912, 
-0.11992256343364716, 0.05924250930547714, 0.062341220676898956, 0.0696515217423439, 0.12266137450933456, 0.059203725308179855, 0.015477290377020836, -0.0007553264731541276, -0.00036606789217330515, 0.07583136856555939, 0.2552264630794525, -0.0020075372885912657, 0.14774726331233978, -0.05434580519795418, -0.1357378363609314, 0.08277737349271774, 0.01842481829226017, 0.11255297809839249, 0.09960044920444489, 0.014388641342520714, 0.00644803699105978, 0.061373550444841385, 0.16953399777412415, 0.12216762453317642, 0.03140265494585037, -0.015582526102662086, -0.02194071002304554, 0.0020790479611605406, -0.07317613065242767, 0.01304252166301012, 0.07926664501428604, -0.15208743512630463, -0.08082117140293121, -0.15637393295764923, 0.09561088681221008, 0.07428078353404999, 0.049813296645879745, -0.2040114849805832, 0.009754637256264687, 0.09391142427921295, -0.030124526470899582, -0.09943006932735443, 0.07721024006605148, -0.04655788093805313, -0.14170965552330017, 0.10022298991680145, -0.03493443876504898, 0.13963255286216736, -0.08681212365627289, 0.0932496190071106, -0.03828613832592964, -0.11996456235647202, 0.032416898757219315, 0.11279566586017609, -0.2732636332511902, 0.2332872599363327, 0.010946370661258698, -0.07370878010988235, -0.07934506237506866, -0.026762953028082848, 0.041629474610090256, 0.2184235155582428, 0.059754300862550735, 0.002968377433717251, -0.06075876206159592, -0.18870088458061218, -0.006565387360751629, 0.009197798557579517, 0.1304895579814911, -0.03775089234113693, -0.01569267176091671, -0.0419037826359272, -0.03293319419026375, -0.029366329312324524, -0.038726381957530975, 0.035019759088754654, -0.17148956656455994, 0.05544491484761238, 0.0377817265689373, 0.07244952023029327, 0.019013661891222, -0.04418788477778435, -0.12382309138774872, 0.197775200009346, -0.07774023711681366, -0.07762428373098373, -0.11156740039587021, -0.0784677043557167, 0.02202191948890686, -0.08583451807498932, 0.06005251407623291, -0.08652466535568237, 0.016779478639364243, -0.06327100098133087, -0.20542916655540466, 0.13557077944278717, -0.09933868050575256, -0.02806651033461094, -0.06584896147251129, 0.15001823008060455, -0.0767940878868103, 0.01665390096604824, 0.032243192195892334, 0.01813752017915249, -0.08950473368167877, -0.07632966339588165, -0.0018591269617900252, 0.016023563221096992, 0.051190294325351715, 0.06099539250135422, -0.10083866119384766, -0.06429562717676163, -0.03685735538601875, 0.014633421786129475, 0.2983008325099945, 0.15201643109321594, -0.064842589199543, 0.15137214958667755, 0.1384977251291275, -0.07145120203495026, -0.3431456387042999, -0.08484037965536118, -0.10642081499099731, -0.04306303709745407, -0.046289920806884766, -0.16240264475345612, 0.1177973523736, -0.01295482087880373, -0.017609771341085434, 0.08445491641759872, -0.15145616233348846, -0.08574660867452621, 0.20116515457630157, -0.026354258880019188, 0.39032095670700073, -0.10577060282230377, -0.09967513382434845, -0.058081019669771194, -0.121647909283638, 0.1404253989458084, 0.01047214213758707, 0.08493805676698685, -0.009310578927397728, 0.06269232928752899, 0.044362664222717285, -0.039699576795101166, 0.0946895033121109, 0.011379079893231392, 0.01612018421292305, -0.1131248027086258, -0.11301706731319427, 0.007961280643939972, -0.020395388826727867, -0.015760453417897224, -0.00797135941684246, 0.010495016351342201, -0.1657717525959015, -0.04052681475877762, -0.07803480327129364, 0.05654771998524666, 0.04092846438288689, -0.03739452734589577, 0.006628011353313923, -0.021788200363516808, 
-0.004880301654338837, 0.005630741827189922, 0.2613779604434967, -0.0545232780277729, 0.17406699061393738, 0.09997852146625519, 0.13454613089561462, -0.1609681397676468, 0.02098797634243965, -0.07239851355552673, -0.06214495375752449, 0.07092934846878052, -0.07368351519107819, 0.07252470403909683, 0.13665559887886047, -0.06541655212640762, 0.06771941483020782, 0.11662118136882782, 0.059408992528915405, -0.036289941519498825, 0.15761438012123108, -0.22834214568138123, 0.03067406453192234, -0.05344350263476372, -0.016603386029601097, 0.06837030500173569, 0.06279473006725311, 0.13393236696720123, 0.05459635332226753, -0.043871067464351654, 0.002848769072443247, -0.009000388905405998, -0.0024182628840208054, 0.06304551661014557, 0.05966060236096382, 0.04511803761124611, -0.1356714963912964, 0.04809432104229927, 0.04747392609715462, -0.180133655667305, -0.015834158286452293, 0.13596458733081818, -0.16409757733345032, -0.1254369467496872, -0.014882639050483704, 0.14192621409893036, -0.10471435636281967, -0.05383450537919998, -0.06129157543182373, -0.13381800055503845, 0.07413940876722336, 0.2071639746427536, 0.1222052350640297, 0.0868266150355339, -0.05386051535606384, -0.04268129542469978, 0.013526189140975475, -0.0048812017776072025, 0.000410786597058177, 0.025256112217903137, -0.10814032703638077, 0.030668657273054123, -0.016766557469964027, 0.1549694985151291, -0.0959596261382103, -0.07802556455135345, -0.1802983283996582, 0.04513133689761162, -0.09466184675693512, -0.0370473749935627, -0.0709371566772461, -0.02417352795600891, 0.006666520144790411, -0.05355566740036011, -0.037236955016851425, -0.06802111119031906, -0.12928707897663116, 0.041479259729385376, -0.020978856831789017, 0.04500049725174904, -0.06915231049060822, -0.044879116117954254, 0.10159214586019516, -0.03420831635594368, 0.09736357629299164, 0.10930681228637695, -0.09245149046182632, 0.10182034969329834, -0.1422983705997467, -0.12590330839157104, 0.12893438339233398, 0.02711927518248558, 0.07729022204875946, 0.07507030665874481, 0.03763207420706749, 0.06823369115591049, 0.010589729994535446, 0.06981854140758514, 0.07503756135702133, -0.12291653454303741, 0.05968776345252991, -0.029884058982133865, -0.1753014773130417, -0.0433085560798645, -0.04439191892743111, 0.0955316424369812, 0.0027241117786616087, 0.1497408151626587, -0.05511422082781792, 0.10076558589935303, -0.034150879830121994, 0.007692268583923578, -0.018073493614792824, -0.21772713959217072, -0.0606854185461998, -0.0871635228395462, 0.02593981847167015, 0.0011935612419620156, 0.25321871042251587, 0.060501862317323685, 0.047848355025053024, 0.05219770595431328, 0.0803675726056099, -0.005114862695336342, 0.024529730901122093, 0.17962703108787537, 0.10480687767267227, -0.056110929697752, -0.06051088124513626, 0.06334816664457321, 0.020930401980876923, 0.0037487365771085024, 0.1370624452829361, 0.07126225531101227, -0.025097444653511047, 0.07946188747882843, -0.022343091666698456, 0.04902622103691101, -0.13099198043346405, -0.18797338008880615, -0.0362858846783638, 0.0822635069489479, 0.008193853311240673, 0.06443964689970016, 0.08426562696695328, -0.029682213440537453, 0.05306922271847725, -0.05066857486963272, -0.05351173132658005, -0.18930195271968842, -0.08380404114723206, -0.09822331368923187, -0.10457178205251694, 0.005494547076523304, -0.07776138931512833, -0.007010516710579395, 0.0889914408326149, 0.04661707207560539, -0.04920143634080887, 0.07530294358730316, 0.005946568213403225, -0.05706603452563286, 0.08126048743724823, -0.03929980844259262, 
0.035573069006204605, -0.004914599005132914, -0.03371923416852951, -0.14008314907550812, -0.01516553945839405, -0.04801462963223457, 0.04064973443746567, -0.06195714324712753, 0.005159671418368816, -0.14515186846256256, -0.12241504341363907, -0.02803012728691101, 0.052396051585674286, -0.057535864412784576, 0.1368633210659027, 0.0011574793606996536, 0.005521169863641262, 0.04932842403650284, 0.20023545622825623, -0.06511733680963516, -0.05715160444378853, -0.03652770072221756, 0.25829756259918213, 0.07457830011844635, 0.1169453114271164, -0.00899919681251049, -0.005612371955066919, -0.09091201424598694, 0.33026692271232605, 0.2956998944282532, -0.053279418498277664, 0.04956240579485893, 0.021509597077965736, 0.03881188482046127, 0.15772217512130737, 0.14865562319755554, 0.08992809802293777, 0.23702938854694366, -0.06422201544046402, -0.03442860767245293, -0.01987340860068798, -0.020380141213536263, -0.12080796808004379, 0.0805739313364029, 0.0668584331870079, -0.04897911474108696, -0.07300020754337311, 0.10295598208904266, -0.20242395997047424, 0.1379757672548294, -0.0004371747490949929, -0.21978750824928284, -0.07235769927501678, -0.033254899084568024, 0.14364826679229736, -0.008563272655010223, 0.08543892204761505, -0.0008842989918775856, -0.11287369579076767, 0.02438463270664215, 0.019631659612059593, -0.2149261236190796, -0.020671725273132324, 0.06799837201833725, -0.04828084260225296, -0.0022455095313489437, -0.019154418259859085, 0.03060556948184967, 0.07118745148181915, 0.06662483513355255, -0.008223235607147217, 0.04078114032745361, 0.002477803034707904, -0.04111843183636665, 0.005046389531344175, 0.015769492834806442, 0.0016257762908935547, -0.09229632467031479, 0.06661305576562881, -0.16755518317222595, 0.05424373224377632, -0.08128474652767181, -0.06260967999696732, -0.009263207204639912, 0.034962452948093414, -0.054396457970142365, 0.04791352152824402, 0.10115863382816315, 0.0075232600793242455, -0.032317593693733215, -0.050817299634218216, -0.03972724452614784, -0.0014734352007508278, -0.1398703157901764, -0.14837034046649933, -0.09151425212621689, -0.09747706353664398, 0.11042709648609161, 0.001969041768461466, -0.15724746882915497, -0.0015884727472439408, -0.0997534990310669, 0.06987065821886063, -0.16953794658184052, 0.09285256266593933, 0.0341578908264637, 0.015343928709626198, -0.015839118510484695, -0.06526738405227661, 0.05342889577150345, 0.07730220258235931, -0.12141184508800507, -0.08985189348459244 ]
null
null
transformers
A fine-tuned model based on 'DeepChem/ChemBERTa-77M-MLM' for blood-brain barrier (BBB) permeability prediction from SMILES strings. BiLSTM models are also available, alongside these two transformer models, at https://github.com/mephisto121/BBBNLP if you want to try them all and review the code. [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1jGYf3sq93yO4EbgVaEl3nlClrVatVaXS#scrollTo=AMEdQItmilAw)
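A minimal inference sketch (assuming the checkpoint exposes the standard `AutoModelForSequenceClassification` interface; the example SMILES string and the interpretation of the output indices are illustrative, not taken from the original repository):

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Load the fine-tuned ChemBERTa classifier and its tokenizer
model_name = "Parsa/BBB_prediction_classification_SMILES"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Score a single SMILES string (caffeine is used here purely as an example input)
smiles = "CN1C=NC2=C1C(=O)N(C(=O)N2C)C"
inputs = tokenizer(smiles, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.softmax(logits, dim=-1)
# Which index corresponds to "BBB permeable" depends on the checkpoint's label mapping
print(probs)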
{}
text-classification
Parsa/BBB_prediction_classification_SMILES
[ "transformers", "pytorch", "safetensors", "roberta", "text-classification", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #safetensors #roberta #text-classification #autotrain_compatible #endpoints_compatible #region-us
A fine-tuned model based on 'DeepChem/ChemBERTa-77M-MLM' for blood-brain barrier (BBB) permeability prediction from SMILES strings. BiLSTM models are also available, alongside these two transformer models, at URL if you want to try them all and review the code. ![Open In Colab](URL)
[]
[ "TAGS\n#transformers #pytorch #safetensors #roberta #text-classification #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 42 ]
[ "passage: TAGS\n#transformers #pytorch #safetensors #roberta #text-classification #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.028687525540590286, 0.05357837677001953, -0.008051144890487194, 0.02121330052614212, 0.18056228756904602, 0.008536121807992458, 0.10035593807697296, 0.09559491276741028, 0.021165013313293457, -0.015957044437527657, 0.1127614974975586, 0.22432823479175568, -0.04488643258810043, 0.14961805939674377, -0.1361830085515976, -0.2550555169582367, 0.09017318487167358, 0.02369609847664833, 0.027832787483930588, 0.12280373275279999, 0.09783495962619781, -0.09009058773517609, 0.05287287011742592, -0.05070585757493973, -0.09785091131925583, 0.023647451773285866, 0.05718918889760971, -0.1523657888174057, 0.10643739998340607, 0.025555817410349846, 0.17115284502506256, 0.048184722661972046, -0.05446022376418114, -0.15328837931156158, 0.04500175639986992, 0.008868418633937836, -0.08748690783977509, 0.03704885393381119, 0.07283347100019455, -0.1362168937921524, 0.0197979137301445, 0.005970855243504047, 0.030576243996620178, 0.05220610648393631, -0.13764233887195587, -0.08703942596912384, -0.015679989010095596, 0.03648972511291504, 0.08004462718963623, 0.07187248021364212, -0.013943949714303017, 0.18568210303783417, -0.12566176056861877, 0.1375054270029068, 0.09868274629116058, -0.29195722937583923, -0.020140940323472023, 0.07208111882209778, 0.03356672823429108, 0.052708398550748825, -0.053229283541440964, 0.04895586520433426, 0.03549731895327568, -0.006376679986715317, 0.014970123767852783, -0.06349419802427292, -0.1349799633026123, -0.0054835183545947075, -0.07942013442516327, -0.04219941049814224, 0.17521651089191437, -0.06053846701979637, 0.048617299646139145, -0.06496677547693253, -0.10622873157262802, -0.007505121640861034, -0.020201539620757103, 0.02106471173465252, -0.047393154352903366, 0.042985763400793076, 0.02648822031915188, 0.025213440880179405, -0.11297889053821564, 0.018354615196585655, -0.19332897663116455, 0.2449568808078766, 0.022367993369698524, 0.03844251111149788, -0.16031219065189362, 0.0455021858215332, 0.051323529332876205, -0.11527538299560547, 0.05237147957086563, -0.11573725193738937, 0.04763741418719292, -0.026028281077742577, -0.04576777294278145, -0.051783908158540726, 0.13541893661022186, 0.16378942131996155, -0.010150546208024025, 0.052606355398893356, -0.044483087956905365, 0.08205068856477737, 0.0311953816562891, 0.09590413421392441, 0.03716582804918289, -0.017036469653248787, 0.08940738439559937, -0.07865145802497864, 0.043769966810941696, -0.05804463103413582, -0.1130538210272789, -0.014178289100527763, 0.11133165657520294, 0.11841985583305359, 0.019680097699165344, 0.09998185932636261, -0.053439658135175705, 0.009848693385720253, 0.09483878314495087, -0.09009052067995071, -0.00286822859197855, 0.02751816436648369, 0.042749010026454926, 0.03670160844922066, -0.013163954950869083, 0.0001406454248353839, -0.07429168373346329, 0.12228423357009888, -0.04441437870264053, -0.010570630431175232, -0.016430392861366272, -0.07107995450496674, 0.041426777839660645, -0.11925705522298813, 0.03486739099025726, -0.20263424515724182, -0.10825727134943008, -0.004694396164268255, -0.0006538681336678565, 0.011702848598361015, -0.025432176887989044, -0.023370537906885147, -0.008629834279417992, 0.023224301636219025, -0.05615955963730812, -0.09900694340467453, -0.07529179751873016, 0.11329738795757294, -0.023960789665579796, 0.06649302691221237, -0.09213969111442566, 0.043317120522260666, -0.12282411754131317, -0.023155461996793747, -0.13906267285346985, 0.021420307457447052, -0.060733452439308167, 0.19523757696151733, 0.024639148265123367, -0.03426804021000862, -0.053747087717056274, 
0.06121164932847023, -0.06189485266804695, 0.18846091628074646, -0.05172055587172508, -0.08511020243167877, 0.25352224707603455, -0.14154402911663055, -0.1520945280790329, 0.10324986279010773, -0.01216895692050457, -0.006694854702800512, 0.12150928378105164, 0.20341061055660248, 0.08390066772699356, -0.0021429932676255703, 0.05450264737010002, 0.0910419151186943, -0.10571908950805664, -0.09156394749879837, -0.01977878250181675, 0.004970287438482046, -0.12940698862075806, 0.0509643591940403, 0.07108232378959656, 0.06502484530210495, -0.05122176557779312, -0.04652746766805649, -0.02547992393374443, -0.02631751261651516, 0.11796171963214874, 0.052819423377513885, 0.08379749953746796, -0.11410196125507355, -0.01301014143973589, -0.06504908949136734, -0.003103498136624694, 0.01162157766520977, 0.009154658764600754, -0.08006343990564346, 0.11529769003391266, 0.03819761052727699, 0.032176423817873, -0.2082485556602478, -0.10716935992240906, -0.017610061913728714, 0.12259180843830109, -0.03385048732161522, 0.0499425083398819, 0.05587891861796379, -0.025704186409711838, -0.01693795621395111, -0.051369018852710724, 0.1861991435289383, 0.022664329037070274, -0.0417642779648304, -0.07898439466953278, 0.09121650457382202, -0.07767072319984436, 0.04707082360982895, -0.0945407822728157, 0.033437579870224, 0.0749025046825409, 0.10365236550569534, 0.019974110648036003, 0.06828706711530685, -0.0035114099737256765, 0.053780265152454376, -0.0629541277885437, 0.010569148696959019, 0.10277323424816132, -0.0005315540474839509, -0.07141530513763428, 0.1387539505958557, -0.1787014901638031, 0.3387095332145691, 0.20280465483665466, -0.2447892725467682, -0.018307527527213097, -0.019603203982114792, -0.000624000676907599, 0.03637458384037018, 0.011823481880128384, 0.02661529742181301, 0.03594260662794113, -0.008607544004917145, 0.18841056525707245, -0.04855750501155853, -0.04329128563404083, 0.005911396816372871, -0.06494293361902237, -0.02638426050543785, 0.10038795322179794, 0.005017680115997791, -0.20793050527572632, 0.18424390256404877, 0.2098608911037445, 0.013231195509433746, 0.15390662848949432, -0.022132787853479385, 0.06091843545436859, 0.09437524527311325, -0.0036855540238320827, -0.011818736791610718, -0.06407582759857178, -0.12641794979572296, -0.029347272589802742, 0.06379223614931107, 0.0192562285810709, 0.04896288737654686, -0.11387570202350616, -0.05276139825582504, -0.014295121654868126, 0.014125852845609188, -0.01702273078262806, 0.08024405688047409, 0.06200481578707695, 0.1316721886396408, -0.02840934693813324, -0.08182787150144577, 0.09868922084569931, -0.012210250832140446, -0.07807506620883942, 0.2041827142238617, -0.12403123080730438, -0.35641878843307495, -0.12418276071548462, -0.14684775471687317, -0.01430930383503437, 0.057747047394514084, 0.10973217338323593, -0.11890494078397751, -0.04324058070778847, -0.0007837987504899502, -0.026301728561520576, -0.0025604928378015757, 0.03995449095964432, -0.053971074521541595, 0.08929763734340668, -0.02694692276418209, -0.06773676723241806, -0.06545247882604599, -0.04588039964437485, -0.04340643808245659, 0.17999723553657532, -0.0960284024477005, 0.08838844299316406, 0.14430342614650726, -0.014337943866848946, 0.03445785865187645, -0.05251175910234451, 0.14604978263378143, -0.08839847892522812, -0.01784934476017952, 0.20199494063854218, -0.08590759336948395, 0.08055099844932556, 0.16898398101329803, 0.011480960994958878, -0.06806742399930954, 0.05007896199822426, -0.046182781457901, -0.08831911534070969, -0.23398305475711823, -0.14831389486789703, 
-0.07547327876091003, 0.06805048882961273, 0.040501225739717484, 0.07742555439472198, 0.12228664010763168, 0.08204404264688492, -0.013154292479157448, -0.05536463484168053, 0.057813726365566254, 0.08001340925693512, 0.2091694325208664, 0.011235128156840801, 0.15463800728321075, -0.07005049288272858, -0.15075640380382538, 0.07244475930929184, -0.014865296892821789, 0.10284514725208282, 0.07300306111574173, -0.03617596998810768, 0.022798115387558937, 0.08151102066040039, 0.16239064931869507, 0.13600873947143555, 0.03619614988565445, -0.028352156281471252, -0.012917578220367432, -0.008862030692398548, -0.08344098925590515, -0.011058352887630463, -0.011077409610152245, -0.11227833479642868, -0.08565634489059448, -0.10095768421888351, 0.12476317584514618, 0.08664648234844208, 0.03475816920399666, -0.20997588336467743, 0.0068705277517437935, 0.10255434364080429, -0.014334184117615223, -0.09322701394557953, 0.09463687241077423, -0.036214374005794525, -0.1271391361951828, 0.11243819445371628, -0.037678636610507965, 0.1123918890953064, -0.07785673439502716, 0.07917766273021698, -0.09205556660890579, -0.10741183906793594, 0.008469371125102043, 0.10393676906824112, -0.24764928221702576, 0.23624561727046967, 0.0063825384713709354, -0.02396906539797783, -0.06987612694501877, -0.02889864332973957, 0.049811311066150665, 0.20977729558944702, 0.11106166988611221, -0.010705395601689816, -0.0931035503745079, -0.16144949197769165, -0.036591190844774246, 0.024681925773620605, 0.1129908487200737, -0.012363116256892681, -0.0007720108842477202, -0.04980746656656265, -0.026892034336924553, -0.033920109272003174, -0.09794943034648895, 0.005899900104850531, -0.13991780579090118, 0.025717465206980705, 0.05151733011007309, 0.07728003710508347, -0.019772108644247055, -0.04247400909662247, -0.11907421797513962, 0.17629963159561157, -0.0943412333726883, -0.08572119474411011, -0.1030087023973465, -0.08951152116060257, 0.008425500243902206, -0.0853089839220047, 0.06565527617931366, -0.08078784495592117, 0.029935374855995178, -0.07544174045324326, -0.1839129775762558, 0.1162954717874527, -0.12050503492355347, -0.07013790309429169, -0.04547905921936035, 0.15918822586536407, -0.07179775834083557, -0.011930862441658974, 0.045498427003622055, 0.03160962834954262, -0.07723617553710938, -0.0847110003232956, -0.006070843432098627, -0.0025287463795393705, 0.05893851816654205, 0.08181621879339218, -0.09463334828615189, -0.17228421568870544, -0.04075782373547554, 0.005448542069643736, 0.2634821832180023, 0.22065258026123047, -0.05522479861974716, 0.1278686821460724, 0.16999582946300507, -0.047345586121082306, -0.35681891441345215, -0.10636944323778152, -0.13432654738426208, -0.05758921056985855, -0.022506171837449074, -0.0961056798696518, 0.1178196594119072, 0.014942457899451256, -0.03949359804391861, 0.06986929476261139, -0.14639140665531158, -0.08758215606212616, 0.20878203213214874, 0.0066594453528523445, 0.3874402344226837, -0.14591927826404572, -0.08249549567699432, -0.0636710375547409, -0.07315634936094284, 0.11284900456666946, -0.07798133045434952, 0.06157802417874336, -0.0018941548187285662, -0.00215904344804585, 0.0470365434885025, -0.05119771137833595, 0.08633345365524292, -0.036393966525793076, 0.04593115299940109, -0.12906810641288757, -0.08657682687044144, 0.04123641178011894, -0.019044602289795876, 0.0013887169770896435, -0.03687390685081482, 0.03321627900004387, -0.09738212078809738, -0.04325532913208008, -0.06387157738208771, 0.06891244649887085, 0.02934158593416214, -0.02799014374613762, 0.007533113472163677, 
-0.01600917987525463, -0.008880393579602242, 0.005680275149643421, 0.2624731957912445, -0.053298115730285645, 0.207061767578125, 0.1087726578116417, 0.14947538077831268, -0.11818893998861313, 0.08521092683076859, -0.0535595677793026, -0.07671057432889938, 0.05562760680913925, -0.05980202928185463, 0.07211682945489883, 0.09977073967456818, -0.05637345463037491, 0.05348524823784828, 0.10181916505098343, 0.053793616592884064, -0.01970146968960762, 0.17691640555858612, -0.24682003259658813, 0.0005763924564234912, -0.0343514010310173, -0.005487521644681692, 0.058634281158447266, 0.10069156438112259, 0.14047035574913025, 0.0444621741771698, -0.04863717406988144, -0.01693565584719181, 0.018980342894792557, 0.012691697105765343, 0.063657246530056, 0.07029462605714798, 0.04138794168829918, -0.12769213318824768, 0.0558137483894825, 0.039351146668195724, -0.1515320986509323, -0.008817311376333237, 0.13155698776245117, -0.1713404655456543, -0.1359114795923233, -0.003029818180948496, 0.12071827054023743, -0.0710398480296135, -0.07518527656793594, -0.08469753712415695, -0.15104129910469055, 0.03759932518005371, 0.23905500769615173, 0.11622844636440277, 0.07967973500490189, 0.0017419878859072924, -0.0443345345556736, -0.01727883890271187, 0.027810322120785713, 0.010590997524559498, 0.02177119255065918, -0.13366933166980743, 0.025486627593636513, -0.010413159616291523, 0.12143415212631226, -0.09977542608976364, -0.05290146544575691, -0.1783708930015564, 0.04474503919482231, -0.0660024955868721, -0.002245434559881687, -0.07691054046154022, -0.010120369493961334, -0.01692410558462143, -0.046502433717250824, -0.03150279447436333, -0.05717881768941879, -0.09545788913965225, 0.04733770713210106, -0.014737613499164581, 0.025869283825159073, -0.10075892508029938, -0.05455879122018814, 0.08499354869127274, -0.03517667204141617, 0.11838100105524063, 0.09590063244104385, -0.09763941913843155, 0.09349380433559418, -0.2064322978258133, -0.09777843207120895, 0.1414613276720047, -0.000040834365790942684, 0.04127771407365799, 0.0658639445900917, 0.0404028482735157, 0.08315624296665192, 0.003812352893874049, 0.08022204786539078, 0.07299994677305222, -0.11897262185811996, 0.08983026444911957, -0.018279602751135826, -0.16814731061458588, -0.03457396477460861, -0.06283866614103317, 0.09657544642686844, -0.0372745506465435, 0.175554558634758, -0.08704979717731476, 0.0849972814321518, -0.053200483322143555, 0.01616283878684044, -0.0161876417696476, -0.22524885833263397, -0.11346051096916199, -0.0551878847181797, 0.029724178835749626, -0.002843765774741769, 0.23892228305339813, 0.059128351509571075, 0.027141716331243515, 0.06242341175675392, 0.05767638608813286, 0.010062826797366142, 0.03636417165398598, 0.16902495920658112, 0.06896614283323288, -0.06225694715976715, -0.06868152320384979, 0.024818651378154755, 0.02150101587176323, -0.0840960368514061, 0.1198815181851387, 0.10814256221055984, -0.02691606804728508, 0.0555819533765316, -0.0110814543440938, 0.0631573498249054, -0.07232340425252914, -0.17918190360069275, -0.06016736477613449, 0.03748639300465584, 0.0328981839120388, 0.023631447926163673, 0.146884486079216, 0.0026433004532009363, 0.01722995936870575, -0.061607059091329575, -0.040862955152988434, -0.193616583943367, -0.07574476301670074, -0.11609566956758499, -0.07851025462150574, 0.0165273305028677, -0.08360549807548523, -0.04045351967215538, 0.06842494010925293, 0.04717373847961426, -0.05272261053323746, 0.10506433993577957, 0.03136230632662773, -0.03220399469137192, 0.0831017792224884, -0.02797803469002247, 
0.025991175323724747, 0.02994905784726143, -0.04604106768965721, -0.12475353479385376, -0.016080649569630623, -0.04435006529092789, 0.0447852648794651, -0.06299641728401184, 0.034183941781520844, -0.15767285227775574, -0.11725696176290512, -0.022203804925084114, 0.0733208954334259, -0.034305911511182785, 0.11806819587945938, 0.0181909017264843, -0.008031811565160751, 0.06349647790193558, 0.22537550330162048, -0.03213720768690109, -0.10814366489648819, -0.033361662179231644, 0.24464012682437897, 0.06281514465808868, 0.11121105402708054, -0.009794646874070168, -0.019673621281981468, -0.040056176483631134, 0.29380613565444946, 0.2987655997276306, -0.03834248334169388, 0.07157011330127716, -0.029159875586628914, 0.033186569809913635, 0.12427068501710892, 0.1324220597743988, 0.10565037280321121, 0.25367164611816406, -0.05410556122660637, -0.017002731561660767, -0.02023966982960701, -0.003884621663019061, -0.14741107821464539, 0.06076105684041977, 0.03571495786309242, -0.027772584930062294, -0.07183908671140671, 0.12135129421949387, -0.17450423538684845, 0.12612582743167877, -0.006723157595843077, -0.20173387229442596, -0.0672672912478447, -0.022110125049948692, 0.1437765657901764, -0.005568705033510923, 0.06927821785211563, -0.003113536164164543, -0.10825800150632858, -0.022247539833188057, 0.016543032601475716, -0.1690586656332016, -0.012068711221218109, 0.03188177943229675, -0.02499835379421711, 0.0685410276055336, -0.017356334254145622, 0.029364103451371193, 0.08199650049209595, 0.019354434683918953, -0.029578445479273796, 0.10188102722167969, 0.0064616333693265915, -0.03571145236492157, 0.041736286133527756, 0.008721070364117622, 0.001746322843246162, -0.0854547768831253, 0.07708802819252014, -0.1426282376050949, 0.05399128794670105, -0.06522337347269058, -0.07470900565385818, -0.02300148457288742, 0.07814358919858932, -0.052328191697597504, 0.04645124077796936, 0.07956159859895706, -0.008192131295800209, -0.0023664445616304874, -0.042661264538764954, -0.01466395054012537, -0.005791082512587309, -0.12926004827022552, -0.10192789882421494, -0.11432249844074249, -0.06986135244369507, 0.11943621933460236, 0.014220036566257477, -0.2061404585838318, 0.009836873039603233, -0.1346365213394165, 0.05065581202507019, -0.19764550030231476, 0.07481163740158081, 0.057710688561201096, 0.01441802829504013, -0.007551587652415037, -0.03651706501841545, 0.0420125313103199, 0.0855083242058754, -0.10152594745159149, -0.09494296461343765 ]
null
null
transformers
from transformers import MT5ForConditionalGeneration, AutoTokenizer

# Load the fine-tuned question-generation checkpoint and the matching mT5 tokenizer
model = MT5ForConditionalGeneration.from_pretrained("Parth/mT5-question-generator")
tokenizer = AutoTokenizer.from_pretrained("google/mt5-base")
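A hedged generation sketch continuing from the objects loaded above; the `answer: ... context: ...` prompt template is an assumption, since the card does not document the expected input format:

# Assumed prompt format; adjust to whatever the checkpoint was actually trained on
text = "answer: Paris context: Paris is the capital and most populous city of France."
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64, num_beams=4, early_stopping=True)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))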
{}
text2text-generation
Parth/mT5-question-generator
[ "transformers", "pytorch", "mt5", "text2text-generation", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #mt5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
from transformers import MT5ForConditionalGeneration, AutoTokenizer

model = MT5ForConditionalGeneration.from_pretrained("Parth/mT5-question-generator")
tokenizer = AutoTokenizer.from_pretrained("google/mt5-base")
[]
[ "TAGS\n#transformers #pytorch #mt5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 49 ]
[ "passage: TAGS\n#transformers #pytorch #mt5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.03247205540537834, 0.008802899159491062, -0.006354454439133406, 0.010667198337614536, 0.17799293994903564, 0.015382261015474796, 0.11927555501461029, 0.12881627678871155, 0.005648191086947918, -0.017856856808066368, 0.15311330556869507, 0.2192111611366272, -0.01395078282803297, 0.09104213118553162, -0.10423795878887177, -0.24964120984077454, 0.04000631347298622, 0.05688021332025528, 0.02192343771457672, 0.12785422801971436, 0.0683683529496193, -0.07686739414930344, 0.09180748462677002, -0.03782757371664047, -0.1629144847393036, 0.04483022168278694, 0.05165455862879753, -0.12006033957004547, 0.10880202054977417, 0.040760889649391174, 0.10833796113729477, 0.029112650081515312, -0.058540575206279755, -0.14441463351249695, 0.025394633412361145, 0.024162614718079567, -0.05531970411539078, 0.06695004552602768, 0.12746109068393707, -0.1069447323679924, 0.06599201261997223, 0.06857258081436157, -0.01249475497752428, 0.06093573942780495, -0.14540249109268188, 0.026302598416805267, -0.033595893532037735, 0.000501884613186121, 0.09577777981758118, 0.09955472499132156, -0.0014538326067849994, 0.1336830109357834, -0.1190258339047432, 0.11686736345291138, 0.15553653240203857, -0.29510700702667236, 0.010198472999036312, 0.08669993281364441, 0.03853175416588783, 0.07914209365844727, -0.01894368603825569, 0.04763365536928177, 0.031193047761917114, 0.023923354223370552, 0.01714995875954628, -0.07414375990629196, -0.10484828799962997, 0.039966754615306854, -0.09946644306182861, -0.06046971678733826, 0.23978057503700256, -0.06425098329782486, 0.07130776345729828, -0.01474788412451744, -0.11349231004714966, -0.06468478590250015, -0.0019674503710120916, 0.008873797953128815, -0.058908045291900635, 0.07076217234134674, 0.010395191609859467, -0.040165528655052185, -0.11989294737577438, -0.01339038833975792, -0.1835329830646515, 0.11016619950532913, -0.008551111444830894, 0.05852426216006279, -0.21846306324005127, 0.09113515168428421, 0.013130535371601582, -0.10046662390232086, 0.05681027099490166, -0.09862425923347473, 0.03388199210166931, -0.023057246580719948, -0.06931141763925552, -0.1132219061255455, 0.05456939712166786, 0.09576255828142166, 0.04512261971831322, 0.038082338869571686, -0.07074864208698273, 0.09369611740112305, 0.024160603061318398, 0.06751588732004166, 0.007608450949192047, -0.07217174768447876, 0.04148251190781593, -0.11202318966388702, -0.0015243865782395005, -0.0692119225859642, -0.15127044916152954, -0.05819070711731911, 0.05758891627192497, 0.10242556035518646, 0.0243549607694149, 0.07375647127628326, -0.0398036390542984, -0.03976113721728325, 0.033358264714479446, -0.07932782918214798, -0.004407680127769709, 0.00710319634526968, -0.004596755839884281, 0.16020210087299347, 0.009324765764176846, 0.018765978515148163, -0.13253585994243622, 0.09554686397314072, -0.07744433730840683, 0.014230625703930855, -0.0509052649140358, -0.060733065009117126, 0.03315047547221184, -0.09138786792755127, 0.011818391270935535, -0.15415462851524353, -0.17493492364883423, 0.005892942193895578, 0.0050512561574578285, -0.01598230004310608, -0.031002094969153404, -0.043078236281871796, -0.03290363401174545, 0.05322432518005371, -0.07414176315069199, 0.036153875291347504, -0.04339534789323807, 0.08642607182264328, -0.04057873412966728, 0.06381313502788544, -0.10164246708154678, 0.08162252604961395, -0.11347855627536774, -0.0016376032726839185, -0.10024012625217438, 0.061632830649614334, 0.004888607654720545, 0.12682358920574188, -0.03581279516220093, -0.02464127726852894, -0.07926245778799057, 
0.03483397513628006, -0.011945287697017193, 0.21094636619091034, -0.08152677118778229, -0.11021365225315094, 0.1879364550113678, -0.07468575984239578, -0.1403008997440338, 0.08478160947561264, 0.005911586340516806, 0.03139251470565796, 0.08236317336559296, 0.1588445007801056, 0.048823945224285126, -0.007978598587214947, 0.08611413091421127, 0.09610417485237122, -0.09832625091075897, -0.13185659050941467, 0.013133253902196884, -0.019363515079021454, -0.13396760821342468, 0.04520009085536003, 0.10029850900173187, 0.06343046575784683, -0.0690971314907074, -0.035642366856336594, -0.0428784117102623, -0.011879811063408852, 0.07858917117118835, 0.0011438956717029214, 0.13090534508228302, -0.05680390074849129, 0.013213207945227623, 0.021459300071001053, -0.011428453028202057, -0.01332840695977211, 0.03819642961025238, -0.024233844131231308, 0.10694291442632675, -0.08024387806653976, 0.03428592160344124, -0.2044709473848343, -0.048934467136859894, -0.007900926284492016, 0.12092742323875427, -0.004612825810909271, 0.08112660050392151, 0.058829180896282196, -0.02011093869805336, -0.012360739521682262, -0.020902158692479134, 0.1545429825782776, -0.01283858623355627, -0.10395924001932144, -0.07652033865451813, 0.036482252180576324, -0.06376958638429642, -0.024659687653183937, -0.06699222326278687, 0.017639730125665665, 0.0051707434467971325, 0.12165486812591553, 0.0072322734631598, 0.05250490456819534, -0.024619469419121742, 0.03858618065714836, -0.0804455429315567, 0.021691903471946716, 0.10708007216453552, -0.009018800221383572, -0.054880354553461075, 0.20170986652374268, -0.18809448182582855, 0.23027707636356354, 0.19894523918628693, -0.29437363147735596, 0.036204781383275986, -0.0962168350815773, -0.020468637347221375, -0.002135684248059988, 0.04397270828485489, -0.023481864482164383, 0.0777517557144165, 0.009088986553251743, 0.1900269091129303, -0.05207496136426926, -0.058402013033628464, -0.0053734490647912025, -0.05356953293085098, -0.029290076345205307, 0.07178544998168945, 0.12434651702642441, -0.17011547088623047, 0.17978157103061676, 0.23428833484649658, 0.01473091822117567, 0.1620381623506546, 0.011644197627902031, -0.04454120993614197, 0.062099479138851166, -0.024961771443486214, -0.03766492009162903, -0.09724601358175278, -0.18492409586906433, -0.027101896703243256, 0.07674476504325867, 0.05902737006545067, 0.1182398870587349, -0.10352705419063568, -0.028240500018000603, -0.008812467567622662, 0.016684215515851974, -0.009518086910247803, 0.0760439783334732, 0.07472711056470871, 0.12874123454093933, -0.019543487578630447, -0.000046879617002559826, 0.1103324145078659, 0.0010127548594027758, -0.12079042196273804, 0.186933696269989, -0.14518344402313232, -0.33166664838790894, -0.18088701367378235, -0.19747640192508698, -0.06222398951649666, 0.050322093069553375, 0.09170646220445633, -0.10376584529876709, -0.024963803589344025, 0.0044587342999875546, 0.09943562000989914, -0.09480268508195877, 0.031279709190130234, -0.05548911541700363, 0.06716878712177277, -0.06619614362716675, -0.07650262862443924, -0.04839160293340683, -0.019261548295617104, -0.04822634160518646, 0.1396450400352478, -0.11354948580265045, 0.048589661717414856, 0.19766056537628174, 0.021576598286628723, 0.05424954742193222, -0.01823168620467186, 0.17412340641021729, -0.06385006755590439, -0.009741626679897308, 0.23295654356479645, -0.05166426673531532, 0.08091825246810913, 0.12770976126194, -0.0020878969226032495, -0.07671891152858734, 0.03864402696490288, -0.03631149232387543, -0.08980801701545715, -0.2765643894672394, 
-0.13049930334091187, -0.12615323066711426, 0.07300391048192978, 0.06304160505533218, 0.043250467628240585, 0.1603052318096161, 0.07496021687984467, -0.011053545400500298, 0.024150917306542397, -0.0006156670860946178, 0.07454051077365875, 0.17188118398189545, -0.016544366255402565, 0.14714473485946655, -0.05331993103027344, -0.1080305203795433, 0.07841064035892487, 0.06226341798901558, 0.13590948283672333, 0.06920741498470306, 0.021969614550471306, 0.015656007453799248, 0.07521288096904755, 0.16085536777973175, 0.15746264159679413, 0.041117534041404724, -0.015303169377148151, -0.016506079584360123, -0.022872988134622574, -0.06463537365198135, 0.0432131253182888, 0.05577348172664642, -0.11483434587717056, -0.08495260775089264, -0.0724823921918869, 0.07664936780929565, 0.10974811017513275, 0.06682390719652176, -0.23741473257541656, 0.016966620460152626, 0.0871424600481987, -0.0434100441634655, -0.09427584707736969, 0.08946286141872406, 0.0038104455452412367, -0.1368124932050705, 0.05448998883366585, -0.05135731026530266, 0.13966992497444153, -0.04810076206922531, 0.0900053158402443, -0.05872584879398346, -0.06323063373565674, 0.020664479583501816, 0.10692714154720306, -0.3541460931301117, 0.20758792757987976, 0.011527951806783676, -0.06093791872262955, -0.11825107038021088, -0.0057329474948346615, 0.01827836222946644, 0.11354909837245941, 0.07954831421375275, 0.005323044024407864, -0.06418422609567642, -0.10157541930675507, -0.034152161329984665, 0.004723701626062393, 0.13940384984016418, -0.02400369942188263, 0.003448218572884798, -0.04438253492116928, -0.020158851519227028, -0.023178286850452423, -0.000718157272785902, -0.006451705005019903, -0.1613864004611969, 0.0666336938738823, 0.025228632614016533, 0.06822126358747482, 0.01730911061167717, -0.02732805535197258, -0.07030908018350601, 0.20019540190696716, -0.055100217461586, -0.0734892338514328, -0.13220442831516266, -0.0626208633184433, 0.06701112538576126, -0.07306446135044098, 0.04298105835914612, -0.06441843509674072, 0.0504937544465065, -0.06429126858711243, -0.23106469213962555, 0.12129522860050201, -0.10896743834018707, -0.04214934632182121, -0.05486045777797699, 0.19954514503479004, -0.07281329482793808, 0.01817229390144348, 0.017324624583125114, 0.013015411794185638, -0.09135245531797409, -0.07786673307418823, 0.014232970774173737, 0.01847606711089611, 0.058559149503707886, 0.05814133957028389, -0.08037004619836807, -0.014857451431453228, -0.03221143037080765, -0.009274402633309364, 0.32020795345306396, 0.13638851046562195, -0.04181653633713722, 0.1722007840871811, 0.14273306727409363, -0.09682104736566544, -0.32250353693962097, -0.0542713925242424, -0.09283889830112457, -0.02921893633902073, -0.04146459698677063, -0.16693620383739471, 0.09226667881011963, 0.025867247954010963, -0.0081669632345438, 0.11432857066392899, -0.26718688011169434, -0.08819488435983658, 0.13733826577663422, 0.011434170417487621, 0.36291927099227905, -0.10883680731058121, -0.11266031861305237, -0.07516881823539734, -0.14414578676223755, 0.14854134619235992, -0.046829547733068466, 0.09237111359834671, -0.03225376084446907, 0.1023406982421875, 0.04431767761707306, -0.05918179079890251, 0.0826093852519989, 0.03412443771958351, 0.011541778221726418, -0.09786099195480347, -0.024788152426481247, 0.043923161923885345, -0.01647278480231762, 0.0186665840446949, -0.018545938655734062, 0.02520407736301422, -0.1436202973127365, -0.030395017936825752, -0.08319204300642014, 0.054647669196128845, 0.024612871930003166, -0.06083676591515541, 0.035113632678985596, 
-0.07397028058767319, 0.015755590051412582, 0.0031852610409259796, 0.1968139111995697, -0.04927601292729378, 0.16436325013637543, 0.17140397429466248, 0.1386384516954422, -0.14085504412651062, 0.0313987135887146, -0.06756629794836044, -0.06380796432495117, 0.07617787271738052, -0.09445741772651672, 0.07254795730113983, 0.12724515795707703, -0.04090844467282295, 0.06647319346666336, 0.11278997361660004, 0.02755293995141983, -0.01717589795589447, 0.12903688848018646, -0.24685613811016083, 0.05759644880890846, -0.0774354338645935, 0.007829134352505207, 0.06760475039482117, 0.0691077932715416, 0.17701676487922668, 0.022113438695669174, -0.03186488151550293, -0.018044183030724525, 0.014400118961930275, -0.05583859980106354, 0.10128172487020493, 0.01910126954317093, 0.03747108206152916, -0.14913128316402435, 0.10593283176422119, 0.014065378345549107, -0.15610112249851227, -0.0030291201546788216, 0.18251965939998627, -0.1271468847990036, -0.1175045371055603, -0.005189466755837202, 0.11735180765390396, -0.1323971152305603, -0.027732165530323982, -0.056806646287441254, -0.13107746839523315, 0.1038786843419075, 0.1834852397441864, 0.06416597962379456, 0.0826336219906807, -0.05614696815609932, -0.05633258447051048, -0.05537234991788864, -0.012746404856443405, -0.004845258314162493, 0.02958609163761139, -0.0928758755326271, 0.08036726713180542, -0.03491966053843498, 0.14262840151786804, -0.0898408517241478, -0.06918130815029144, -0.15198032557964325, 0.042447883635759354, -0.1359063982963562, -0.06605049222707748, -0.08334993571043015, -0.06526677310466766, -0.018778005614876747, -0.01698305644094944, -0.06606939435005188, -0.04105973616242409, -0.1284734159708023, 0.024907397106289864, -0.05333518236875534, 0.03229110315442085, -0.05507495999336243, -0.0035495858173817396, 0.07976923137903214, -0.049493879079818726, 0.1077856719493866, 0.1420586258172989, -0.09717884659767151, 0.11032459139823914, -0.11992289125919342, -0.11299397796392441, 0.11099837720394135, 0.027232574298977852, 0.05815976485610008, 0.06443625688552856, 0.022027408704161644, 0.0828590840101242, 0.02554231509566307, 0.042817723006010056, 0.02065151557326317, -0.12103132158517838, 0.030482446774840355, -0.04751116782426834, -0.14892028272151947, -0.06529612094163895, -0.054095201194286346, 0.061603277921676636, 0.011807980015873909, 0.12336400151252747, -0.04274550825357437, 0.12397313863039017, -0.07301304489374161, 0.01582331769168377, 0.004852397367358208, -0.1726864129304886, -0.047937918454408646, -0.07537665218114853, 0.04021551460027695, 0.0031554480083286762, 0.24552465975284576, 0.01341304462403059, 0.0232835803180933, 0.03332333266735077, 0.08759349584579468, -0.006232629995793104, 0.02662578783929348, 0.19984489679336548, 0.10153228044509888, -0.04841103032231331, -0.07740340381860733, 0.0841817706823349, 0.027718786150217056, 0.045756641775369644, 0.1580987125635147, 0.05288330838084221, 0.02733166329562664, 0.10682656615972519, -0.00838334672152996, 0.026530034840106964, -0.10955263674259186, -0.15393203496932983, -0.013403541408479214, 0.0689178928732872, -0.013366389088332653, 0.06380236148834229, 0.15174338221549988, -0.031019950285553932, 0.035003937780857086, -0.02637603133916855, -0.04663315415382385, -0.18330888450145721, -0.13960829377174377, -0.08663944154977798, -0.11958782374858856, -0.0004337868595030159, -0.10688566416501999, 0.054552797228097916, 0.09666495025157928, 0.06784127652645111, -0.05839335545897484, 0.0752289667725563, 0.03856367990374565, -0.09408316761255264, 0.060062143951654434, 
-0.04271860048174858, 0.07346413284540176, -0.01862303353846073, -0.01636376976966858, -0.08663247525691986, -0.015964046120643616, -0.017242593690752983, 0.047769226133823395, -0.0590713806450367, 0.013490854762494564, -0.13409970700740814, -0.11919060349464417, -0.03309672325849533, 0.045155834406614304, -0.04299371317028999, 0.15545444190502167, -0.0017370387213304639, -0.022618813440203667, 0.025549639016389847, 0.21336813271045685, -0.09325596690177917, -0.05712331831455231, -0.04163253679871559, 0.23262251913547516, 0.060663092881441116, 0.10347004979848862, -0.01984133943915367, 0.010419715195894241, -0.0872872993350029, 0.3477313220500946, 0.2864106297492981, -0.0785008892416954, 0.01852458342909813, 0.025229377672076225, 0.032489433884620667, 0.11828687787055969, 0.14369255304336548, 0.0973535031080246, 0.23375727236270905, -0.07746781408786774, -0.008508224971592426, -0.026193976402282715, -0.022704333066940308, -0.07821500301361084, 0.13037200272083282, 0.022821061313152313, -0.07975984364748001, -0.02735958993434906, 0.09264971315860748, -0.23305658996105194, 0.15770022571086884, -0.09615077078342438, -0.18416492640972137, -0.0657498687505722, -0.004348475951701403, 0.13509869575500488, 0.0012923850445076823, 0.0851651206612587, -0.00950760766863823, -0.09292657673358917, 0.03073572926223278, 0.018268031999468803, -0.21988633275032043, -0.02724556252360344, 0.04685826599597931, -0.07749628275632858, -0.018672892823815346, -0.004388496745377779, 0.0267090555280447, 0.07139899581670761, 0.06441280245780945, -0.048781443387269974, 0.03654508665204048, 0.003955124877393246, -0.0477292500436306, 0.023610156029462814, 0.07727914303541183, 0.013922389596700668, -0.05439634621143341, 0.04671293869614601, -0.1666029691696167, 0.039284031838178635, -0.049411799758672714, -0.033983949571847916, 0.0128423310816288, -0.011059774085879326, -0.03385746479034424, 0.06773266941308975, 0.08311868458986282, -0.008829239755868912, -0.0032988465391099453, -0.06938329339027405, -0.0213033314794302, -0.020847970619797707, -0.08694157749414444, -0.10070307552814484, -0.148762509226799, -0.09853886067867279, 0.0973188504576683, -0.0049340552650392056, -0.21099382638931274, 0.004006146918982267, -0.09866927564144135, 0.02192481979727745, -0.20368202030658722, 0.08495088666677475, 0.06789152324199677, 0.01056761760264635, -0.0028865975327789783, -0.07395254075527191, 0.05188513174653053, 0.10077827423810959, -0.11843311786651611, -0.09581757336854935 ]
null
null
null
'hello'
{}
null
Patrickdg/distilbert-consumer-complaints
[ "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #region-us
'hello'
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
[ 0.024608636274933815, -0.026205500587821007, -0.009666500613093376, -0.10395516455173492, 0.08638657629489899, 0.059816278517246246, 0.01882290467619896, 0.020661840215325356, 0.23975107073783875, -0.005599027033895254, 0.1219947561621666, 0.0015615287702530622, -0.037353623658418655, 0.03733762726187706, -0.0035912662278860807, -0.17583473026752472, 0.03876631706953049, -0.018274923786520958, 0.01843859627842903, 0.026470553129911423, -0.07776834815740585, -0.07564429938793182, 0.015296397730708122, -0.10247814655303955, -0.083692267537117, 0.11002834886312485, 0.031466204673051834, -0.019670886918902397, 0.10779199749231339, -0.04243955761194229, 0.18699054419994354, -0.011512263678014278, -0.11213519424200058, -0.2536850869655609, 0.021806683391332626, -0.01765260472893715, -0.08747660368680954, 0.01506110467016697, 0.0665089413523674, -0.09014441072940826, -0.0588928684592247, 0.0795099288225174, -0.01132340170443058, 0.04246443510055542, -0.27593839168548584, -0.12684126198291779, -0.05297930911183357, -0.1421966552734375, 0.08651168644428253, 0.04035491496324539, 0.008764253929257393, 0.15506891906261444, -0.20897391438484192, 0.004104613792151213, 0.08255259692668915, -0.2538507878780365, 0.05591634660959244, 0.17671173810958862, 0.03623908758163452, 0.18037272989749908, 0.0060391901060938835, 0.11029672622680664, 0.0716743916273117, -0.024263937026262283, -0.17590197920799255, -0.08127854019403458, -0.04696211963891983, 0.16642488539218903, -0.06727185100317001, -0.14248386025428772, 0.34701237082481384, 0.00015008423360995948, 0.009657775051891804, 0.16921205818653107, -0.059524230659008026, -0.09972117841243744, 0.07259953022003174, 0.016484731808304787, 0.018492350354790688, 0.1471305936574936, 0.16307872533798218, -0.0458691343665123, -0.13837823271751404, -0.018630273640155792, -0.22798998653888702, 0.17510560154914856, -0.03248048573732376, 0.13137903809547424, -0.27447956800460815, 0.01684025302529335, -0.2570667266845703, 0.0032130838371813297, 0.04178816080093384, -0.06004921346902847, -0.0226522795855999, -0.013265985064208508, -0.08018817007541656, 0.004899587947875261, 0.06192673370242119, 0.1266920566558838, -0.06128726154565811, 0.06128238886594772, -0.09319206327199936, 0.141696035861969, 0.07166698575019836, 0.07868369668722153, 0.13037432730197906, 0.041205424815416336, -0.07187089323997498, -0.21872246265411377, -0.0026476888451725245, -0.06275863200426102, -0.09502086788415909, -0.0020165652967989445, -0.11606067419052124, 0.17244569957256317, -0.030802514404058456, -0.09825427830219269, -0.11208184063434601, 0.09148659557104111, -0.032992321997880936, -0.03437839448451996, -0.03552987426519394, -0.020977836102247238, 0.019381176680326462, 0.04704452306032181, -0.1548958420753479, -0.005131472367793322, 0.07039852440357208, 0.11502562463283539, -0.1346137970685959, -0.003783059772104025, -0.07908964157104492, 0.03039063885807991, 0.07654735445976257, -0.16510222852230072, 0.03158547356724739, -0.1124754324555397, -0.07531405985355377, 0.002912673633545637, -0.015710093080997467, -0.016202643513679504, 0.166526660323143, -0.0020451415330171585, 0.0714716836810112, -0.026345307007431984, -0.05890209600329399, -0.11243434250354767, -0.08489254862070084, 0.05390460044145584, 0.03670717030763626, 0.03266148269176483, -0.2193479984998703, 0.014805203303694725, -0.12762966752052307, 0.1360815018415451, -0.10566820204257965, -0.04705966264009476, -0.022842247039079666, 0.20562705397605896, 0.037286072969436646, 0.08762791007757187, -0.22171171009540558, 
0.039756543934345245, -0.05404696613550186, 0.18480908870697021, -0.1502426266670227, -0.0799463614821434, 0.20813211798667908, -0.07964949309825897, -0.10115210711956024, 0.021235812455415726, 0.020391687750816345, 0.026287272572517395, 0.0766737088561058, 0.4564172327518463, -0.09766800701618195, -0.09146861732006073, 0.10178250074386597, 0.17055274546146393, -0.12427149713039398, -0.1827561855316162, 0.06446871906518936, -0.16666454076766968, -0.1973118633031845, 0.0018917324487119913, 0.09222044050693512, 0.038269978016614914, -0.07875611633062363, -0.020746968686580658, 0.06325206160545349, -0.0007678253459744155, 0.09095914661884308, 0.03755716234445572, 0.09034032374620438, -0.08716782182455063, 0.11115926504135132, -0.05017651244997978, 0.004037132486701012, 0.1343354731798172, 0.027325427159667015, -0.03223329409956932, 0.08694463223218918, -0.0485352948307991, 0.05295134335756302, -0.1662379503250122, -0.15068690478801727, 0.03398871049284935, 0.06283251196146011, 0.03186952322721481, 0.1280253529548645, 0.08141885697841644, -0.10732853412628174, 0.022690722718834877, -0.004228927195072174, 0.058398615568876266, 0.03891623765230179, 0.006107209715992212, 0.008764320984482765, 0.0961301177740097, -0.10607069730758667, -0.13589619100093842, -0.07336436957120895, -0.014715781435370445, 0.14371353387832642, -0.0302802175283432, 0.07690227776765823, -0.004240254405885935, 0.00013200697139836848, 0.06930823624134064, 0.08137880265712738, 0.016412746161222458, 0.08971183747053146, -0.05237193778157234, -0.05160155147314072, 0.10863113403320312, -0.13533565402030945, 0.17837053537368774, 0.14053137600421906, -0.20532016456127167, 0.029453208670020103, -0.06838275492191315, 0.03670361638069153, -0.008162540383636951, 0.0975119024515152, -0.08272241055965424, -0.02106042578816414, 0.013134466484189034, 0.0052274600602686405, -0.013007243163883686, 0.017682146281003952, -0.07295988500118256, -0.07787393033504486, -0.10233919322490692, 0.08436838537454605, 0.11562882363796234, -0.10282530635595322, 0.14214380085468292, 0.4384984076023102, 0.11495281755924225, 0.21582984924316406, -0.09581480920314789, -0.0412987545132637, 0.007486371789127588, 0.0001535322517156601, -0.04476691037416458, 0.08031861484050751, -0.15973517298698425, -0.038901735097169876, 0.027348900213837624, 0.07128690183162689, 0.11475157737731934, -0.14959022402763367, -0.09639324247837067, -0.00793045200407505, 0.0022841424215584993, -0.1249532699584961, 0.023905446752905846, -0.03974650055170059, 0.04015624523162842, 0.07232289016246796, -0.021535737439990044, 0.13939237594604492, -0.04166141897439957, -0.0639561116695404, 0.07585346698760986, -0.2017085999250412, -0.23179671168327332, -0.12309670448303223, -0.14680525660514832, 0.04366797208786011, 0.05154111236333847, 0.01726446859538555, -0.17635835707187653, -0.015074856579303741, 0.07706750929355621, 0.07820965349674225, -0.20886357128620148, -0.022814949974417686, -0.004290030337870121, 0.0895976573228836, -0.10227091610431671, -0.0017130117630586028, -0.04419664293527603, -0.10150232166051865, 0.0017003051470965147, 0.07279510796070099, -0.137485533952713, 0.13807645440101624, 0.21589438617229462, 0.07225540280342102, 0.07359948754310608, -0.019093448296189308, 0.09936179965734482, -0.10856141895055771, -0.16549113392829895, 0.08348225057125092, -0.06234746053814888, 0.047262318432331085, 0.17534415423870087, 0.03307317942380905, -0.13904969394207, -0.015682822093367577, -0.0402069091796875, -0.15603256225585938, -0.238995760679245, -0.09178274869918823, 
-0.1182505264878273, 0.16442428529262543, 0.0009358620154671371, 0.06651917099952698, 0.08258313685655594, -0.022042419761419296, 0.16447891294956207, -0.07379321753978729, -0.07578866183757782, -0.006978808436542749, 0.12375060468912125, -0.056660156697034836, -0.03080669604241848, -0.10566964000463486, -0.008295975625514984, 0.1151021271944046, 0.15304014086723328, 0.12214863300323486, 0.2957419455051422, 0.08268889784812927, 0.026645636186003685, 0.08958091586828232, 0.17622539401054382, 0.09495089203119278, 0.07838419824838638, -0.045413073152303696, -0.014814783819019794, 0.014317171648144722, -0.04022889584302902, 0.010141594335436821, 0.14683100581169128, -0.2679629921913147, -0.006678564939647913, -0.2710230350494385, 0.0965198427438736, -0.10913380235433578, 0.11837165057659149, -0.01015760749578476, 0.10194015502929688, 0.11082887649536133, 0.03233652561903, -0.03858073800802231, 0.16613617539405823, 0.08450309932231903, -0.11277695000171661, 0.001758623169735074, 0.03737903758883476, 0.09715615212917328, -0.02818971499800682, 0.12721189856529236, -0.11048974841833115, -0.1464834064245224, 0.013753619976341724, 0.07152791321277618, -0.15373679995536804, 0.3138748109340668, 0.012069208547472954, -0.13481520116329193, -0.01481647603213787, -0.09957809001207352, -0.006440147757530212, 0.1254177987575531, 0.09333524852991104, 0.07935678958892822, -0.2185502052307129, -0.13339371979236603, 0.05872276425361633, -0.00575496768578887, 0.22408108413219452, -0.034034017473459244, -0.11356475204229355, -0.027013886719942093, 0.04241163283586502, -0.06043251231312752, 0.08524788916110992, 0.023536119610071182, -0.08113526552915573, -0.032957352697849274, 0.05323701351881027, 0.012368366122245789, 0.00524376705288887, 0.09360801428556442, 0.020107939839363098, -0.0009265501867048442, 0.01785753294825554, 0.047885000705718994, -0.0675911232829094, -0.1984109878540039, 0.09357594698667526, -0.05215044692158699, 0.0015536568826064467, -0.08013670891523361, -0.15122665464878082, -0.08837161958217621, -0.16009655594825745, 0.12540200352668762, -0.034406669437885284, 0.12700119614601135, -0.06619787961244583, 0.17341409623622894, -0.07871770113706589, 0.04481020197272301, -0.047349292784929276, 0.050332702696323395, -0.007268077693879604, -0.07756082713603973, 0.16585899889469147, -0.15564003586769104, 0.01809087023139, 0.19572502374649048, -0.018915493041276932, 0.07177707552909851, 0.021322092041373253, -0.0636206790804863, 0.23147478699684143, 0.3014698624610901, 0.008138049393892288, 0.1665448248386383, 0.3018903136253357, -0.07466315478086472, -0.2642788887023926, -0.05505012720823288, -0.2841376066207886, -0.05371501296758652, 0.10716094076633453, -0.22523896396160126, 0.06986407935619354, 0.14383509755134583, -0.06471995264291763, 0.30228954553604126, -0.21825523674488068, 0.012589273042976856, 0.15434536337852478, -0.08868814259767532, 0.5515313148498535, -0.1133413165807724, -0.17677772045135498, -0.008122089318931103, -0.08741296827793121, 0.10602109134197235, -0.0340677872300148, 0.06877441704273224, 0.013465235009789467, 0.04797380417585373, 0.048932258039712906, -0.03111894056200981, 0.22701001167297363, 0.008710170164704323, 0.09015397727489471, -0.07378865778446198, -0.18624304234981537, 0.11639340221881866, -0.04359482601284981, -0.08891059458255768, 0.0849778801202774, -0.05942516401410103, -0.11078983545303345, 0.04663389176130295, -0.07950539886951447, -0.024862350896000862, 0.08423490077257156, -0.04678233340382576, -0.042606171220541, -0.008054176345467567, -0.1618063747882843, 
-0.0002289071271661669, 0.31360217928886414, -0.07096036523580551, 0.16695955395698547, 0.03677211329340935, 0.00038613268407061696, -0.11027684062719345, 0.030288029462099075, -0.05203165486454964, -0.021576624363660812, 0.09578979015350342, -0.11096979677677155, 0.03204701095819473, 0.14160704612731934, -0.04864364117383957, 0.05846960097551346, 0.09256096184253693, -0.0849417969584465, 0.007583672646433115, 0.17753590643405914, -0.17537221312522888, -0.1273445188999176, -0.006135711446404457, -0.09862716495990753, 0.14055661857128143, 0.04394126310944557, 0.05191568285226822, 0.16669964790344238, 0.03967129811644554, -0.029474308714270592, -0.02817419543862343, -0.1153380498290062, -0.0201893113553524, 0.040153320878744125, 0.00045633706031367183, -0.08791285753250122, 0.2262638509273529, 0.06409153342247009, -0.1328488290309906, -0.051157206296920776, 0.2161225974559784, -0.06805316358804703, -0.04911920800805092, -0.223562553524971, 0.10752306133508682, -0.07112517952919006, -0.0965060144662857, 0.05453834682703018, -0.02270081453025341, 0.005106312222778797, 0.181985542178154, 0.03941008821129799, 0.11070270836353302, 0.03738937899470329, -0.02448922023177147, 0.15798696875572205, -0.142850860953331, -0.14191335439682007, -0.025354057550430298, -0.08757315576076508, -0.13844476640224457, -0.026804137974977493, 0.1617041826248169, -0.09177309274673462, -0.14772607386112213, -0.2621181011199951, 0.10968475043773651, -0.16432365775108337, -0.10192688554525375, -0.03469514101743698, -0.08968492597341537, 0.0696166530251503, 0.030301768332719803, -0.03093348816037178, -0.06706760823726654, -0.18593791127204895, 0.0816768929362297, 0.06349513679742813, 0.045533183962106705, -0.017847947776317596, 0.0067379772663116455, 0.1720137596130371, 0.025955144315958023, 0.10040043294429779, 0.16762186586856842, 0.011397695168852806, 0.2246655523777008, -0.1671202927827835, -0.11496317386627197, 0.1336962729692459, -0.026543032377958298, 0.06762003898620605, 0.16792191565036774, -0.0772583931684494, 0.015526676550507545, -0.028136352077126503, 0.07066910713911057, -0.11003983020782471, -0.105624258518219, 0.007937257178127766, 0.02567129209637642, -0.2755882740020752, -0.005599735304713249, -0.19717298448085785, 0.14788752794265747, 0.02579621411859989, 0.03297143429517746, 0.10257530212402344, 0.10404334217309952, 0.08312062919139862, -0.0017710148822516203, 0.03226327523589134, -0.1176818460226059, 0.02753005363047123, -0.059239376336336136, -0.020663779228925705, 0.017624232918024063, 0.36952024698257446, -0.03603357449173927, -0.046802736818790436, 0.003710439894348383, 0.1307835876941681, -0.02139742486178875, 0.017395347356796265, 0.13209912180900574, 0.12607666850090027, -0.08595693111419678, -0.1504845917224884, 0.04888554662466049, -0.04565655067563057, -0.02836887165904045, 0.1464131623506546, 0.05905961990356445, 0.1050296202301979, 0.0908031314611435, -0.014463032595813274, -0.00318976235575974, 0.012856799177825451, -0.15486004948616028, 0.06223496049642563, -0.010558074340224266, 0.012565906159579754, 0.017934376373887062, 0.15238402783870697, -0.005540105979889631, 0.07739730179309845, -0.09889880567789078, 0.004208535887300968, -0.13498884439468384, -0.07913459837436676, 0.03617347031831741, -0.13393273949623108, 0.04141177982091904, -0.01871878281235695, 0.029611799865961075, 0.30386561155319214, 0.02558239921927452, -0.020639164373278618, 0.12512871623039246, -0.1214587539434433, -0.12050267308950424, -0.001594188273884356, -0.029960084706544876, 0.0791488066315651, 
-0.02633434161543846, -0.0997740775346756, -0.1001306027173996, -0.15166029334068298, -0.09759195148944855, 0.05182836204767227, -0.04993441700935364, -0.059362251311540604, -0.17634081840515137, -0.05707859992980957, -0.05147340148687363, 0.14025864005088806, -0.12263951450586319, 0.15159130096435547, -0.014490418136119843, 0.004084470681846142, 0.04405883327126503, 0.1950942426919937, -0.03644494712352753, 0.08714226633310318, 0.0154351145029068, 0.1522706001996994, -0.05119588226079941, 0.14720745384693146, -0.10931728035211563, -0.04014137014746666, -0.06710435450077057, 0.21513493359088898, 0.25630924105644226, -0.06136954948306084, -0.008937356993556023, -0.012760217301547527, 0.058654606342315674, 0.1073930487036705, 0.16049085557460785, 0.002326392102986574, 0.2802925705909729, -0.03133585304021835, 0.04815128445625305, 0.02901598811149597, 0.013607407920062542, -0.06336209923028946, 0.03397751972079277, 0.07539387792348862, -0.035039983689785004, -0.1412304788827896, 0.15837742388248444, -0.21980468928813934, 0.18157227337360382, 0.11640069633722305, -0.19996967911720276, -0.013728445395827293, -0.04882071167230606, 0.1689416468143463, -0.0856364443898201, 0.1637246012687683, -0.0903693437576294, -0.2108195722103119, -0.2056000679731369, 0.03867346793413162, -0.34623071551322937, -0.254462867975235, 0.10422009229660034, 0.1488201916217804, 0.04015883058309555, -0.018507536500692368, -0.019967829808592796, -0.018367022275924683, 0.04877542704343796, -0.0067357709631323814, 0.06014643982052803, 0.031397558748722076, -0.02988368645310402, -0.24127542972564697, -0.029804671183228493, 0.023964406922459602, -0.07093082368373871, 0.07464958727359772, -0.06874357163906097, -0.022495782002806664, 0.08059766888618469, -0.03066304884850979, 0.03298592567443848, -0.035373736172914505, -0.16326889395713806, 0.027529051527380943, 0.03900543600320816, 0.036012712866067886, 0.00634160777553916, 0.0008072225609794259, -0.03455270454287529, 0.0644603744149208, -0.16716794669628143, -0.16015739738941193, 0.14140215516090393, -0.06745140254497528, 0.2779497504234314, -0.05812826007604599, -0.0809100940823555, 0.04766704887151718, -0.03426874056458473, 0.1807648241519928, -0.07756473124027252, 0.047254521399736404, 0.12766779959201813, 0.011127962730824947, 0.03121316432952881, -0.3092964291572571, 0.11082969605922699, -0.000795336440205574, -0.006093299947679043, -0.07581598311662674 ]
null
null
transformers
## An MT5ForConditionalGeneration trained on 3 tasks from the PAN Profiling Hate Speech Spreaders on Twitter dataset (ES):
* topic attribution - topics were assigned with the BERTopic library using embeddings from the `Hate-speech-CNERG/dehatebert-mono-spanish` BERT model (train and test sets from the PAN task)
* hate speech identification (train set from the PAN task)

To generate the tone of a comment, use the prefix **hater classification:**
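A minimal inference sketch for the hate-speech task (the example tweet and generation settings are illustrative; only the `hater classification:` prefix comes from the description above):

from transformers import MT5ForConditionalGeneration, AutoTokenizer

# Load the multi-task mT5 checkpoint
model_name = "PaulAdversarial/PAN_twitter_hate_speech_2021_ES_MT5"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = MT5ForConditionalGeneration.from_pretrained(model_name)

# Prepend the task prefix described above to a (Spanish) tweet
text = "hater classification: Este es un tuit de ejemplo."
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=16)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))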
{}
text2text-generation
PaulAdversarial/PAN_twitter_hate_speech_2021_ES_MT5
[ "transformers", "pytorch", "mt5", "text2text-generation", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #mt5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
## An MT5ForConditionalGeneration trained on 3 tasks from the PAN Profiling Hate Speech Spreaders on Twitter dataset (ES): * topic attribution - topics were assigned with the BERTopic library using embeddings from the 'Hate-speech-CNERG/dehatebert-mono-spanish' BERT model (train and test sets from the PAN task) * hate speech identification (train set from the PAN task) To generate the tone of a comment, use the prefix hater classification:
[]
[ "TAGS\n#transformers #pytorch #mt5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 49 ]
[ "passage: TAGS\n#transformers #pytorch #mt5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.03247205540537834, 0.008802899159491062, -0.006354454439133406, 0.010667198337614536, 0.17799293994903564, 0.015382261015474796, 0.11927555501461029, 0.12881627678871155, 0.005648191086947918, -0.017856856808066368, 0.15311330556869507, 0.2192111611366272, -0.01395078282803297, 0.09104213118553162, -0.10423795878887177, -0.24964120984077454, 0.04000631347298622, 0.05688021332025528, 0.02192343771457672, 0.12785422801971436, 0.0683683529496193, -0.07686739414930344, 0.09180748462677002, -0.03782757371664047, -0.1629144847393036, 0.04483022168278694, 0.05165455862879753, -0.12006033957004547, 0.10880202054977417, 0.040760889649391174, 0.10833796113729477, 0.029112650081515312, -0.058540575206279755, -0.14441463351249695, 0.025394633412361145, 0.024162614718079567, -0.05531970411539078, 0.06695004552602768, 0.12746109068393707, -0.1069447323679924, 0.06599201261997223, 0.06857258081436157, -0.01249475497752428, 0.06093573942780495, -0.14540249109268188, 0.026302598416805267, -0.033595893532037735, 0.000501884613186121, 0.09577777981758118, 0.09955472499132156, -0.0014538326067849994, 0.1336830109357834, -0.1190258339047432, 0.11686736345291138, 0.15553653240203857, -0.29510700702667236, 0.010198472999036312, 0.08669993281364441, 0.03853175416588783, 0.07914209365844727, -0.01894368603825569, 0.04763365536928177, 0.031193047761917114, 0.023923354223370552, 0.01714995875954628, -0.07414375990629196, -0.10484828799962997, 0.039966754615306854, -0.09946644306182861, -0.06046971678733826, 0.23978057503700256, -0.06425098329782486, 0.07130776345729828, -0.01474788412451744, -0.11349231004714966, -0.06468478590250015, -0.0019674503710120916, 0.008873797953128815, -0.058908045291900635, 0.07076217234134674, 0.010395191609859467, -0.040165528655052185, -0.11989294737577438, -0.01339038833975792, -0.1835329830646515, 0.11016619950532913, -0.008551111444830894, 0.05852426216006279, -0.21846306324005127, 0.09113515168428421, 0.013130535371601582, -0.10046662390232086, 0.05681027099490166, -0.09862425923347473, 0.03388199210166931, -0.023057246580719948, -0.06931141763925552, -0.1132219061255455, 0.05456939712166786, 0.09576255828142166, 0.04512261971831322, 0.038082338869571686, -0.07074864208698273, 0.09369611740112305, 0.024160603061318398, 0.06751588732004166, 0.007608450949192047, -0.07217174768447876, 0.04148251190781593, -0.11202318966388702, -0.0015243865782395005, -0.0692119225859642, -0.15127044916152954, -0.05819070711731911, 0.05758891627192497, 0.10242556035518646, 0.0243549607694149, 0.07375647127628326, -0.0398036390542984, -0.03976113721728325, 0.033358264714479446, -0.07932782918214798, -0.004407680127769709, 0.00710319634526968, -0.004596755839884281, 0.16020210087299347, 0.009324765764176846, 0.018765978515148163, -0.13253585994243622, 0.09554686397314072, -0.07744433730840683, 0.014230625703930855, -0.0509052649140358, -0.060733065009117126, 0.03315047547221184, -0.09138786792755127, 0.011818391270935535, -0.15415462851524353, -0.17493492364883423, 0.005892942193895578, 0.0050512561574578285, -0.01598230004310608, -0.031002094969153404, -0.043078236281871796, -0.03290363401174545, 0.05322432518005371, -0.07414176315069199, 0.036153875291347504, -0.04339534789323807, 0.08642607182264328, -0.04057873412966728, 0.06381313502788544, -0.10164246708154678, 0.08162252604961395, -0.11347855627536774, -0.0016376032726839185, -0.10024012625217438, 0.061632830649614334, 0.004888607654720545, 0.12682358920574188, -0.03581279516220093, -0.02464127726852894, -0.07926245778799057, 
0.03483397513628006, -0.011945287697017193, 0.21094636619091034, -0.08152677118778229, -0.11021365225315094, 0.1879364550113678, -0.07468575984239578, -0.1403008997440338, 0.08478160947561264, 0.005911586340516806, 0.03139251470565796, 0.08236317336559296, 0.1588445007801056, 0.048823945224285126, -0.007978598587214947, 0.08611413091421127, 0.09610417485237122, -0.09832625091075897, -0.13185659050941467, 0.013133253902196884, -0.019363515079021454, -0.13396760821342468, 0.04520009085536003, 0.10029850900173187, 0.06343046575784683, -0.0690971314907074, -0.035642366856336594, -0.0428784117102623, -0.011879811063408852, 0.07858917117118835, 0.0011438956717029214, 0.13090534508228302, -0.05680390074849129, 0.013213207945227623, 0.021459300071001053, -0.011428453028202057, -0.01332840695977211, 0.03819642961025238, -0.024233844131231308, 0.10694291442632675, -0.08024387806653976, 0.03428592160344124, -0.2044709473848343, -0.048934467136859894, -0.007900926284492016, 0.12092742323875427, -0.004612825810909271, 0.08112660050392151, 0.058829180896282196, -0.02011093869805336, -0.012360739521682262, -0.020902158692479134, 0.1545429825782776, -0.01283858623355627, -0.10395924001932144, -0.07652033865451813, 0.036482252180576324, -0.06376958638429642, -0.024659687653183937, -0.06699222326278687, 0.017639730125665665, 0.0051707434467971325, 0.12165486812591553, 0.0072322734631598, 0.05250490456819534, -0.024619469419121742, 0.03858618065714836, -0.0804455429315567, 0.021691903471946716, 0.10708007216453552, -0.009018800221383572, -0.054880354553461075, 0.20170986652374268, -0.18809448182582855, 0.23027707636356354, 0.19894523918628693, -0.29437363147735596, 0.036204781383275986, -0.0962168350815773, -0.020468637347221375, -0.002135684248059988, 0.04397270828485489, -0.023481864482164383, 0.0777517557144165, 0.009088986553251743, 0.1900269091129303, -0.05207496136426926, -0.058402013033628464, -0.0053734490647912025, -0.05356953293085098, -0.029290076345205307, 0.07178544998168945, 0.12434651702642441, -0.17011547088623047, 0.17978157103061676, 0.23428833484649658, 0.01473091822117567, 0.1620381623506546, 0.011644197627902031, -0.04454120993614197, 0.062099479138851166, -0.024961771443486214, -0.03766492009162903, -0.09724601358175278, -0.18492409586906433, -0.027101896703243256, 0.07674476504325867, 0.05902737006545067, 0.1182398870587349, -0.10352705419063568, -0.028240500018000603, -0.008812467567622662, 0.016684215515851974, -0.009518086910247803, 0.0760439783334732, 0.07472711056470871, 0.12874123454093933, -0.019543487578630447, -0.000046879617002559826, 0.1103324145078659, 0.0010127548594027758, -0.12079042196273804, 0.186933696269989, -0.14518344402313232, -0.33166664838790894, -0.18088701367378235, -0.19747640192508698, -0.06222398951649666, 0.050322093069553375, 0.09170646220445633, -0.10376584529876709, -0.024963803589344025, 0.0044587342999875546, 0.09943562000989914, -0.09480268508195877, 0.031279709190130234, -0.05548911541700363, 0.06716878712177277, -0.06619614362716675, -0.07650262862443924, -0.04839160293340683, -0.019261548295617104, -0.04822634160518646, 0.1396450400352478, -0.11354948580265045, 0.048589661717414856, 0.19766056537628174, 0.021576598286628723, 0.05424954742193222, -0.01823168620467186, 0.17412340641021729, -0.06385006755590439, -0.009741626679897308, 0.23295654356479645, -0.05166426673531532, 0.08091825246810913, 0.12770976126194, -0.0020878969226032495, -0.07671891152858734, 0.03864402696490288, -0.03631149232387543, -0.08980801701545715, -0.2765643894672394, 
-0.13049930334091187, -0.12615323066711426, 0.07300391048192978, 0.06304160505533218, 0.043250467628240585, 0.1603052318096161, 0.07496021687984467, -0.011053545400500298, 0.024150917306542397, -0.0006156670860946178, 0.07454051077365875, 0.17188118398189545, -0.016544366255402565, 0.14714473485946655, -0.05331993103027344, -0.1080305203795433, 0.07841064035892487, 0.06226341798901558, 0.13590948283672333, 0.06920741498470306, 0.021969614550471306, 0.015656007453799248, 0.07521288096904755, 0.16085536777973175, 0.15746264159679413, 0.041117534041404724, -0.015303169377148151, -0.016506079584360123, -0.022872988134622574, -0.06463537365198135, 0.0432131253182888, 0.05577348172664642, -0.11483434587717056, -0.08495260775089264, -0.0724823921918869, 0.07664936780929565, 0.10974811017513275, 0.06682390719652176, -0.23741473257541656, 0.016966620460152626, 0.0871424600481987, -0.0434100441634655, -0.09427584707736969, 0.08946286141872406, 0.0038104455452412367, -0.1368124932050705, 0.05448998883366585, -0.05135731026530266, 0.13966992497444153, -0.04810076206922531, 0.0900053158402443, -0.05872584879398346, -0.06323063373565674, 0.020664479583501816, 0.10692714154720306, -0.3541460931301117, 0.20758792757987976, 0.011527951806783676, -0.06093791872262955, -0.11825107038021088, -0.0057329474948346615, 0.01827836222946644, 0.11354909837245941, 0.07954831421375275, 0.005323044024407864, -0.06418422609567642, -0.10157541930675507, -0.034152161329984665, 0.004723701626062393, 0.13940384984016418, -0.02400369942188263, 0.003448218572884798, -0.04438253492116928, -0.020158851519227028, -0.023178286850452423, -0.000718157272785902, -0.006451705005019903, -0.1613864004611969, 0.0666336938738823, 0.025228632614016533, 0.06822126358747482, 0.01730911061167717, -0.02732805535197258, -0.07030908018350601, 0.20019540190696716, -0.055100217461586, -0.0734892338514328, -0.13220442831516266, -0.0626208633184433, 0.06701112538576126, -0.07306446135044098, 0.04298105835914612, -0.06441843509674072, 0.0504937544465065, -0.06429126858711243, -0.23106469213962555, 0.12129522860050201, -0.10896743834018707, -0.04214934632182121, -0.05486045777797699, 0.19954514503479004, -0.07281329482793808, 0.01817229390144348, 0.017324624583125114, 0.013015411794185638, -0.09135245531797409, -0.07786673307418823, 0.014232970774173737, 0.01847606711089611, 0.058559149503707886, 0.05814133957028389, -0.08037004619836807, -0.014857451431453228, -0.03221143037080765, -0.009274402633309364, 0.32020795345306396, 0.13638851046562195, -0.04181653633713722, 0.1722007840871811, 0.14273306727409363, -0.09682104736566544, -0.32250353693962097, -0.0542713925242424, -0.09283889830112457, -0.02921893633902073, -0.04146459698677063, -0.16693620383739471, 0.09226667881011963, 0.025867247954010963, -0.0081669632345438, 0.11432857066392899, -0.26718688011169434, -0.08819488435983658, 0.13733826577663422, 0.011434170417487621, 0.36291927099227905, -0.10883680731058121, -0.11266031861305237, -0.07516881823539734, -0.14414578676223755, 0.14854134619235992, -0.046829547733068466, 0.09237111359834671, -0.03225376084446907, 0.1023406982421875, 0.04431767761707306, -0.05918179079890251, 0.0826093852519989, 0.03412443771958351, 0.011541778221726418, -0.09786099195480347, -0.024788152426481247, 0.043923161923885345, -0.01647278480231762, 0.0186665840446949, -0.018545938655734062, 0.02520407736301422, -0.1436202973127365, -0.030395017936825752, -0.08319204300642014, 0.054647669196128845, 0.024612871930003166, -0.06083676591515541, 0.035113632678985596, 
-0.07397028058767319, 0.015755590051412582, 0.0031852610409259796, 0.1968139111995697, -0.04927601292729378, 0.16436325013637543, 0.17140397429466248, 0.1386384516954422, -0.14085504412651062, 0.0313987135887146, -0.06756629794836044, -0.06380796432495117, 0.07617787271738052, -0.09445741772651672, 0.07254795730113983, 0.12724515795707703, -0.04090844467282295, 0.06647319346666336, 0.11278997361660004, 0.02755293995141983, -0.01717589795589447, 0.12903688848018646, -0.24685613811016083, 0.05759644880890846, -0.0774354338645935, 0.007829134352505207, 0.06760475039482117, 0.0691077932715416, 0.17701676487922668, 0.022113438695669174, -0.03186488151550293, -0.018044183030724525, 0.014400118961930275, -0.05583859980106354, 0.10128172487020493, 0.01910126954317093, 0.03747108206152916, -0.14913128316402435, 0.10593283176422119, 0.014065378345549107, -0.15610112249851227, -0.0030291201546788216, 0.18251965939998627, -0.1271468847990036, -0.1175045371055603, -0.005189466755837202, 0.11735180765390396, -0.1323971152305603, -0.027732165530323982, -0.056806646287441254, -0.13107746839523315, 0.1038786843419075, 0.1834852397441864, 0.06416597962379456, 0.0826336219906807, -0.05614696815609932, -0.05633258447051048, -0.05537234991788864, -0.012746404856443405, -0.004845258314162493, 0.02958609163761139, -0.0928758755326271, 0.08036726713180542, -0.03491966053843498, 0.14262840151786804, -0.0898408517241478, -0.06918130815029144, -0.15198032557964325, 0.042447883635759354, -0.1359063982963562, -0.06605049222707748, -0.08334993571043015, -0.06526677310466766, -0.018778005614876747, -0.01698305644094944, -0.06606939435005188, -0.04105973616242409, -0.1284734159708023, 0.024907397106289864, -0.05333518236875534, 0.03229110315442085, -0.05507495999336243, -0.0035495858173817396, 0.07976923137903214, -0.049493879079818726, 0.1077856719493866, 0.1420586258172989, -0.09717884659767151, 0.11032459139823914, -0.11992289125919342, -0.11299397796392441, 0.11099837720394135, 0.027232574298977852, 0.05815976485610008, 0.06443625688552856, 0.022027408704161644, 0.0828590840101242, 0.02554231509566307, 0.042817723006010056, 0.02065151557326317, -0.12103132158517838, 0.030482446774840355, -0.04751116782426834, -0.14892028272151947, -0.06529612094163895, -0.054095201194286346, 0.061603277921676636, 0.011807980015873909, 0.12336400151252747, -0.04274550825357437, 0.12397313863039017, -0.07301304489374161, 0.01582331769168377, 0.004852397367358208, -0.1726864129304886, -0.047937918454408646, -0.07537665218114853, 0.04021551460027695, 0.0031554480083286762, 0.24552465975284576, 0.01341304462403059, 0.0232835803180933, 0.03332333266735077, 0.08759349584579468, -0.006232629995793104, 0.02662578783929348, 0.19984489679336548, 0.10153228044509888, -0.04841103032231331, -0.07740340381860733, 0.0841817706823349, 0.027718786150217056, 0.045756641775369644, 0.1580987125635147, 0.05288330838084221, 0.02733166329562664, 0.10682656615972519, -0.00838334672152996, 0.026530034840106964, -0.10955263674259186, -0.15393203496932983, -0.013403541408479214, 0.0689178928732872, -0.013366389088332653, 0.06380236148834229, 0.15174338221549988, -0.031019950285553932, 0.035003937780857086, -0.02637603133916855, -0.04663315415382385, -0.18330888450145721, -0.13960829377174377, -0.08663944154977798, -0.11958782374858856, -0.0004337868595030159, -0.10688566416501999, 0.054552797228097916, 0.09666495025157928, 0.06784127652645111, -0.05839335545897484, 0.0752289667725563, 0.03856367990374565, -0.09408316761255264, 0.060062143951654434, 
-0.04271860048174858, 0.07346413284540176, -0.01862303353846073, -0.01636376976966858, -0.08663247525691986, -0.015964046120643616, -0.017242593690752983, 0.047769226133823395, -0.0590713806450367, 0.013490854762494564, -0.13409970700740814, -0.11919060349464417, -0.03309672325849533, 0.045155834406614304, -0.04299371317028999, 0.15545444190502167, -0.0017370387213304639, -0.022618813440203667, 0.025549639016389847, 0.21336813271045685, -0.09325596690177917, -0.05712331831455231, -0.04163253679871559, 0.23262251913547516, 0.060663092881441116, 0.10347004979848862, -0.01984133943915367, 0.010419715195894241, -0.0872872993350029, 0.3477313220500946, 0.2864106297492981, -0.0785008892416954, 0.01852458342909813, 0.025229377672076225, 0.032489433884620667, 0.11828687787055969, 0.14369255304336548, 0.0973535031080246, 0.23375727236270905, -0.07746781408786774, -0.008508224971592426, -0.026193976402282715, -0.022704333066940308, -0.07821500301361084, 0.13037200272083282, 0.022821061313152313, -0.07975984364748001, -0.02735958993434906, 0.09264971315860748, -0.23305658996105194, 0.15770022571086884, -0.09615077078342438, -0.18416492640972137, -0.0657498687505722, -0.004348475951701403, 0.13509869575500488, 0.0012923850445076823, 0.0851651206612587, -0.00950760766863823, -0.09292657673358917, 0.03073572926223278, 0.018268031999468803, -0.21988633275032043, -0.02724556252360344, 0.04685826599597931, -0.07749628275632858, -0.018672892823815346, -0.004388496745377779, 0.0267090555280447, 0.07139899581670761, 0.06441280245780945, -0.048781443387269974, 0.03654508665204048, 0.003955124877393246, -0.0477292500436306, 0.023610156029462814, 0.07727914303541183, 0.013922389596700668, -0.05439634621143341, 0.04671293869614601, -0.1666029691696167, 0.039284031838178635, -0.049411799758672714, -0.033983949571847916, 0.0128423310816288, -0.011059774085879326, -0.03385746479034424, 0.06773266941308975, 0.08311868458986282, -0.008829239755868912, -0.0032988465391099453, -0.06938329339027405, -0.0213033314794302, -0.020847970619797707, -0.08694157749414444, -0.10070307552814484, -0.148762509226799, -0.09853886067867279, 0.0973188504576683, -0.0049340552650392056, -0.21099382638931274, 0.004006146918982267, -0.09866927564144135, 0.02192481979727745, -0.20368202030658722, 0.08495088666677475, 0.06789152324199677, 0.01056761760264635, -0.0028865975327789783, -0.07395254075527191, 0.05188513174653053, 0.10077827423810959, -0.11843311786651611, -0.09581757336854935 ]
null
null
transformers
## A T5ForConditionalGeneration trained on 3 tasks from the PAN Profiling Hate Speech Spreaders on Twitter dataset (EN):

* author attribution (train and test sets from the PAN task)
* topic attribution - topics were assigned with the BERTopic library using embeddings from the `cardiffnlp/bertweet-base-hate` RoBERTa model (train and test sets from the PAN task)
* hate speech identification (train set from the PAN task)

To classify the tone of a comment, use the prefix **hater classification:**
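For illustration, a minimal inference sketch (not part of the original card), assuming the checkpoint above ships its tokenizer and follows standard `transformers` seq2seq usage; the sample tweet, the generation length, and the decoded label format are assumptions:

```python
# Hedged sketch: query the checkpoint with the documented "hater classification:"
# prefix. The example tweet and max_length are illustrative; the exact label
# strings the model emits are not documented in the card.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "PaulAdversarial/T5_PAN_Hate_Speech_Twitter_topic_author_ishatespeach"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

text = "hater classification: I can't stand these people, they ruin everything"
inputs = tokenizer(text, return_tensors="pt", truncation=True)
output_ids = model.generate(**inputs, max_length=16)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```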
{}
text2text-generation
PaulAdversarial/T5_PAN_Hate_Speech_Twitter_topic_author_ishatespeach
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
## A T5ForConditionalGeneration trained on 3 tasks from the PAN Profiling Hate Speech Spreaders on Twitter dataset (EN): * author attribution (train and test sets from the PAN task) * topic attribution - topics were assigned with the BERTopic library using embeddings from 'cardiffnlp/bertweet-base-hate' RoBERTa model (train and test sets from the PAN task) * hate speech identification (train set from the PAN task). To classify the tone of a comment, use the prefix hater classification:
[]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.00556661281734705, 0.0164349228143692, -0.007315334863960743, 0.024348841980099678, 0.166501984000206, 0.024344393983483315, 0.11518356949090958, 0.1412411630153656, -0.0020334594883024693, -0.035958148539066315, 0.1319235861301422, 0.21714316308498383, -0.006268311757594347, 0.08315006643533707, -0.08710680902004242, -0.2602083683013916, 0.03483840078115463, 0.05276213958859444, 0.0049579935148358345, 0.12762698531150818, 0.08691143244504929, -0.0646481066942215, 0.09440414607524872, -0.03804538771510124, -0.17094822227954865, 0.05289003252983093, 0.06358526647090912, -0.12921307981014252, 0.11203359812498093, 0.04470131918787956, 0.10704579949378967, 0.035496506839990616, -0.04978400841355324, -0.14172321557998657, 0.0296106468886137, 0.026950722560286522, -0.06861023604869843, 0.06236075982451439, 0.1136597990989685, -0.09615244716405869, 0.08870459347963333, 0.0556039921939373, -0.004051631316542625, 0.062145017087459564, -0.15644147992134094, -0.029072636738419533, -0.01713588647544384, 0.023426201194524765, 0.07808161526918411, 0.09537342935800552, -0.008744281716644764, 0.1289428174495697, -0.09439463913440704, 0.13085374236106873, 0.15587438642978668, -0.3147599399089813, 0.001229330780915916, 0.04856342077255249, 0.05912892520427704, 0.08336716890335083, -0.016771415248513222, 0.03740086406469345, 0.02475069649517536, 0.028233950957655907, 0.042602118104696274, -0.08473630249500275, -0.1701638251543045, 0.042069677263498306, -0.0844234824180603, -0.05861080437898636, 0.24913406372070312, -0.05834462493658066, 0.06408444046974182, -0.01192506030201912, -0.11543367058038712, -0.06456129997968674, -0.011278037913143635, -0.012939955107867718, -0.04709155857563019, 0.06321226805448532, 0.02679656445980072, -0.05803138390183449, -0.13286004960536957, -0.009396729059517384, -0.17277660965919495, 0.11429595947265625, 0.011107098311185837, 0.055417537689208984, -0.23684222996234894, 0.0894838273525238, 0.045466348528862, -0.10835321247577667, 0.06076562777161598, -0.09032987058162689, 0.01951918564736843, -0.019337479025125504, -0.05594807490706444, -0.14819347858428955, 0.06912460923194885, 0.09622079879045486, 0.022458840161561966, 0.029469674453139305, -0.06034425273537636, 0.08020190894603729, 0.030660733580589294, 0.07118292897939682, 0.002764588687568903, -0.035829924046993256, 0.05543043091893196, -0.11155472695827484, -0.008295934647321701, -0.06744275242090225, -0.14817972481250763, -0.05471540614962578, 0.08553145080804825, 0.09807302802801132, 0.029591085389256477, 0.0872388631105423, -0.044323358684778214, -0.042564451694488525, 0.010892827063798904, -0.08642291277647018, -0.011313864029943943, -0.000025251778424717486, 0.017184332013130188, 0.13600346446037292, 0.016563422977924347, 0.014259100891649723, -0.15025922656059265, 0.051360122859478, -0.07618893682956696, -0.003882812336087227, -0.03447994589805603, -0.07451999187469482, 0.026431044563651085, -0.0926901325583458, 0.016483480110764503, -0.1643074005842209, -0.153541699051857, 0.019637132063508034, 0.019818242639303207, -0.01814674586057663, -0.05446401610970497, -0.04473382234573364, -0.03356395289301872, 0.050795428454875946, -0.06109105795621872, 0.007418809924274683, -0.04204783961176872, 0.09961801022291183, -0.03490151837468147, 0.0686439722776413, -0.10930459946393967, 0.07527212053537369, -0.1283227801322937, -0.024262387305498123, -0.07863514870405197, 0.05983925983309746, 0.0200795941054821, 0.13076506555080414, -0.03833391144871712, -0.023659348487854004, -0.060193415731191635, 0.03925630450248718, 
-0.0268486887216568, 0.19988413155078888, -0.1042189672589302, -0.10171450674533844, 0.2354324758052826, -0.06842701882123947, -0.1713089793920517, 0.09375571459531784, 0.0027509552892297506, 0.05962032452225685, 0.08849970251321793, 0.1733561009168625, 0.0375605970621109, -0.020121192559599876, 0.10614369064569473, 0.09608419239521027, -0.10053163021802902, -0.0855114683508873, 0.012432892806828022, -0.017932431772351265, -0.12291114777326584, 0.042022403329610825, 0.09157063812017441, 0.07639165967702866, -0.048442572355270386, -0.037454720586538315, -0.04233349487185478, -0.0037850758526474237, 0.07970551401376724, 0.003191957715898752, 0.12530799210071564, -0.060514260083436966, -0.017125345766544342, -0.00970730371773243, -0.021197473630309105, -0.02157456800341606, 0.0427994504570961, -0.029060563072562218, 0.11202826350927353, -0.044375836849212646, 0.04996176064014435, -0.19708573818206787, -0.07950518280267715, -0.00009459419379709288, 0.14823977649211884, 0.0025853270199149847, 0.08866212517023087, 0.053792256861925125, -0.03104301728308201, -0.0064215571619570255, -0.01630372181534767, 0.13401302695274353, -0.006087199319154024, -0.07446936517953873, -0.07821033149957657, 0.04426480084657669, -0.06450248509645462, -0.025050701573491096, -0.07281817495822906, 0.016727562993764877, 0.009908524341881275, 0.11566682159900665, 0.031677234917879105, 0.05078164488077164, -0.01658935472369194, 0.016670551151037216, -0.08533018827438354, 0.014296403154730797, 0.09961540251970291, -0.01141710951924324, -0.05103765428066254, 0.20229773223400116, -0.17161768674850464, 0.23491214215755463, 0.18393823504447937, -0.28944259881973267, 0.006632600445300341, -0.03537070378661156, -0.025985898450016975, 0.0020077917724847794, 0.056325092911720276, -0.024217087775468826, 0.08345092087984085, -0.0014518649550154805, 0.19566179811954498, -0.061056189239025116, -0.053328365087509155, 0.005299531389027834, -0.0574905127286911, -0.008797192946076393, 0.06659712642431259, 0.08905737847089767, -0.18937712907791138, 0.16803357005119324, 0.23161664605140686, 0.02284090220928192, 0.16994214057922363, -0.006134420167654753, -0.04078718274831772, 0.06990283727645874, -0.010529168881475925, -0.03422388434410095, -0.09309064596891403, -0.17318937182426453, -0.02589605376124382, 0.07576539367437363, 0.03613846004009247, 0.08636131882667542, -0.10028368979692459, -0.03196824714541435, -0.0030699449125677347, 0.0036774331238120794, -0.0154800433665514, 0.09314090013504028, 0.07135064154863358, 0.12889064848423004, -0.02095034345984459, -0.013992778956890106, 0.11097750067710876, 0.009291634894907475, -0.11979115754365921, 0.1863044798374176, -0.1367644965648651, -0.33287522196769714, -0.14580926299095154, -0.172042116522789, -0.028890058398246765, 0.034660134464502335, 0.11098358780145645, -0.09576473385095596, -0.024381177499890327, -0.00200450187548995, 0.07208182662725449, -0.08791472017765045, 0.026362471282482147, -0.08504201471805573, 0.06435829401016235, -0.06155692785978317, -0.07484325021505356, -0.04913013055920601, -0.009192944504320621, -0.03951077535748482, 0.1387929618358612, -0.12161171436309814, 0.05527077242732048, 0.18643830716609955, 0.0024109859950840473, 0.055052563548088074, -0.03348764404654503, 0.1838768720626831, -0.06276846677064896, 0.011544623412191868, 0.21395781636238098, -0.06411220878362656, 0.0745253935456276, 0.1306767761707306, -0.0033746296539902687, -0.06828748434782028, 0.03352054953575134, -0.03297126665711403, -0.07394568622112274, -0.26617541909217834, -0.0919879674911499, 
-0.13333486020565033, 0.07579639554023743, 0.0671529769897461, 0.05104738473892212, 0.17074435949325562, 0.06376895308494568, -0.0021396984811872244, 0.04823816567659378, 0.02085288241505623, 0.08897827565670013, 0.17748302221298218, -0.0035200107377022505, 0.12296830117702484, -0.05708124861121178, -0.11661257594823837, 0.07658019661903381, 0.06053204461932182, 0.11748291552066803, 0.06144772842526436, 0.0673404335975647, 0.009640947915613651, 0.09566781669855118, 0.13296754658222198, 0.14900599420070648, 0.02468167617917061, -0.008608612231910229, -0.030044887214899063, -0.03201429173350334, -0.039632637053728104, 0.03699196130037308, 0.027697665616869926, -0.11566830426454544, -0.09036512672901154, -0.08160438388586044, 0.06716005504131317, 0.13062477111816406, 0.07317515462636948, -0.22939440608024597, 0.021544968709349632, 0.07740186899900436, -0.04538768529891968, -0.1127815991640091, 0.08245086669921875, -0.011997534893453121, -0.1321650892496109, 0.05299573391675949, -0.059155527502298355, 0.12868814170360565, -0.03321857377886772, 0.09155638515949249, -0.036200251430273056, -0.07173743844032288, 0.023195916786789894, 0.1081700548529625, -0.3327305316925049, 0.19491779804229736, 0.004171358421444893, -0.06359637528657913, -0.10467778891324997, -0.0019751235377043486, 0.0009388330508954823, 0.11369986087083817, 0.10460136085748672, -0.001098173321224749, -0.044636908918619156, -0.07322456687688828, -0.004860122688114643, 0.020925045013427734, 0.1282682567834854, -0.03900206461548805, 0.012867438606917858, -0.060800474137067795, -0.020295564085245132, -0.018256759271025658, -0.0022203531116247177, -0.02081105299293995, -0.16639147698879242, 0.0740898847579956, 0.018359249457716942, 0.06382673978805542, 0.017137285321950912, -0.023163197562098503, -0.04883239418268204, 0.2122536450624466, -0.05754205211997032, -0.0916326716542244, -0.12432167679071426, -0.06284135580062866, 0.05895516648888588, -0.07722198963165283, 0.0446588434278965, -0.0745319053530693, 0.022527653723955154, -0.04819094017148018, -0.243226557970047, 0.1347477287054062, -0.09412279725074768, -0.03712499886751175, -0.05321687459945679, 0.1769246608018875, -0.08598630130290985, 0.004587682895362377, 0.019671428948640823, 0.005797455552965403, -0.09387195110321045, -0.05803963914513588, 0.005930939689278603, -0.007833045907318592, 0.043111652135849, 0.027385283261537552, -0.08851016312837601, -0.04925180599093437, -0.04387403652071953, 0.0025156724732369184, 0.32157042622566223, 0.13255396485328674, -0.0451158843934536, 0.17638471722602844, 0.11866987496614456, -0.09226285666227341, -0.28666549921035767, -0.09232616424560547, -0.08508557081222534, -0.02612167038023472, -0.014090736396610737, -0.17296333611011505, 0.07169285416603088, -0.011569414287805557, 0.001220228150486946, 0.11183511465787888, -0.25064438581466675, -0.08535761386156082, 0.13973328471183777, 0.0218928512185812, 0.3438052237033844, -0.11660566926002502, -0.09572115540504456, -0.037015512585639954, -0.16204795241355896, 0.17565034329891205, -0.05098377540707588, 0.08589936047792435, -0.02931063622236252, 0.09435304999351501, 0.05406441166996956, -0.035384826362133026, 0.049389392137527466, 0.004597121383994818, 0.008760917000472546, -0.10559062659740448, -0.057581186294555664, 0.06263675540685654, -0.015296169556677341, 0.03261565789580345, -0.052770473062992096, 0.040970027446746826, -0.12462522089481354, -0.029079878702759743, -0.09401141107082367, 0.04943455010652542, 0.022681253030896187, -0.06377900391817093, 0.03368838131427765, 
-0.07297802716493607, 0.02431732974946499, -0.004222060553729534, 0.22948065400123596, -0.03911164030432701, 0.16924336552619934, 0.14965012669563293, 0.12832023203372955, -0.10761105269193649, 0.023132294416427612, -0.07074703276157379, -0.06771499663591385, 0.07442385703325272, -0.11606685817241669, 0.06824298948049545, 0.11452312767505646, -0.03803333640098572, 0.06692792475223541, 0.11090198904275894, 0.007042061071842909, -0.018452122807502747, 0.13045786321163177, -0.2555803954601288, 0.01838049292564392, -0.09133486449718475, -0.034731827676296234, 0.04473074525594711, 0.059897102415561676, 0.1758844554424286, 0.013937903568148613, -0.046279918402433395, -0.006654669996351004, 0.009265775792300701, -0.05292464420199394, 0.06189149618148804, 0.021519040688872337, 0.028436847031116486, -0.12892818450927734, 0.09216773509979248, 0.04164006561040878, -0.15696687996387482, 0.014136283658444881, 0.20570343732833862, -0.12955474853515625, -0.11714471131563187, 0.009732695296406746, 0.1272372156381607, -0.12340531498193741, -0.0131310960277915, -0.06835518032312393, -0.12049886584281921, 0.08346058428287506, 0.1870250254869461, 0.05310768634080887, 0.08864229917526245, -0.04889966920018196, -0.052613768726587296, -0.0414779894053936, 0.019885722547769547, 0.010466893203556538, 0.024678871035575867, -0.10123980790376663, 0.0558118037879467, -0.03420061618089676, 0.1606583297252655, -0.08762852847576141, -0.06293909251689911, -0.15223504602909088, 0.029476208612322807, -0.1285882592201233, -0.05573326349258423, -0.06554782390594482, -0.04992866516113281, -0.00988073367625475, -0.016155900433659554, -0.04487239196896553, -0.039896611124277115, -0.12048979848623276, 0.01255242433398962, -0.0355960987508297, 0.040716752409935, -0.06325489282608032, -0.015957634896039963, 0.06713803857564926, -0.04783564805984497, 0.1234511286020279, 0.1244230642914772, -0.10297807306051254, 0.13015666604042053, -0.12901681661605835, -0.10247262567281723, 0.10341163724660873, 0.022168707102537155, 0.05857599526643753, 0.07140576094388962, 0.014355037361383438, 0.06138899549841881, 0.020014280453324318, 0.030626345425844193, 0.014497648924589157, -0.11934783309698105, 0.020539645105600357, -0.028154902160167694, -0.1567612886428833, -0.06687948107719421, -0.03541361168026924, 0.032845932990312576, 0.005566192790865898, 0.12214449048042297, -0.04963922128081322, 0.12121456861495972, -0.0739358589053154, 0.01540429051965475, 0.0030216770246624947, -0.1645183116197586, -0.07887569069862366, -0.08572901785373688, 0.028631536290049553, -0.01980959065258503, 0.1820257008075714, 0.028917625546455383, 0.04076537489891052, 0.028494594618678093, 0.0580499991774559, 0.0030040740966796875, 0.027813585475087166, 0.2076839804649353, 0.07256311923265457, -0.07106154412031174, -0.10604323446750641, 0.06415551900863647, 0.010025255382061005, 0.050124362111091614, 0.15928223729133606, 0.047759078443050385, -0.0001553031470393762, 0.0993572399020195, -0.015316850505769253, 0.029706554487347603, -0.08815840631723404, -0.14614541828632355, 0.010274967178702354, 0.07664129137992859, -0.008370366878807545, 0.08240756392478943, 0.17205661535263062, -0.008965100161731243, 0.028628967702388763, -0.021051079034805298, -0.050595544278621674, -0.17782315611839294, -0.14621272683143616, -0.08161406964063644, -0.10460387915372849, -0.009174822829663754, -0.10601648688316345, 0.058225058019161224, 0.039410240948200226, 0.06155374273657799, -0.06584067642688751, 0.07655779272317886, 0.09756571054458618, -0.11320552229881287, 0.07725733518600464, 
-0.030474407598376274, 0.06184573471546173, -0.002521720016375184, 0.005873269401490688, -0.08997280895709991, -0.004574684891849756, -0.03502881899476051, 0.04654679447412491, -0.055807504802942276, 0.02598988637328148, -0.14996908605098724, -0.1267300844192505, -0.023885613307356834, 0.05561595410108566, -0.048713523894548416, 0.11875221878290176, 0.01903730072081089, -0.02826787903904915, 0.03075653687119484, 0.22936390340328217, -0.08101353794336319, -0.06313319504261017, -0.045722268521785736, 0.2392023652791977, 0.05798419192433357, 0.08968336880207062, 0.0011541121639311314, -0.004275737330317497, -0.08654017746448517, 0.33573782444000244, 0.2875623404979706, -0.06669571250677109, 0.02185043878853321, 0.020036788657307625, 0.03247951716184616, 0.10445816069841385, 0.14738473296165466, 0.0807812362909317, 0.25269845128059387, -0.07112409919500351, 0.004070702008903027, -0.020794164389371872, -0.0035252978559583426, -0.094292551279068, 0.12061961740255356, 0.04871074855327606, -0.07605911046266556, -0.016266820952296257, 0.09616681933403015, -0.234622985124588, 0.1442924290895462, -0.09042277187108994, -0.16555511951446533, -0.0650661438703537, -0.022269470617175102, 0.12511363625526428, 0.0057012471370399, 0.08388835191726685, -0.013910903595387936, -0.09222894161939621, 0.06527531147003174, 0.026899857446551323, -0.2155075967311859, -0.001321874326094985, 0.0646858662366867, -0.11317645013332367, -0.013093317858874798, -0.01084262877702713, 0.04255429282784462, 0.06837044656276703, 0.06063440442085266, -0.052938252687454224, 0.02750498242676258, -0.0017553698271512985, -0.0005278441240079701, 0.02887236885726452, 0.05862223729491234, 0.019802596420049667, -0.08147921413183212, 0.05538501590490341, -0.14378812909126282, 0.031788941472768784, -0.04858637601137161, -0.02897854894399643, 0.0008240027818828821, 0.0006675400654785335, -0.032527755945920944, 0.054671213030815125, 0.09969543665647507, -0.01098863035440445, 0.0038473340682685375, -0.08210724592208862, -0.030329594388604164, 0.014884191565215588, -0.09322559833526611, -0.10513637214899063, -0.10709530115127563, -0.0982118621468544, 0.10922635346651077, -0.005953612271696329, -0.20968568325042725, 0.01261181477457285, -0.10233365744352341, 0.03503730893135071, -0.20695984363555908, 0.0975503996014595, 0.10488341748714447, 0.01344336662441492, 0.0057689351961016655, -0.03261208534240723, 0.051563702523708344, 0.1015261709690094, -0.12671351432800293, -0.08701961487531662 ]
null
null
transformers
A T5ForConditionalGeneration trained on 2 tasks from the PAN Profiling Hate Speech Spreaders on Twitter dataset (EN):

* topic attribution - topics were assigned with the BERTopic library using embeddings from the `cardiffnlp/bertweet-base-hate` RoBERTa model (train and test sets from the PAN task)
* hate speech identification (train set from the PAN task)

To classify the tone of a comment, use the prefix **hater classification:**
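As a hedged sketch, the same prefix can also be used through the high-level `pipeline` API; the checkpoint id and the `text2text-generation` task match this card, while the sample input and `max_length` are illustrative assumptions:

```python
# Hedged sketch: text2text-generation pipeline with the documented prefix.
# The example tweet and generation length are placeholders, not values from the card.
from transformers import pipeline

generator = pipeline(
    "text2text-generation",
    model="PaulAdversarial/T5_PAN_Hate_Speech_Twitter_topic_ishatespeach",
)

result = generator(
    "hater classification: you people ruin everything, just go away",
    max_length=16,
)
print(result[0]["generated_text"])
```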
{}
text2text-generation
PaulAdversarial/T5_PAN_Hate_Speech_Twitter_topic_ishatespeach
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
A T5ForConditionalGeneration trained on 2 tasks from the PAN Profiling Hate Speech Spreaders on Twitter dataset (EN): * topic attribution - topics were assigned with the BERTopic library using embeddings from 'cardiffnlp/bertweet-base-hate' RoBERTa model (train and test sets from the PAN task) * hate speech identification (train set from the PAN task). To classify the tone of a comment, use the prefix hater classification:
[]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.00556661281734705, 0.0164349228143692, -0.007315334863960743, 0.024348841980099678, 0.166501984000206, 0.024344393983483315, 0.11518356949090958, 0.1412411630153656, -0.0020334594883024693, -0.035958148539066315, 0.1319235861301422, 0.21714316308498383, -0.006268311757594347, 0.08315006643533707, -0.08710680902004242, -0.2602083683013916, 0.03483840078115463, 0.05276213958859444, 0.0049579935148358345, 0.12762698531150818, 0.08691143244504929, -0.0646481066942215, 0.09440414607524872, -0.03804538771510124, -0.17094822227954865, 0.05289003252983093, 0.06358526647090912, -0.12921307981014252, 0.11203359812498093, 0.04470131918787956, 0.10704579949378967, 0.035496506839990616, -0.04978400841355324, -0.14172321557998657, 0.0296106468886137, 0.026950722560286522, -0.06861023604869843, 0.06236075982451439, 0.1136597990989685, -0.09615244716405869, 0.08870459347963333, 0.0556039921939373, -0.004051631316542625, 0.062145017087459564, -0.15644147992134094, -0.029072636738419533, -0.01713588647544384, 0.023426201194524765, 0.07808161526918411, 0.09537342935800552, -0.008744281716644764, 0.1289428174495697, -0.09439463913440704, 0.13085374236106873, 0.15587438642978668, -0.3147599399089813, 0.001229330780915916, 0.04856342077255249, 0.05912892520427704, 0.08336716890335083, -0.016771415248513222, 0.03740086406469345, 0.02475069649517536, 0.028233950957655907, 0.042602118104696274, -0.08473630249500275, -0.1701638251543045, 0.042069677263498306, -0.0844234824180603, -0.05861080437898636, 0.24913406372070312, -0.05834462493658066, 0.06408444046974182, -0.01192506030201912, -0.11543367058038712, -0.06456129997968674, -0.011278037913143635, -0.012939955107867718, -0.04709155857563019, 0.06321226805448532, 0.02679656445980072, -0.05803138390183449, -0.13286004960536957, -0.009396729059517384, -0.17277660965919495, 0.11429595947265625, 0.011107098311185837, 0.055417537689208984, -0.23684222996234894, 0.0894838273525238, 0.045466348528862, -0.10835321247577667, 0.06076562777161598, -0.09032987058162689, 0.01951918564736843, -0.019337479025125504, -0.05594807490706444, -0.14819347858428955, 0.06912460923194885, 0.09622079879045486, 0.022458840161561966, 0.029469674453139305, -0.06034425273537636, 0.08020190894603729, 0.030660733580589294, 0.07118292897939682, 0.002764588687568903, -0.035829924046993256, 0.05543043091893196, -0.11155472695827484, -0.008295934647321701, -0.06744275242090225, -0.14817972481250763, -0.05471540614962578, 0.08553145080804825, 0.09807302802801132, 0.029591085389256477, 0.0872388631105423, -0.044323358684778214, -0.042564451694488525, 0.010892827063798904, -0.08642291277647018, -0.011313864029943943, -0.000025251778424717486, 0.017184332013130188, 0.13600346446037292, 0.016563422977924347, 0.014259100891649723, -0.15025922656059265, 0.051360122859478, -0.07618893682956696, -0.003882812336087227, -0.03447994589805603, -0.07451999187469482, 0.026431044563651085, -0.0926901325583458, 0.016483480110764503, -0.1643074005842209, -0.153541699051857, 0.019637132063508034, 0.019818242639303207, -0.01814674586057663, -0.05446401610970497, -0.04473382234573364, -0.03356395289301872, 0.050795428454875946, -0.06109105795621872, 0.007418809924274683, -0.04204783961176872, 0.09961801022291183, -0.03490151837468147, 0.0686439722776413, -0.10930459946393967, 0.07527212053537369, -0.1283227801322937, -0.024262387305498123, -0.07863514870405197, 0.05983925983309746, 0.0200795941054821, 0.13076506555080414, -0.03833391144871712, -0.023659348487854004, -0.060193415731191635, 0.03925630450248718, 
-0.0268486887216568, 0.19988413155078888, -0.1042189672589302, -0.10171450674533844, 0.2354324758052826, -0.06842701882123947, -0.1713089793920517, 0.09375571459531784, 0.0027509552892297506, 0.05962032452225685, 0.08849970251321793, 0.1733561009168625, 0.0375605970621109, -0.020121192559599876, 0.10614369064569473, 0.09608419239521027, -0.10053163021802902, -0.0855114683508873, 0.012432892806828022, -0.017932431772351265, -0.12291114777326584, 0.042022403329610825, 0.09157063812017441, 0.07639165967702866, -0.048442572355270386, -0.037454720586538315, -0.04233349487185478, -0.0037850758526474237, 0.07970551401376724, 0.003191957715898752, 0.12530799210071564, -0.060514260083436966, -0.017125345766544342, -0.00970730371773243, -0.021197473630309105, -0.02157456800341606, 0.0427994504570961, -0.029060563072562218, 0.11202826350927353, -0.044375836849212646, 0.04996176064014435, -0.19708573818206787, -0.07950518280267715, -0.00009459419379709288, 0.14823977649211884, 0.0025853270199149847, 0.08866212517023087, 0.053792256861925125, -0.03104301728308201, -0.0064215571619570255, -0.01630372181534767, 0.13401302695274353, -0.006087199319154024, -0.07446936517953873, -0.07821033149957657, 0.04426480084657669, -0.06450248509645462, -0.025050701573491096, -0.07281817495822906, 0.016727562993764877, 0.009908524341881275, 0.11566682159900665, 0.031677234917879105, 0.05078164488077164, -0.01658935472369194, 0.016670551151037216, -0.08533018827438354, 0.014296403154730797, 0.09961540251970291, -0.01141710951924324, -0.05103765428066254, 0.20229773223400116, -0.17161768674850464, 0.23491214215755463, 0.18393823504447937, -0.28944259881973267, 0.006632600445300341, -0.03537070378661156, -0.025985898450016975, 0.0020077917724847794, 0.056325092911720276, -0.024217087775468826, 0.08345092087984085, -0.0014518649550154805, 0.19566179811954498, -0.061056189239025116, -0.053328365087509155, 0.005299531389027834, -0.0574905127286911, -0.008797192946076393, 0.06659712642431259, 0.08905737847089767, -0.18937712907791138, 0.16803357005119324, 0.23161664605140686, 0.02284090220928192, 0.16994214057922363, -0.006134420167654753, -0.04078718274831772, 0.06990283727645874, -0.010529168881475925, -0.03422388434410095, -0.09309064596891403, -0.17318937182426453, -0.02589605376124382, 0.07576539367437363, 0.03613846004009247, 0.08636131882667542, -0.10028368979692459, -0.03196824714541435, -0.0030699449125677347, 0.0036774331238120794, -0.0154800433665514, 0.09314090013504028, 0.07135064154863358, 0.12889064848423004, -0.02095034345984459, -0.013992778956890106, 0.11097750067710876, 0.009291634894907475, -0.11979115754365921, 0.1863044798374176, -0.1367644965648651, -0.33287522196769714, -0.14580926299095154, -0.172042116522789, -0.028890058398246765, 0.034660134464502335, 0.11098358780145645, -0.09576473385095596, -0.024381177499890327, -0.00200450187548995, 0.07208182662725449, -0.08791472017765045, 0.026362471282482147, -0.08504201471805573, 0.06435829401016235, -0.06155692785978317, -0.07484325021505356, -0.04913013055920601, -0.009192944504320621, -0.03951077535748482, 0.1387929618358612, -0.12161171436309814, 0.05527077242732048, 0.18643830716609955, 0.0024109859950840473, 0.055052563548088074, -0.03348764404654503, 0.1838768720626831, -0.06276846677064896, 0.011544623412191868, 0.21395781636238098, -0.06411220878362656, 0.0745253935456276, 0.1306767761707306, -0.0033746296539902687, -0.06828748434782028, 0.03352054953575134, -0.03297126665711403, -0.07394568622112274, -0.26617541909217834, -0.0919879674911499, 
-0.13333486020565033, 0.07579639554023743, 0.0671529769897461, 0.05104738473892212, 0.17074435949325562, 0.06376895308494568, -0.0021396984811872244, 0.04823816567659378, 0.02085288241505623, 0.08897827565670013, 0.17748302221298218, -0.0035200107377022505, 0.12296830117702484, -0.05708124861121178, -0.11661257594823837, 0.07658019661903381, 0.06053204461932182, 0.11748291552066803, 0.06144772842526436, 0.0673404335975647, 0.009640947915613651, 0.09566781669855118, 0.13296754658222198, 0.14900599420070648, 0.02468167617917061, -0.008608612231910229, -0.030044887214899063, -0.03201429173350334, -0.039632637053728104, 0.03699196130037308, 0.027697665616869926, -0.11566830426454544, -0.09036512672901154, -0.08160438388586044, 0.06716005504131317, 0.13062477111816406, 0.07317515462636948, -0.22939440608024597, 0.021544968709349632, 0.07740186899900436, -0.04538768529891968, -0.1127815991640091, 0.08245086669921875, -0.011997534893453121, -0.1321650892496109, 0.05299573391675949, -0.059155527502298355, 0.12868814170360565, -0.03321857377886772, 0.09155638515949249, -0.036200251430273056, -0.07173743844032288, 0.023195916786789894, 0.1081700548529625, -0.3327305316925049, 0.19491779804229736, 0.004171358421444893, -0.06359637528657913, -0.10467778891324997, -0.0019751235377043486, 0.0009388330508954823, 0.11369986087083817, 0.10460136085748672, -0.001098173321224749, -0.044636908918619156, -0.07322456687688828, -0.004860122688114643, 0.020925045013427734, 0.1282682567834854, -0.03900206461548805, 0.012867438606917858, -0.060800474137067795, -0.020295564085245132, -0.018256759271025658, -0.0022203531116247177, -0.02081105299293995, -0.16639147698879242, 0.0740898847579956, 0.018359249457716942, 0.06382673978805542, 0.017137285321950912, -0.023163197562098503, -0.04883239418268204, 0.2122536450624466, -0.05754205211997032, -0.0916326716542244, -0.12432167679071426, -0.06284135580062866, 0.05895516648888588, -0.07722198963165283, 0.0446588434278965, -0.0745319053530693, 0.022527653723955154, -0.04819094017148018, -0.243226557970047, 0.1347477287054062, -0.09412279725074768, -0.03712499886751175, -0.05321687459945679, 0.1769246608018875, -0.08598630130290985, 0.004587682895362377, 0.019671428948640823, 0.005797455552965403, -0.09387195110321045, -0.05803963914513588, 0.005930939689278603, -0.007833045907318592, 0.043111652135849, 0.027385283261537552, -0.08851016312837601, -0.04925180599093437, -0.04387403652071953, 0.0025156724732369184, 0.32157042622566223, 0.13255396485328674, -0.0451158843934536, 0.17638471722602844, 0.11866987496614456, -0.09226285666227341, -0.28666549921035767, -0.09232616424560547, -0.08508557081222534, -0.02612167038023472, -0.014090736396610737, -0.17296333611011505, 0.07169285416603088, -0.011569414287805557, 0.001220228150486946, 0.11183511465787888, -0.25064438581466675, -0.08535761386156082, 0.13973328471183777, 0.0218928512185812, 0.3438052237033844, -0.11660566926002502, -0.09572115540504456, -0.037015512585639954, -0.16204795241355896, 0.17565034329891205, -0.05098377540707588, 0.08589936047792435, -0.02931063622236252, 0.09435304999351501, 0.05406441166996956, -0.035384826362133026, 0.049389392137527466, 0.004597121383994818, 0.008760917000472546, -0.10559062659740448, -0.057581186294555664, 0.06263675540685654, -0.015296169556677341, 0.03261565789580345, -0.052770473062992096, 0.040970027446746826, -0.12462522089481354, -0.029079878702759743, -0.09401141107082367, 0.04943455010652542, 0.022681253030896187, -0.06377900391817093, 0.03368838131427765, 
-0.07297802716493607, 0.02431732974946499, -0.004222060553729534, 0.22948065400123596, -0.03911164030432701, 0.16924336552619934, 0.14965012669563293, 0.12832023203372955, -0.10761105269193649, 0.023132294416427612, -0.07074703276157379, -0.06771499663591385, 0.07442385703325272, -0.11606685817241669, 0.06824298948049545, 0.11452312767505646, -0.03803333640098572, 0.06692792475223541, 0.11090198904275894, 0.007042061071842909, -0.018452122807502747, 0.13045786321163177, -0.2555803954601288, 0.01838049292564392, -0.09133486449718475, -0.034731827676296234, 0.04473074525594711, 0.059897102415561676, 0.1758844554424286, 0.013937903568148613, -0.046279918402433395, -0.006654669996351004, 0.009265775792300701, -0.05292464420199394, 0.06189149618148804, 0.021519040688872337, 0.028436847031116486, -0.12892818450927734, 0.09216773509979248, 0.04164006561040878, -0.15696687996387482, 0.014136283658444881, 0.20570343732833862, -0.12955474853515625, -0.11714471131563187, 0.009732695296406746, 0.1272372156381607, -0.12340531498193741, -0.0131310960277915, -0.06835518032312393, -0.12049886584281921, 0.08346058428287506, 0.1870250254869461, 0.05310768634080887, 0.08864229917526245, -0.04889966920018196, -0.052613768726587296, -0.0414779894053936, 0.019885722547769547, 0.010466893203556538, 0.024678871035575867, -0.10123980790376663, 0.0558118037879467, -0.03420061618089676, 0.1606583297252655, -0.08762852847576141, -0.06293909251689911, -0.15223504602909088, 0.029476208612322807, -0.1285882592201233, -0.05573326349258423, -0.06554782390594482, -0.04992866516113281, -0.00988073367625475, -0.016155900433659554, -0.04487239196896553, -0.039896611124277115, -0.12048979848623276, 0.01255242433398962, -0.0355960987508297, 0.040716752409935, -0.06325489282608032, -0.015957634896039963, 0.06713803857564926, -0.04783564805984497, 0.1234511286020279, 0.1244230642914772, -0.10297807306051254, 0.13015666604042053, -0.12901681661605835, -0.10247262567281723, 0.10341163724660873, 0.022168707102537155, 0.05857599526643753, 0.07140576094388962, 0.014355037361383438, 0.06138899549841881, 0.020014280453324318, 0.030626345425844193, 0.014497648924589157, -0.11934783309698105, 0.020539645105600357, -0.028154902160167694, -0.1567612886428833, -0.06687948107719421, -0.03541361168026924, 0.032845932990312576, 0.005566192790865898, 0.12214449048042297, -0.04963922128081322, 0.12121456861495972, -0.0739358589053154, 0.01540429051965475, 0.0030216770246624947, -0.1645183116197586, -0.07887569069862366, -0.08572901785373688, 0.028631536290049553, -0.01980959065258503, 0.1820257008075714, 0.028917625546455383, 0.04076537489891052, 0.028494594618678093, 0.0580499991774559, 0.0030040740966796875, 0.027813585475087166, 0.2076839804649353, 0.07256311923265457, -0.07106154412031174, -0.10604323446750641, 0.06415551900863647, 0.010025255382061005, 0.050124362111091614, 0.15928223729133606, 0.047759078443050385, -0.0001553031470393762, 0.0993572399020195, -0.015316850505769253, 0.029706554487347603, -0.08815840631723404, -0.14614541828632355, 0.010274967178702354, 0.07664129137992859, -0.008370366878807545, 0.08240756392478943, 0.17205661535263062, -0.008965100161731243, 0.028628967702388763, -0.021051079034805298, -0.050595544278621674, -0.17782315611839294, -0.14621272683143616, -0.08161406964063644, -0.10460387915372849, -0.009174822829663754, -0.10601648688316345, 0.058225058019161224, 0.039410240948200226, 0.06155374273657799, -0.06584067642688751, 0.07655779272317886, 0.09756571054458618, -0.11320552229881287, 0.07725733518600464, 
-0.030474407598376274, 0.06184573471546173, -0.002521720016375184, 0.005873269401490688, -0.08997280895709991, -0.004574684891849756, -0.03502881899476051, 0.04654679447412491, -0.055807504802942276, 0.02598988637328148, -0.14996908605098724, -0.1267300844192505, -0.023885613307356834, 0.05561595410108566, -0.048713523894548416, 0.11875221878290176, 0.01903730072081089, -0.02826787903904915, 0.03075653687119484, 0.22936390340328217, -0.08101353794336319, -0.06313319504261017, -0.045722268521785736, 0.2392023652791977, 0.05798419192433357, 0.08968336880207062, 0.0011541121639311314, -0.004275737330317497, -0.08654017746448517, 0.33573782444000244, 0.2875623404979706, -0.06669571250677109, 0.02185043878853321, 0.020036788657307625, 0.03247951716184616, 0.10445816069841385, 0.14738473296165466, 0.0807812362909317, 0.25269845128059387, -0.07112409919500351, 0.004070702008903027, -0.020794164389371872, -0.0035252978559583426, -0.094292551279068, 0.12061961740255356, 0.04871074855327606, -0.07605911046266556, -0.016266820952296257, 0.09616681933403015, -0.234622985124588, 0.1442924290895462, -0.09042277187108994, -0.16555511951446533, -0.0650661438703537, -0.022269470617175102, 0.12511363625526428, 0.0057012471370399, 0.08388835191726685, -0.013910903595387936, -0.09222894161939621, 0.06527531147003174, 0.026899857446551323, -0.2155075967311859, -0.001321874326094985, 0.0646858662366867, -0.11317645013332367, -0.013093317858874798, -0.01084262877702713, 0.04255429282784462, 0.06837044656276703, 0.06063440442085266, -0.052938252687454224, 0.02750498242676258, -0.0017553698271512985, -0.0005278441240079701, 0.02887236885726452, 0.05862223729491234, 0.019802596420049667, -0.08147921413183212, 0.05538501590490341, -0.14378812909126282, 0.031788941472768784, -0.04858637601137161, -0.02897854894399643, 0.0008240027818828821, 0.0006675400654785335, -0.032527755945920944, 0.054671213030815125, 0.09969543665647507, -0.01098863035440445, 0.0038473340682685375, -0.08210724592208862, -0.030329594388604164, 0.014884191565215588, -0.09322559833526611, -0.10513637214899063, -0.10709530115127563, -0.0982118621468544, 0.10922635346651077, -0.005953612271696329, -0.20968568325042725, 0.01261181477457285, -0.10233365744352341, 0.03503730893135071, -0.20695984363555908, 0.0975503996014595, 0.10488341748714447, 0.01344336662441492, 0.0057689351961016655, -0.03261208534240723, 0.051563702523708344, 0.1015261709690094, -0.12671351432800293, -0.08701961487531662 ]
null
null
transformers
## XLM-R Longformer Model
XLM-R Longformer is an XLM-R model that has been extended to allow sequence lengths up to 4096 tokens, instead of the regular 512. The model was pre-trained from the XLM-RoBERTa checkpoint using the Longformer [pre-training scheme](https://github.com/allenai/longformer/blob/master/scripts/convert_model_to_long.ipynb) on the English WikiText-103 corpus.

The reason for this was to investigate methods for creating efficient Transformers for low-resource languages, such as Swedish, without the need to pre-train them on long-context datasets in each respective language. The trained model came as a result of a master thesis project at [Peltarion](https://peltarion.com/) and was fine-tuned on multilingual question-answering tasks, with code available [here](https://github.com/MarkusSagen/Master-Thesis-Multilingual-Longformer#xlm-r).

Since both XLM-R and Longformer are large models, it is recommended to run them with NVIDIA Apex (16-bit precision), a large GPU, and several gradient accumulation steps.

## How to Use
The model can be used as expected to fine-tune on a downstream task, for instance QA.

```python
import torch
# Note: the question-answering head class must be imported explicitly.
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

MAX_SEQUENCE_LENGTH = 4096
MODEL_NAME_OR_PATH = "markussagen/xlm-roberta-longformer-base-4096"

tokenizer = AutoTokenizer.from_pretrained(
    MODEL_NAME_OR_PATH,
    max_length=MAX_SEQUENCE_LENGTH,
    padding="max_length",
    truncation=True,
)
model = AutoModelForQuestionAnswering.from_pretrained(
    MODEL_NAME_OR_PATH,
    max_length=MAX_SEQUENCE_LENGTH,
)
```

## Training Procedure
The model has been trained on the WikiText-103 corpus, using a **48GB** GPU with the following training script and parameters. The model was pre-trained for 6000 iterations and took ~5 days. See the full [training script](https://github.com/MarkusSagen/Master-Thesis-Multilingual-Longformer/blob/main/scripts/finetune_qa_models.py) and [Github repo](https://github.com/MarkusSagen/Master-Thesis-Multilingual-Longformer) for more information.

```sh
wget https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-103-raw-v1.zip
unzip wikitext-103-raw-v1.zip
export DATA_DIR=./wikitext-103-raw

scripts/run_long_lm.py \
    --model_name_or_path xlm-roberta-base \
    --model_name xlm-roberta-to-longformer \
    --output_dir ./output \
    --logging_dir ./logs \
    --val_file_path $DATA_DIR/wiki.valid.raw \
    --train_file_path $DATA_DIR/wiki.train.raw \
    --seed 42 \
    --max_pos 4096 \
    --adam_epsilon 1e-8 \
    --warmup_steps 500 \
    --learning_rate 3e-5 \
    --weight_decay 0.01 \
    --max_steps 6000 \
    --evaluate_during_training \
    --logging_steps 50 \
    --eval_steps 50 \
    --save_steps 6000 \
    --max_grad_norm 1.0 \
    --per_device_eval_batch_size 2 \
    --per_device_train_batch_size 1 \
    --gradient_accumulation_steps 64 \
    --overwrite_output_dir \
    --fp16 \
    --do_train \
    --do_eval
```
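As a complement to the scripts above, a hedged sketch of how the recommended 16-bit precision and gradient accumulation translate into standard `TrainingArguments` for fine-tuning; the concrete values mirror the pre-training script and are assumptions rather than settings from the thesis, and `fp16=True` requires a CUDA-capable GPU:

```python
# Hedged sketch: fine-tuning settings for a large long-context model.
# Values are illustrative (copied from the pre-training script above),
# not prescriptions from the original work. Requires a CUDA GPU for fp16.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="./xlmr-longformer-finetune",
    per_device_train_batch_size=1,    # 4096-token sequences are memory-hungry
    gradient_accumulation_steps=64,   # effective batch size = 1 * 64
    fp16=True,                        # 16-bit mixed precision
    learning_rate=3e-5,
    warmup_steps=500,
    weight_decay=0.01,
    max_steps=6000,
)

print(
    "effective train batch size:",
    args.per_device_train_batch_size * args.gradient_accumulation_steps,
)
```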
{"language": "multilingual", "license": "apache-2.0", "tags": ["longformer"], "datasets": ["wikitext"]}
fill-mask
Peltarion/xlm-roberta-longformer-base-4096
[ "transformers", "pytorch", "xlm-roberta", "fill-mask", "longformer", "multilingual", "dataset:wikitext", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "multilingual" ]
TAGS #transformers #pytorch #xlm-roberta #fill-mask #longformer #multilingual #dataset-wikitext #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
## XLM-R Longformer Model XLM-R Longformer is an XLM-R model that has been extended to allow sequence lengths up to 4096 tokens, instead of the regular 512. The model was pre-trained from the XLM-RoBERTa checkpoint using the Longformer pre-training scheme on the English WikiText-103 corpus. The reason for this was to investigate methods for creating efficient Transformers for low-resource languages, such as Swedish, without the need to pre-train them on long-context datasets in each respective language. The trained model came as a result of a master thesis project at Peltarion and was fine-tuned on multilingual question-answering tasks, with code available here. Since both XLM-R and Longformer are large models, it is recommended to run them with NVIDIA Apex (16-bit precision), a large GPU, and several gradient accumulation steps. ## How to Use The model can be used as expected to fine-tune on a downstream task, for instance QA. ## Training Procedure The model has been trained on the WikiText-103 corpus, using a 48GB GPU with the following training script and parameters. The model was pre-trained for 6000 iterations and took ~5 days. See the full training script and Github repo for more information.
[ "## XLM-R Longformer Model \nXLM-R Longformer is a XLM-R model, that has been extended to allow sequence lengths up to 4096 tokens, instead of the regular 512. The model was pre-trained from the XLM-RoBERTa checkpoint using the Longformer pre-training scheme on the English WikiText-103 corpus. \n \nThe reason for this was to investigate methods for creating efficient Transformers for low-resource languages, such as Swedish, without the need to pre-train them on long-context datasets in each respecitve language. The trained model came as a result of a master thesis project at Peltarion and was fine-tuned on multilingual quesion-answering tasks, with code available here. \n \nSince both XLM-R model and Longformer models are large models, it it recommended to run the models with NVIDIA Apex (16bit precision), large GPU and several gradient accumulation steps.", "## How to Use \nThe model can be used as expected to fine-tune on a downstream task. \nFor instance for QA.", "## Training Procedure \nThe model have been trained on the WikiText-103 corpus, using a 48GB GPU with the following training script and parameters. The model was pre-trained for 6000 iterations and took ~5 days. See the full training script and Github repo for more information" ]
[ "TAGS\n#transformers #pytorch #xlm-roberta #fill-mask #longformer #multilingual #dataset-wikitext #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "## XLM-R Longformer Model \nXLM-R Longformer is a XLM-R model, that has been extended to allow sequence lengths up to 4096 tokens, instead of the regular 512. The model was pre-trained from the XLM-RoBERTa checkpoint using the Longformer pre-training scheme on the English WikiText-103 corpus. \n \nThe reason for this was to investigate methods for creating efficient Transformers for low-resource languages, such as Swedish, without the need to pre-train them on long-context datasets in each respecitve language. The trained model came as a result of a master thesis project at Peltarion and was fine-tuned on multilingual quesion-answering tasks, with code available here. \n \nSince both XLM-R model and Longformer models are large models, it it recommended to run the models with NVIDIA Apex (16bit precision), large GPU and several gradient accumulation steps.", "## How to Use \nThe model can be used as expected to fine-tune on a downstream task. \nFor instance for QA.", "## Training Procedure \nThe model have been trained on the WikiText-103 corpus, using a 48GB GPU with the following training script and parameters. The model was pre-trained for 6000 iterations and took ~5 days. See the full training script and Github repo for more information" ]
[ 61, 213, 27, 63 ]
[ "passage: TAGS\n#transformers #pytorch #xlm-roberta #fill-mask #longformer #multilingual #dataset-wikitext #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n## XLM-R Longformer Model \nXLM-R Longformer is a XLM-R model, that has been extended to allow sequence lengths up to 4096 tokens, instead of the regular 512. The model was pre-trained from the XLM-RoBERTa checkpoint using the Longformer pre-training scheme on the English WikiText-103 corpus. \n \nThe reason for this was to investigate methods for creating efficient Transformers for low-resource languages, such as Swedish, without the need to pre-train them on long-context datasets in each respecitve language. The trained model came as a result of a master thesis project at Peltarion and was fine-tuned on multilingual quesion-answering tasks, with code available here. \n \nSince both XLM-R model and Longformer models are large models, it it recommended to run the models with NVIDIA Apex (16bit precision), large GPU and several gradient accumulation steps.## How to Use \nThe model can be used as expected to fine-tune on a downstream task. \nFor instance for QA.## Training Procedure \nThe model have been trained on the WikiText-103 corpus, using a 48GB GPU with the following training script and parameters. The model was pre-trained for 6000 iterations and took ~5 days. See the full training script and Github repo for more information" ]
[ -0.04757397994399071, 0.013219211250543594, -0.0023391335271298885, 0.075689397752285, 0.09029462933540344, 0.034663811326026917, 0.05380088835954666, 0.08804911375045776, -0.008401410654187202, 0.03380405157804489, 0.07187254726886749, 0.02079908177256584, 0.11379115283489227, 0.04259520769119263, 0.028839487582445145, -0.22084484994411469, -0.010380898602306843, -0.07929480075836182, -0.015302731655538082, 0.09753976762294769, 0.07640478760004044, -0.07518622279167175, 0.06187174469232559, 0.033464960753917694, 0.007623446173965931, -0.03748580440878868, 0.01181486714631319, -0.056864868849515915, 0.07466689497232437, 0.08403097093105316, 0.05083420127630234, -0.014535963535308838, 0.09796545654535294, -0.09951696544885635, 0.03278443217277527, 0.04706957936286926, 0.01834813877940178, 0.0553482323884964, -0.04074358195066452, 0.09322167187929153, 0.01816963590681553, -0.00114451942499727, 0.009729646146297455, 0.015238313004374504, -0.03139143064618111, -0.033419981598854065, -0.10426465421915054, 0.023796269670128822, 0.03723880276083946, 0.06363401561975479, 0.017002012580633163, 0.08787837624549866, -0.13665291666984558, 0.022733477875590324, 0.06629141420125961, -0.25667938590049744, -0.05155471712350845, 0.09800077229738235, 0.1325860172510147, 0.06776273995637894, -0.03368991240859032, -0.03079652041196823, -0.008798508904874325, -0.0010225895093753934, 0.08428043872117996, -0.036422040313482285, 0.13917793333530426, -0.03829645738005638, -0.09010083228349686, -0.03129364550113678, 0.012988670729100704, 0.018685543909668922, -0.07660834491252899, -0.1708771288394928, -0.07922451198101044, -0.04593784362077713, -0.002715076319873333, -0.013343388214707375, 0.0008041891851462424, 0.031753458082675934, 0.12905609607696533, -0.12374359369277954, -0.07331417500972748, -0.05403204262256622, -0.01752607151865959, 0.18309929966926575, 0.05687180534005165, 0.06248892843723297, -0.02892259880900383, 0.11926673352718353, -0.060953378677368164, -0.08386466652154922, -0.05720072612166405, -0.09821002185344696, -0.1367494910955429, 0.034052927047014236, 0.09458701312541962, -0.04543093964457512, -0.04613661393523216, 0.1256946623325348, 0.034290678799152374, 0.011846737004816532, 0.11496936529874802, 0.008039784617722034, 0.013303767889738083, 0.16669251024723053, -0.10763641446828842, -0.09964054077863693, 0.07623840868473053, 0.04033420607447624, 0.021416282281279564, -0.05685591697692871, -0.06460262835025787, -0.04551639035344124, -0.03662266954779625, 0.053515203297138214, 0.029484232887625694, 0.011729446239769459, -0.017877968028187752, -0.11137533187866211, 0.11385571211576462, -0.10963542759418488, 0.02642705850303173, 0.04397854208946228, -0.06494476646184921, 0.06195923686027527, 0.014979077503085136, -0.012322603724896908, -0.0721941664814949, 0.03040626458823681, -0.03244774416089058, -0.01774606667459011, -0.13196003437042236, -0.1677989810705185, -0.0003699342196341604, -0.1033131554722786, -0.02480652742087841, -0.08402291685342789, -0.23510342836380005, -0.057567089796066284, 0.08258349448442459, -0.04250331595540047, -0.0362127348780632, 0.05185457691550255, -0.08006872236728668, -0.0018994829151779413, -0.026966996490955353, 0.04065752401947975, -0.020157264545559883, 0.06644723564386368, -0.09516286104917526, 0.051422666758298874, -0.03208613395690918, 0.019523808732628822, -0.04148629307746887, -0.012112977914512157, -0.19899767637252808, 0.11178376525640488, -0.011304546147584915, -0.018527796491980553, -0.03587571159005165, -0.10147053748369217, 0.0015438080299645662, 
-0.0005531984497793019, 0.015114388428628445, 0.11677242070436478, -0.14186415076255798, 0.012799546122550964, 0.060786694288253784, -0.07228999584913254, 0.009186496958136559, 0.11162176728248596, -0.0465850830078125, 0.15773886442184448, 0.06971868872642517, 0.05402112007141113, 0.1579197198152542, -0.10590127110481262, -0.01918250136077404, 0.01055835746228695, -0.11624971032142639, 0.040554385632276535, 0.07917950302362442, 0.05702367052435875, -0.010304401628673077, 0.026224752888083458, -0.08848840743303299, 0.012538791634142399, -0.010777129791676998, -0.06934722512960434, 0.018828600645065308, -0.023801183328032494, 0.025702307000756264, -0.0037305429577827454, -0.0027775149792432785, 0.02549412101507187, -0.11063537746667862, -0.13980834186077118, 0.0886138305068016, -0.09996578842401505, 0.009603678248822689, -0.1067858636379242, 0.04033484309911728, -0.035145897418260574, 0.029766598716378212, -0.09574439376592636, -0.032344844192266464, 0.05939565971493721, -0.0693012923002243, 0.03822935372591019, 0.04420511797070503, 0.032725028693675995, 0.03309720382094383, -0.011015353724360466, -0.0560331866145134, -0.07995244860649109, -0.07343778014183044, -0.025134820491075516, -0.08276087045669556, -0.06196443736553192, -0.05849621072411537, 0.03750519081950188, -0.13898517191410065, 0.022962797433137894, -0.10861963033676147, -0.06552892923355103, 0.024388469755649567, 0.009837771765887737, 0.026177313178777695, 0.005170437972992659, 0.0030086722690612078, -0.05141177773475647, 0.037571750581264496, 0.00879328791052103, -0.048082321882247925, 0.0622425377368927, -0.11687722057104111, -0.04009193554520607, 0.027902856469154358, 0.16781781613826752, -0.047552440315485, 0.0028871395625174046, 0.009076937101781368, -0.03361576423048973, -0.06888020783662796, -0.05128215253353119, 0.2197095900774002, 0.0293074119836092, 0.08626242727041245, -0.1004699319601059, 0.0255941990762949, 0.014839649200439453, -0.028524287045001984, 0.09603389352560043, 0.08153405785560608, 0.10484029352664948, -0.0948377177119255, 0.007120825350284576, -0.028765100985765457, 0.061118755489587784, 0.23120273649692535, 0.031210821121931076, -0.039470769464969635, -0.03450740501284599, 0.029980087652802467, 0.0002192007377743721, 0.12236124277114868, 0.02087472751736641, 0.03866194933652878, 0.03015998564660549, 0.05076593905687332, 0.0013689202023670077, -0.12338675558567047, 0.016607088968157768, 0.0005067292368039489, -0.04933018982410431, -0.035061534494161606, -0.015483761206269264, 0.02883252315223217, 0.08000026643276215, 0.008257709443569183, 0.02416740544140339, 0.004100242163985968, -0.04465288296341896, -0.09631065279245377, 0.1822129189968109, -0.08948340266942978, -0.1767214834690094, -0.1442994922399521, 0.013576827012002468, -0.019827386364340782, -0.02452268823981285, -0.01695975475013256, -0.04954751580953598, -0.041195839643478394, -0.07016659528017044, 0.1496313512325287, 0.03373955935239792, -0.02611471340060234, -0.05687869340181351, -0.021505028009414673, -0.020157549530267715, -0.12978030741214752, 0.008937940001487732, -0.06004989892244339, -0.09199462085962296, 0.07787133008241653, 0.0688338652253151, 0.061239708214998245, 0.11915147304534912, -0.03974079340696335, -0.04779737442731857, -0.004388210363686085, 0.14730726182460785, -0.08916889131069183, 0.12524017691612244, 0.18141458928585052, -0.05644157901406288, 0.038434065878391266, 0.13395659625530243, 0.02279937081038952, -0.05739891901612282, 0.015591340139508247, -0.02430427446961403, -0.10564595460891724, -0.20175470411777496, 
-0.04143290966749191, -0.058159735053777695, 0.048629071563482285, 0.0059923031367361546, 0.020061558112502098, -0.09096349775791168, 0.029149679467082024, 0.014150545932352543, -0.051583144813776016, 0.04312986508011818, 0.04764856398105621, 0.046748217195272446, 0.013983136042952538, 0.06468365341424942, -0.060862500220537186, 0.024469958618283272, 0.08521176874637604, 0.041767608374357224, 0.2665540874004364, -0.11137470602989197, 0.10331495106220245, 0.05135847628116608, 0.05408729985356331, 0.10246751457452774, 0.11962936073541641, -0.0426451675593853, -0.012500588782131672, -0.029133373871445656, -0.009652444161474705, -0.08324488997459412, 0.042433444410562515, 0.007118493318557739, 0.015075711533427238, -0.007455437444150448, 0.1167253628373146, 0.007870422676205635, 0.2249407023191452, 0.02173258736729622, -0.08474419265985489, -0.08742641657590866, 0.01225765235722065, -0.10034777969121933, -0.08354438841342926, 0.0016981966327875853, 0.1646323949098587, -0.14245541393756866, -0.01141328178346157, -0.0559392012655735, 0.08307698369026184, -0.09337315708398819, -0.015168674290180206, 0.002829249482601881, 0.15949830412864685, 0.02331516705453396, 0.06164165213704109, -0.10106705129146576, 0.10534503310918808, 0.03396770730614662, 0.14768628776073456, -0.04949599504470825, 0.03636026009917259, 0.04297132417559624, -0.04254233464598656, 0.11839603632688522, -0.0054153255186975, 0.10314085334539413, -0.0329454280436039, -0.16895298659801483, 0.05801192671060562, 0.011247019283473492, -0.010984105989336967, 0.12069670110940933, -0.004133155569434166, 0.0647236779332161, -0.024666914716362953, -0.0395495742559433, -0.067245714366436, -0.12058926373720169, 0.012600022368133068, -0.10091709345579147, -0.03551001101732254, -0.03793845698237419, -0.029270481318235397, -0.07859264314174652, 0.15762236714363098, -0.13506478071212769, -0.06450068205595016, -0.09674158692359924, 0.029935341328382492, 0.08023901283740997, -0.10134520381689072, 0.0031932431738823652, 0.0028913994319736958, 0.00953659974038601, -0.030168404802680016, -0.05747783184051514, 0.03862778842449188, -0.07417573779821396, -0.10930617153644562, -0.0036965173203498125, 0.031936414539813995, 0.09142443537712097, 0.08136703073978424, 0.012365180999040604, -0.02601255290210247, -0.035648759454488754, -0.15590576827526093, -0.04899093136191368, 0.14999385178089142, -0.024639861658215523, 0.06121710687875748, -0.06123967468738556, -0.04617255553603172, 0.005001980345696211, -0.0004337172140367329, 0.16961579024791718, 0.16190297901630402, -0.09463461488485336, 0.14121267199516296, 0.13385234773159027, -0.06397007405757904, -0.2588435113430023, -0.010724421590566635, 0.09657825529575348, 0.10894748568534851, -0.03543327748775482, -0.18160012364387512, 0.05254929140210152, 0.019250834360718727, 0.019986694678664207, 0.023745423182845116, -0.21987983584403992, -0.07176077365875244, -0.012741833925247192, 0.035328928381204605, 0.1970779448747635, -0.027634238824248314, 0.016018686816096306, 0.08243373781442642, -0.10226869583129883, 0.08051929622888565, -0.05546842887997627, 0.09683463722467422, 0.0051878588274121284, 0.010216405615210533, 0.003854099428281188, -0.07806970924139023, 0.09630747884511948, -0.032840628176927567, 0.041373685002326965, -0.012050531804561615, 0.06577800214290619, 0.06539683789014816, -0.026218468323349953, 0.1573542356491089, 0.057682815939188004, 0.0052814781665802, -0.09747475385665894, -0.07750733196735382, -0.08269059658050537, 0.010986842215061188, -0.036681972444057465, -0.05336565151810646, 
-0.04566500335931778, 0.08028621226549149, 0.048876140266656876, -0.012831456959247589, -0.053315334022045135, -0.038828302174806595, -0.07785767316818237, 0.019943907856941223, 0.19194094836711884, -0.06879263371229172, -0.09098915010690689, 0.008720891550183296, 0.03387616202235222, 0.11459900438785553, -0.07860397547483444, -0.0028890345711261034, 0.08506439626216888, 0.0032295340206474066, 0.09559604525566101, 0.010988370515406132, -0.09733571112155914, 0.08266688138246536, 0.011784056201577187, -0.08398018032312393, -0.1692172735929489, 0.03590366616845131, -0.09883717447519302, -0.08246041089296341, -0.07672392576932907, 0.12943556904792786, -0.013199702836573124, -0.013870124705135822, 0.007431654259562492, 0.05682186409831047, 0.03124907985329628, 0.11194761842489243, 0.007093522232025862, 0.02323652245104313, -0.1135961040854454, 0.037928029894828796, 0.02392859011888504, -0.09711657464504242, 0.0076829600147902966, 0.11144839227199554, -0.10630103200674057, -0.04304128512740135, -0.12740503251552582, 0.0007426264346577227, -0.04326491057872772, -0.034230221062898636, -0.010040261782705784, -0.05507279932498932, 0.0289896372705698, -0.05373623967170715, 0.0008769523701630533, -0.04072162136435509, -0.08753170073032379, -0.012394621968269348, -0.061882782727479935, 0.07996974885463715, 0.008293130435049534, -0.03446342796087265, -0.06229334697127342, -0.015215751715004444, 0.0799793154001236, 0.08105424046516418, -0.026361390948295593, -0.025848349556326866, -0.055851422250270844, -0.00024499636492691934, -0.05793504789471626, -0.013002700172364712, -0.030818380415439606, -0.004284680355340242, -0.041143130511045456, 0.02036801166832447, -0.040362268686294556, 0.026129432022571564, -0.062259819358587265, -0.02445051819086075, -0.04811802878975868, 0.010306062176823616, -0.02713141031563282, -0.017429789528250694, 0.01665426418185234, -0.06480088829994202, 0.124806247651577, 0.035205014050006866, -0.031084025278687477, 0.0307430662214756, 0.02590159885585308, -0.007749582175165415, -0.0015467590419575572, 0.04639037698507309, 0.020928021520376205, -0.13373050093650818, 0.05827099457383156, -0.010594062507152557, -0.05084168538451195, -0.024741560220718384, 0.047636449337005615, -0.08155330270528793, 0.09858419001102448, -0.013666084967553616, 0.003979971166700125, -0.05713457614183426, 0.050321970134973526, -0.010126627050340176, 0.07700455188751221, 0.13961341977119446, -0.0023558479733765125, 0.07346515357494354, -0.1192164272069931, -0.04540815204381943, 0.0004936148761771619, 0.03570433333516121, 0.003756252583116293, -0.024644285440444946, 0.04941317439079285, 0.015293796546757221, 0.0890355259180069, 0.04546169564127922, -0.010586940683424473, -0.002325220964848995, -0.03597666695713997, -0.044628534466028214, -0.00627734512090683, 0.030684370547533035, 0.030341217294335365, 0.02129548229277134, 0.050675127655267715, 0.054114364087581635, -0.019090648740530014, -0.011255526915192604, 0.22389015555381775, 0.043451420962810516, 0.16918477416038513, 0.042690031230449677, -0.025876561179757118, -0.02935798279941082, -0.17620187997817993, 0.04155658185482025, -0.05684272572398186, 0.03125287592411041, -0.016037119552493095, -0.005417940206825733, 0.16021007299423218, -0.047528304159641266, 0.10272810608148575, 0.026776932179927826, -0.0848611444234848, -0.13877838850021362, -0.12304596602916718, -0.026819437742233276, -0.009017175063490868, -0.011258061975240707, -0.10243229568004608, 0.014653685502707958, 0.06425369530916214, 0.02555588074028492, -0.0016963825328275561, 
0.0977138802409172, -0.03726997226476669, -0.09901780635118484, 0.0134672150015831, 0.054454680532217026, 0.06057658791542053, 0.06488921493291855, 0.0385238379240036, -0.005363442935049534, -0.011239471845328808, 0.09504619240760803, 0.05599702522158623, 0.03508972376585007, 0.029792189598083496, -0.04290645197033882, -0.044294580817222595, -0.052807118743658066, 0.047196343541145325, 0.03202982619404793, 0.21383903920650482, 0.05047284811735153, -0.06933284550905228, -0.023582499474287033, 0.1876620501279831, -0.010300390422344208, -0.079295314848423, -0.20003782212734222, 0.15538954734802246, -0.011573496274650097, 0.006075649056583643, 0.014083748683333397, -0.0443199947476387, 0.003580473130568862, 0.21258339285850525, 0.18222790956497192, -0.10338117182254791, -0.00007835667202016339, 0.002772036474198103, -0.019207049161195755, -0.05348232761025429, 0.1627391129732132, 0.059500932693481445, 0.28510141372680664, -0.06344150006771088, 0.05285421386361122, -0.10118948668241501, -0.06722936779260635, -0.07839763909578323, 0.08488404750823975, -0.012903132475912571, 0.025298258289694786, -0.07719634473323822, 0.028092430904507637, 0.034753065556287766, -0.17848728597164154, 0.0054557775147259235, -0.041119981557130814, -0.09595026820898056, 0.004239676520228386, 0.002624346176162362, -0.019295470789074898, 0.0806078091263771, -0.006298580206930637, 0.007693154271692038, 0.12996158003807068, -0.0065828547812998295, -0.04940849915146828, -0.002858931664377451, 0.12407158315181732, 0.06655944883823395, 0.135156512260437, 0.0006716325297020376, 0.02925560623407364, 0.08704716712236404, -0.0081475293263793, -0.14178113639354706, 0.05741376802325249, 0.018881892785429955, -0.04821065813302994, -0.002612875308841467, 0.0957917794585228, 0.0026327676605433226, 0.03382342681288719, 0.062036219984292984, -0.052382659167051315, -0.012284016236662865, -0.08273126184940338, -0.07094155997037888, -0.05358317494392395, 0.08239149302244186, -0.05169293284416199, 0.15828518569469452, 0.14386001229286194, 0.009571312926709652, 0.006397570949047804, -0.0569922998547554, 0.019207807257771492, -0.018393931910395622, 0.13570553064346313, -0.017031311988830566, -0.09247305244207382, -0.023478791117668152, -0.09269112348556519, 0.037134889513254166, -0.18893922865390778, -0.08208134770393372, -0.0055597503669559956, -0.02663089707493782, -0.010864412412047386, 0.06567137688398361, -0.0015472160885110497, 0.03976566717028618, 0.0068128095008432865, 0.007983204908668995, -0.038423459976911545, 0.035835620015859604, -0.11906152963638306, -0.09067527204751968 ]
null
null
transformers
# Rick and Morty DialoGPT Model
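The card gives no usage example. The following is a minimal sketch of the standard DialoGPT chat loop (not taken from the card), assuming the model follows the usual DialoGPT input format in which dialogue turns are separated by the EOS token; the repo id comes from this record's id field.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Pensador777critico/DialoGPT-small-RickandMorty")
model = AutoModelForCausalLM.from_pretrained("Pensador777critico/DialoGPT-small-RickandMorty")

chat_history_ids = None
for step in range(4):
    # Encode the user input and append the EOS token that separates dialogue turns.
    user_ids = tokenizer.encode(input(">> User: ") + tokenizer.eos_token, return_tensors="pt")
    bot_input_ids = user_ids if chat_history_ids is None else torch.cat([chat_history_ids, user_ids], dim=-1)

    # Generate a response, keeping the full conversation history as context.
    chat_history_ids = model.generate(
        bot_input_ids,
        max_length=1000,
        pad_token_id=tokenizer.eos_token_id,
    )

    # Decode only the newly generated tokens (everything after the prompt).
    reply = tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)
    print("Bot:", reply)
```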
{"tags": ["conversational"]}
text-generation
Pensador777critico/DialoGPT-small-RickandMorty
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Rick and Morty DialoGPT Model
[ "# Rick and Morty DialoGPT Model" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Rick and Morty DialoGPT Model" ]
[ 51, 10 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Rick and Morty DialoGPT Model" ]
[ -0.01990443281829357, 0.10367733240127563, -0.006012056488543749, 0.013662099838256836, 0.1287931650876999, 0.004103946499526501, 0.13405320048332214, 0.13470496237277985, -0.029608309268951416, -0.0377325713634491, 0.1409052610397339, 0.2081032246351242, -0.009616929106414318, 0.025026321411132812, -0.08027864247560501, -0.33285143971443176, 0.04419311136007309, 0.04611847549676895, -0.04805411398410797, 0.11171722412109375, 0.09962809830904007, -0.03511058911681175, 0.07650627940893173, 0.012189619243144989, -0.11959464848041534, 0.014523470774292946, 0.01571112684905529, -0.09889741986989975, 0.11399844288825989, 0.07783890515565872, 0.031239205971360207, 0.033389654010534286, -0.042143791913986206, -0.13308840990066528, 0.04855761677026749, -0.0014628645731136203, -0.03996938467025757, 0.06519230455160141, 0.0068825362250208855, -0.09896008670330048, 0.13105708360671997, 0.11774895340204239, -0.001342291128821671, 0.030811335891485214, -0.1546017825603485, -0.03095608949661255, -0.013916928321123123, 0.04583658277988434, 0.05571185424923897, 0.1092928797006607, -0.03970988467335701, 0.11546611040830612, -0.046847838908433914, 0.11656361073255539, 0.13404695689678192, -0.27711591124534607, -0.013774634338915348, 0.14150507748126984, 0.03755388408899307, 0.031246060505509377, -0.03764049708843231, 0.09234841167926788, 0.010574371553957462, -0.009135077707469463, -0.054559025913476944, -0.07839421927928925, -0.06956472247838974, 0.03881034255027771, -0.08538595587015152, -0.0028573249001055956, 0.22309143841266632, -0.029777048155665398, 0.0931403860449791, -0.061110686510801315, -0.083645299077034, 0.0022445949725806713, -0.04396601766347885, -0.031562261283397675, -0.0995510146021843, 0.08443354815244675, -0.04024428874254227, -0.08693728595972061, -0.10731299221515656, -0.022938303649425507, -0.15873323380947113, 0.16214832663536072, 0.03501884266734123, 0.03956814110279083, -0.21219894289970398, 0.07603893429040909, -0.04213596507906914, -0.10128775984048843, 0.025763655081391335, -0.0809730738401413, 0.0031352867372334003, 0.01420458871871233, -0.034850042313337326, -0.01257789321243763, 0.09354974329471588, 0.11913833022117615, -0.002085368847474456, 0.028482265770435333, -0.03459439426660538, 0.04555915296077728, 0.04445279389619827, 0.04635937884449959, -0.030874032527208328, -0.005519113503396511, 0.024999095126986504, -0.0903957337141037, -0.010871811769902706, -0.060442280024290085, -0.1946737915277481, 0.013364237733185291, 0.05735969915986061, 0.055262304842472076, 0.030765585601329803, 0.13551434874534607, 0.0010974886827170849, -0.0475224107503891, 0.03023342229425907, -0.020769428461790085, -0.016528211534023285, 0.029149476438760757, -0.0072809201665222645, 0.1526104062795639, 0.022983204573392868, 0.05690442770719528, -0.11451500654220581, 0.012773441150784492, -0.03330712020397186, -0.006917042192071676, -0.03216493874788284, -0.061537809669971466, 0.003289242973551154, 0.0014469954185187817, 0.013694697991013527, -0.12761977314949036, -0.15719962120056152, -0.003717299085110426, 0.00613630935549736, -0.05369097366929054, -0.10004933178424835, -0.10542158782482147, -0.03153182193636894, 0.046352777630090714, -0.053748197853565216, 0.03198752924799919, -0.039340607821941376, 0.09383489936590195, -0.03441528603434563, 0.0691300630569458, -0.0863635316491127, 0.0905333161354065, -0.06098577380180359, -0.04111234471201897, -0.0643690675497055, 0.12356391549110413, 0.011561519466340542, 0.04442533850669861, -0.03781363368034363, -0.01636880449950695, -0.11087207496166229, 
0.06495212018489838, -0.03516015037894249, 0.22487092018127441, -0.08996163308620453, -0.09683383256196976, 0.22284504771232605, -0.04562665522098541, -0.12769415974617004, 0.12243670970201492, -0.03600937873125076, 0.09682484716176987, 0.11536505818367004, 0.16257616877555847, 0.03866875544190407, -0.0002237519365735352, 0.10846788436174393, 0.10610917955636978, -0.07603283226490021, 0.006744202226400375, 0.0250004380941391, -0.02382737584412098, -0.09139634668827057, 0.015165179036557674, 0.07776524871587753, 0.04803644120693207, -0.05478836968541145, -0.015317765064537525, 0.015090391971170902, -0.003627530997619033, 0.06564177572727203, -0.017049036920070648, 0.11691898107528687, -0.03955721855163574, -0.07620245963335037, -0.014626736752688885, 0.028113901615142822, -0.06986767798662186, 0.026787258684635162, -0.07962338626384735, 0.02948051132261753, -0.01967560686171055, 0.06687499582767487, -0.16950036585330963, -0.09430424869060516, -0.06010226905345917, 0.23349159955978394, 0.07496993243694305, 0.11698364466428757, 0.06350064277648926, -0.056928664445877075, 0.0006459777359850705, 0.037900060415267944, 0.19767099618911743, -0.006904584355652332, -0.07503941655158997, -0.11777795851230621, 0.10312607139348984, -0.07375676929950714, 0.06138577312231064, -0.0416308231651783, 0.007855354808270931, 0.019795136526226997, 0.11127804219722748, -0.04220014438033104, 0.039965033531188965, 0.012499134056270123, -0.03696384280920029, -0.05908297002315521, 0.0004571304307319224, 0.09440597146749496, -0.0005542659782804549, -0.10514124482870102, 0.2379530370235443, -0.21215155720710754, 0.12180843949317932, 0.1799643337726593, -0.2256188690662384, 0.008836638182401657, -0.10462760180234909, -0.016665222123265266, 0.01030759233981371, 0.03996801748871803, -0.040312353521585464, 0.24249082803726196, -0.014560520648956299, 0.17035135626792908, -0.04880015179514885, -0.05010494217276573, -0.0440804697573185, -0.05291803553700447, 0.0003277618088759482, 0.12486644089221954, 0.09157522767782211, -0.18372175097465515, 0.17465431988239288, 0.06325390189886093, 0.03004654310643673, 0.1566917598247528, 0.022896459326148033, 0.020663797855377197, 0.05599488690495491, -0.0012882096925750375, -0.03033529780805111, -0.07880529016256332, -0.20945574343204498, -0.012111871503293514, 0.07547834515571594, 0.04618273675441742, 0.10363037884235382, -0.1018955409526825, -0.030724551528692245, -0.006948297843337059, -0.030821966007351875, 0.03848150745034218, 0.13554143905639648, 0.015318007208406925, 0.12024796009063721, -0.019162237644195557, -0.06668011844158173, 0.0741129145026207, 0.01461794413626194, -0.09263674914836884, 0.18050695955753326, -0.1221487745642662, -0.3382752537727356, -0.10329627990722656, -0.20327065885066986, -0.04040617123246193, 0.0422586165368557, 0.11002974957227707, -0.1460546851158142, -0.029720865190029144, 0.0010455691954120994, 0.08435780555009842, -0.1366978883743286, 0.006720550823956728, -0.017843635752797127, -0.01294276025146246, -0.1374056041240692, -0.09384968876838684, -0.04747654125094414, -0.060003772377967834, -0.03218422830104828, 0.10381519794464111, -0.1596987098455429, 0.007801016326993704, 0.230968177318573, 0.04797196388244629, 0.07053504139184952, -0.036995481699705124, 0.17910921573638916, -0.08220451325178146, 0.016473548486828804, 0.24478016793727875, -0.05610832944512367, 0.0740312784910202, 0.10560029745101929, -0.005553957540541887, -0.052998270839452744, 0.03756273165345192, 0.00788428820669651, -0.0785532221198082, -0.21784749627113342, -0.1030275970697403, 
-0.11046822369098663, 0.04284128174185753, 0.05120398849248886, 0.04543844982981682, 0.1585974246263504, 0.06446543335914612, -0.05187172442674637, -0.011306295171380043, 0.08315242826938629, 0.08576013147830963, 0.24794787168502808, -0.06311704963445663, 0.1473274976015091, -0.020790869370102882, -0.16434483230113983, 0.07334780693054199, 0.06416254490613937, 0.07227631658315659, 0.06913222372531891, 0.11215730756521225, 0.0020037174690514803, 0.017364054918289185, 0.12614323198795319, 0.05889604985713959, -0.011050567030906677, -0.031410302966833115, -0.04586650803685188, -0.04347039759159088, -0.020151739940047264, 0.041160233318805695, 0.05188119783997536, -0.1600257307291031, -0.02415069006383419, 0.022831739857792854, 0.046689603477716446, -0.003216250566765666, 0.08608495444059372, -0.19217506051063538, -0.018159521743655205, 0.06477150321006775, -0.0016290671192109585, -0.09313707798719406, 0.08108778297901154, -0.009849769994616508, -0.09697907418012619, 0.03780587762594223, -0.03585495799779892, 0.1301390826702118, -0.0750122219324112, 0.07286842167377472, -0.1119815781712532, -0.02080838568508625, -0.0087605444714427, 0.11860883235931396, -0.3024371266365051, 0.1707288920879364, -0.0030656929593533278, -0.04842326417565346, -0.11293680220842361, -0.015061003156006336, 0.03821004554629326, 0.08916047215461731, 0.10371578484773636, -0.030773809179663658, -0.06436607241630554, 0.0791664570569992, -0.050910793244838715, 0.03525971621274948, 0.10187692940235138, -0.04662879928946495, -0.014911266043782234, -0.05685164034366608, 0.0027524156030267477, 0.02270045317709446, -0.10804066807031631, 0.014929873868823051, -0.19113284349441528, 0.07794220000505447, 0.0811065286397934, 0.0722472071647644, 0.04095001146197319, -0.029467018321156502, -0.1261810064315796, 0.2744207978248596, 0.007417048793286085, -0.09985779225826263, -0.11269644647836685, 0.04465123638510704, 0.05646880716085434, -0.07145541161298752, -0.028514720499515533, -0.07924950867891312, 0.052012015134096146, -0.07113154232501984, -0.1981293261051178, 0.11338871717453003, -0.09873685240745544, -0.04736494645476341, -0.03962721675634384, 0.2276533544063568, -0.027753405272960663, 0.02130931057035923, 0.0393831804394722, -0.001616212772205472, -0.12734149396419525, -0.09492160379886627, 0.004517016001045704, -0.0013660878175869584, 0.02586340345442295, 0.022777099162340164, -0.04388801380991936, 0.0049570053815841675, -0.06949588656425476, -0.0037953434512019157, 0.3158918023109436, 0.10998717695474625, -0.04474896565079689, 0.1561327874660492, 0.10242960602045059, -0.06360200047492981, -0.28859275579452515, -0.11298105865716934, -0.07240703701972961, -0.05466444417834282, -0.0838940367102623, -0.18133240938186646, 0.08497140556573868, -0.042584747076034546, -0.00881777424365282, 0.042027126997709274, -0.2644155025482178, -0.09412363916635513, 0.18815293908119202, -0.01533579919487238, 0.4300551414489746, -0.11307147145271301, -0.07450833916664124, -0.05387028306722641, -0.13561248779296875, 0.18766070902347565, -0.018648525699973106, 0.0966244488954544, 0.00443116994574666, 0.20654869079589844, 0.05815155804157257, -0.0008219819865189493, 0.0747876986861229, 0.011587066575884819, -0.0452013723552227, -0.09014920890331268, -0.09217863529920578, -0.020688166841864586, 0.005974666681140661, 0.034957773983478546, -0.0941787138581276, 0.05258546397089958, -0.11336535215377808, -0.05589618906378746, -0.07209338247776031, 0.026715638116002083, 0.02418643794953823, -0.06410122662782669, -0.006407043896615505, -0.048794936388731, 
-0.0010418962920084596, 0.00979152973741293, 0.21295785903930664, -0.11305148899555206, 0.12096642702817917, 0.04414689913392067, 0.1508360654115677, -0.08366664499044418, -0.03614836558699608, -0.04910365119576454, -0.05565084517002106, 0.0676501989364624, -0.1319035291671753, 0.04462771117687225, 0.10053624957799911, -0.030742639675736427, 0.0898696631193161, 0.11227817088365555, -0.02972952462732792, 0.0016581144882366061, 0.07279330492019653, -0.23832836747169495, -0.08509121090173721, -0.07718803733587265, 0.05435929819941521, 0.057659514248371124, 0.09007556736469269, 0.21964938938617706, 0.011087107472121716, -0.023847850039601326, 0.027587326243519783, 0.029717741534113884, -0.01658647321164608, 0.05797221511602402, 0.008770608343183994, 0.031205764040350914, -0.14632299542427063, 0.04562913626432419, -0.010501107200980186, -0.07197817414999008, 0.03429242596030235, 0.16717956960201263, -0.10209374874830246, -0.12234743684530258, -0.04288604483008385, 0.17517046630382538, -0.13247300684452057, -0.017495078966021538, -0.05478521063923836, -0.1241658553481102, 0.07977617532014847, 0.11423204839229584, 0.05072414129972458, 0.042339734733104706, -0.09691346436738968, -0.03881148621439934, -0.05552472919225693, 0.01957569271326065, 0.018891409039497375, -0.030404040589928627, -0.037885911762714386, 0.025801094248890877, -0.04172535613179207, 0.11203933507204056, -0.087384894490242, -0.09792038798332214, -0.16838693618774414, 0.03925701230764389, -0.049022991210222244, -0.07899222522974014, -0.09344983100891113, -0.03523614630103111, 0.014231358654797077, -0.03348008170723915, -0.018664700910449028, -0.02225758694112301, -0.0958842933177948, 0.03419994190335274, -0.048781368881464005, -0.005008503329008818, -0.08496184647083282, 0.017331385985016823, 0.04781922325491905, -0.023604100570082664, 0.1431105136871338, 0.12453559041023254, -0.11789791285991669, 0.10031480342149734, -0.16611437499523163, -0.06820093840360641, 0.09455996751785278, 0.02471991442143917, 0.043245621025562286, 0.028927266597747803, 0.005174829158931971, 0.04808570072054863, 0.05950818210840225, 0.03694291412830353, 0.041101954877376556, -0.07111897319555283, 0.061451081186532974, -0.06278520077466965, -0.11226452142000198, -0.04257739707827568, -0.005422866903245449, 0.00011432790051912889, 0.07346735894680023, 0.11052975058555603, -0.05098198726773262, 0.09580544382333755, -0.050767768174409866, 0.046003878116607666, 0.0289035402238369, -0.16526201367378235, 0.008764104917645454, -0.08482556790113449, 0.05248309671878815, 0.0030253108125180006, 0.15688744187355042, 0.028536081314086914, -0.03175791725516319, 0.02630779519677162, 0.05105529725551605, 0.06318540126085281, -0.00840448122471571, 0.19050461053848267, 0.09726009517908096, -0.04487645998597145, -0.09418396651744843, 0.08849480748176575, 0.05022666975855827, 0.05143674090504646, 0.1403687596321106, -0.020687401294708252, 0.012512898072600365, 0.07724163681268692, 0.014415515586733818, 0.017872430384159088, -0.07756411284208298, -0.09487451612949371, -0.011494439095258713, 0.025514457374811172, -0.02882363088428974, 0.1138797178864479, 0.16729387640953064, -0.0008394720498472452, 0.013234704732894897, -0.01801590994000435, -0.05735309422016144, -0.20129387080669403, -0.1959676295518875, -0.09400797635316849, -0.13690303266048431, -0.0009418319095857441, -0.13835963606834412, 0.03616710752248764, 0.042394787073135376, 0.09917435795068741, -0.039446551352739334, 0.019261397421360016, 0.026794444769620895, -0.10323353111743927, 0.039175424724817276, 
-0.04838612675666809, 0.09421038627624512, -0.007761404849588871, 0.005773975048214197, -0.046786144375801086, 0.02436385303735733, 0.02127891033887863, 0.038409680128097534, -0.012736459262669086, 0.024856114760041237, -0.11602245271205902, -0.09478921443223953, -0.058010075241327286, 0.0558818019926548, 0.0046934462152421474, 0.18179026246070862, 0.02449701726436615, -0.03384847193956375, 0.0275272186845541, 0.19317778944969177, -0.06196035072207451, -0.09709009528160095, -0.08241496980190277, 0.2182236760854721, -0.018931716680526733, 0.09253086894750595, -0.035876765847206116, 0.012440751306712627, -0.07121489197015762, 0.33243879675865173, 0.29320472478866577, -0.10524016618728638, 0.010426074266433716, -0.0019151283195242286, 0.0405552051961422, 0.1290767937898636, 0.07575080543756485, 0.11663594841957092, 0.256552129983902, -0.06501701474189758, -0.057690393179655075, -0.014668738469481468, -0.027142031118273735, -0.06502988189458847, 0.04214107245206833, 0.04939494654536247, -0.07117093354463577, -0.00912293791770935, 0.12242040783166885, -0.24606983363628387, 0.04577518254518509, -0.13518153131008148, -0.14807558059692383, -0.0726354643702507, 0.002261551097035408, 0.09914402663707733, 0.010166509076952934, 0.08546656370162964, -0.014570544473826885, -0.0710548534989357, 0.03896206244826317, 0.021210450679063797, -0.2144380509853363, 0.021960165351629257, 0.07259857654571533, -0.028754761442542076, -0.07154250144958496, -0.013138728216290474, 0.08338925242424011, 0.09720319509506226, 0.03173141926527023, -0.009079075418412685, 0.04570826143026352, -0.0000614441087236628, -0.06747788935899734, 0.035688117146492004, 0.022403022274374962, 0.01331246830523014, -0.05491582676768303, 0.07895619422197342, -0.17176033556461334, 0.020258452743291855, -0.03599786013364792, -0.06506339460611343, -0.006352625321596861, 0.02872123196721077, -0.06236473098397255, 0.0810769721865654, 0.08681372553110123, -0.010693355463445187, -0.015406738966703415, -0.019259916618466377, -0.012411676347255707, -0.028850549831986427, -0.07069326192140579, -0.09390060603618622, -0.15529757738113403, -0.12466321885585785, 0.08110006153583527, -0.008061634376645088, -0.2096063792705536, 0.012769150547683239, -0.13104628026485443, 0.04622570425271988, -0.10809949785470963, 0.09371429681777954, 0.08394473046064377, 0.020185640081763268, -0.007141938898712397, 0.003890183288604021, 0.036074474453926086, 0.07894916087388992, -0.13067346811294556, -0.08049263805150986 ]
null
null
transformers
# Disclaimer

This model was trained on Common Voice 6. If you need a Catalan model for ASR, I recommend checking [wav2vec2-xls-r-1b-ca-lm](https://huggingface.co/PereLluis13/wav2vec2-xls-r-1b-ca-lm), a 1b model with an LM on top trained on CV8+ with much better performance, or [wav2vec2-xls-r-300m-ca-lm](https://huggingface.co/PereLluis13/wav2vec2-xls-r-300m-ca-lm), which has the same size (300m) as this model but was trained on CV8+ and uses the same LM.

# Wav2Vec2-Large-XLSR-53-ca

Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Catalan using the [Common Voice](https://huggingface.co/datasets/common_voice) dataset. When using this model, make sure that your speech input is sampled at 16kHz.

## Usage

The model can be used directly (without a language model) as follows:

```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

test_dataset = load_dataset("common_voice", "ca", split="test[:2%]")

processor = Wav2Vec2Processor.from_pretrained("PereLluis13/Wav2Vec2-Large-XLSR-53-catalan")
model = Wav2Vec2ForCTC.from_pretrained("PereLluis13/Wav2Vec2-Large-XLSR-53-catalan")

resampler = torchaudio.transforms.Resample(48_000, 16_000)

# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch

test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)

with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits

predicted_ids = torch.argmax(logits, dim=-1)

print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```

## Evaluation

The model can be evaluated as follows on the Catalan test data of Common Voice.

```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re

test_dataset = load_dataset("common_voice", "ca", split="test")
wer = load_metric("wer")

processor = Wav2Vec2Processor.from_pretrained("PereLluis13/Wav2Vec2-Large-XLSR-53-catalan")
model = Wav2Vec2ForCTC.from_pretrained("PereLluis13/Wav2Vec2-Large-XLSR-53-catalan")
model.to("cuda")

chars_to_ignore_regex = '[\,\?\.\!\;\:\"\“]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)

# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch

test_dataset = test_dataset.map(speech_file_to_array_fn)

# Run inference over the test set in batches and decode the predictions.
def evaluate(batch):
    inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
    pred_ids = torch.argmax(logits, dim=-1)
    batch["pred_strings"] = processor.batch_decode(pred_ids)
    return batch

result = test_dataset.map(evaluate, batched=True, batch_size=8)

import jiwer

# Chunked WER computation due to memory issues, taken from https://huggingface.co/pcuenq/wav2vec2-large-xlsr-53-es
def chunked_wer(targets, predictions, chunk_size=None):
    if chunk_size is None:
        return jiwer.wer(targets, predictions)
    start = 0
    end = chunk_size
    H, S, D, I = 0, 0, 0, 0
    while start < len(targets):
        chunk_metrics = jiwer.compute_measures(targets[start:end], predictions[start:end])
        H = H + chunk_metrics["hits"]
        S = S + chunk_metrics["substitutions"]
        D = D + chunk_metrics["deletions"]
        I = I + chunk_metrics["insertions"]
        start += chunk_size
        end += chunk_size
    return float(S + D + I) / float(H + S + D)

print("WER: {:2f}".format(100 * chunked_wer(result["sentence"], result["pred_strings"], chunk_size=4000)))
```

**Test Result**: 8.11 %

## Training

The Common Voice `train` and `validation` datasets were used for training. At the second epoch, training was halted due to a memory issue and continued with a lower batch size, but the gradient accumulation steps were scaled to keep the effective batch size at 32 throughout training. The model was then trained for an additional 10 epochs in which half of the male samples were pitched up.

The script used for training can be found [here](https://github.com/huggingface/transformers/blob/master/examples/research_projects/wav2vec2/run_common_voice.py). Slight modifications were made in order to speed up the ordering by length during training, which can be found [here](https://discuss.huggingface.co/t/spanish-asr-fine-tuning-wav2vec2/4586/6). Another version trained for Catalan can be found [here](https://huggingface.co/ccoreilly/wav2vec2-large-xlsr-catala), which may be better than this one since it was trained with extra data and for a longer time. However, since it used different splits that include part of the Common Voice test set, this version can be used to get a baseline on the Common Voice dataset.
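The card does not show the pitch-up augmentation itself. Below is a minimal sketch of one way such an augmentation could be applied with torchaudio, assuming the Common Voice `gender` field is used to pick male samples and that `torchaudio.transforms.PitchShift` is available; the training script referenced above may have implemented it differently.

```python
import random

import torchaudio
from datasets import load_dataset

# Hypothetical illustration: pitch up roughly half of the male samples by two semitones.
# The exact augmentation used for the extra 10 epochs is not specified in the card.
SAMPLE_RATE = 16_000
pitch_up = torchaudio.transforms.PitchShift(sample_rate=SAMPLE_RATE, n_steps=2)
resampler = torchaudio.transforms.Resample(48_000, SAMPLE_RATE)

train_dataset = load_dataset("common_voice", "ca", split="train[:1%]")

def augment(batch):
    speech, _ = torchaudio.load(batch["path"])
    speech = resampler(speech)
    # Apply the pitch shift to about half of the male speakers' clips.
    if batch.get("gender") == "male" and random.random() < 0.5:
        speech = pitch_up(speech)
    batch["speech"] = speech.squeeze().numpy()
    return batch

train_dataset = train_dataset.map(augment)
```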
{"language": "ca", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "model-index": [{"name": "Catalan XLSR Wav2Vec Large 53", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice ca", "type": "common_voice", "args": "ca"}, "metrics": [{"type": "wer", "value": 8.11, "name": "Test WER"}]}]}]}
automatic-speech-recognition
PereLluis13/Wav2Vec2-Large-XLSR-53-catalan
[ "transformers", "pytorch", "jax", "wav2vec2", "automatic-speech-recognition", "audio", "speech", "xlsr-fine-tuning-week", "ca", "dataset:common_voice", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "ca" ]
TAGS #transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ca #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us
# Disclaimer This model was trained on Common Voice 6, if you need a catalan model for ASR, I recommend checking wav2vec2-xls-r-1b-ca-lm which is a 1b model with a LM on top trained on CV8+ with much better performance or wav2vec2-xls-r-300m-ca-lm which has the same size (300m) as this model but trained on CV8+ and the same LM. # Wav2Vec2-Large-XLSR-53-ca Fine-tuned facebook/wav2vec2-large-xlsr-53 on catalan using the Common Voice dataset. When using this model, make sure that your speech input is sampled at 16kHz. ## Usage The model can be used directly (without a language model) as follows: ## Evaluation The model can be evaluated as follows on the catalan test data of Common Voice. Test Result: 8.11 % ## Training The Common Voice 'train', 'validation' datasets were used for training. At the second epoch training was halted due to a memory issue, and was continued with lower batch size, but acc. gradient steps were scaled to keep it at 32 batch size during all training. Then the model was trained for an additional 10 epochs where half the male samples were pitched up. The script used for training can be found here. Slight modifications were done in order to speed up the ordering by length during training, which can be found here. Another version trained for catalan can be found here, which may be better than this one since it was trained with extra data and for longer time. Whoever, since it used different splits that include part of the Common Voice test set, this version can be used to get a baseline on the Common Voice dataset.
[ "# Disclaimer\n\nThis model was trained on Common Voice 6, if you need a catalan model for ASR, I recommend checking wav2vec2-xls-r-1b-ca-lm which is a 1b model with a LM on top trained on CV8+ with much better performance or wav2vec2-xls-r-300m-ca-lm which has the same size (300m) as this model but trained on CV8+ and the same LM.", "# Wav2Vec2-Large-XLSR-53-ca \n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on catalan using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.", "## Usage\n\nThe model can be used directly (without a language model) as follows:", "## Evaluation\n\nThe model can be evaluated as follows on the catalan test data of Common Voice.\n\n\n\nTest Result: 8.11 %", "## Training\n\nThe Common Voice 'train', 'validation' datasets were used for training. At the second epoch training was halted due to a memory issue, and was continued with lower batch size, but acc. gradient steps were scaled to keep it at 32 batch size during all training. Then the model was trained for an additional 10 epochs where half the male samples were pitched up.\n\nThe script used for training can be found here. Slight modifications were done in order to speed up the ordering by length during training, which can be found here. Another version trained for catalan can be found here, which may be better than this one since it was trained with extra data and for longer time. Whoever, since it used different splits that include part of the Common Voice test set, this version can be used to get a baseline on the Common Voice dataset." ]
[ "TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ca #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n", "# Disclaimer\n\nThis model was trained on Common Voice 6, if you need a catalan model for ASR, I recommend checking wav2vec2-xls-r-1b-ca-lm which is a 1b model with a LM on top trained on CV8+ with much better performance or wav2vec2-xls-r-300m-ca-lm which has the same size (300m) as this model but trained on CV8+ and the same LM.", "# Wav2Vec2-Large-XLSR-53-ca \n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on catalan using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.", "## Usage\n\nThe model can be used directly (without a language model) as follows:", "## Evaluation\n\nThe model can be evaluated as follows on the catalan test data of Common Voice.\n\n\n\nTest Result: 8.11 %", "## Training\n\nThe Common Voice 'train', 'validation' datasets were used for training. At the second epoch training was halted due to a memory issue, and was continued with lower batch size, but acc. gradient steps were scaled to keep it at 32 batch size during all training. Then the model was trained for an additional 10 epochs where half the male samples were pitched up.\n\nThe script used for training can be found here. Slight modifications were done in order to speed up the ordering by length during training, which can be found here. Another version trained for catalan can be found here, which may be better than this one since it was trained with extra data and for longer time. Whoever, since it used different splits that include part of the Common Voice test set, this version can be used to get a baseline on the Common Voice dataset." ]
[ 80, 104, 64, 20, 28, 197 ]
[ "passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #ca #dataset-common_voice #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Disclaimer\n\nThis model was trained on Common Voice 6, if you need a catalan model for ASR, I recommend checking wav2vec2-xls-r-1b-ca-lm which is a 1b model with a LM on top trained on CV8+ with much better performance or wav2vec2-xls-r-300m-ca-lm which has the same size (300m) as this model but trained on CV8+ and the same LM.# Wav2Vec2-Large-XLSR-53-ca \n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on catalan using the Common Voice dataset.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the catalan test data of Common Voice.\n\n\n\nTest Result: 8.11 %## Training\n\nThe Common Voice 'train', 'validation' datasets were used for training. At the second epoch training was halted due to a memory issue, and was continued with lower batch size, but acc. gradient steps were scaled to keep it at 32 batch size during all training. Then the model was trained for an additional 10 epochs where half the male samples were pitched up.\n\nThe script used for training can be found here. Slight modifications were done in order to speed up the ordering by length during training, which can be found here. Another version trained for catalan can be found here, which may be better than this one since it was trained with extra data and for longer time. Whoever, since it used different splits that include part of the Common Voice test set, this version can be used to get a baseline on the Common Voice dataset." ]
[ -0.10720319300889969, 0.12743860483169556, -0.005668469704687595, -0.013314638286828995, 0.061408475041389465, -0.0006243769894354045, 0.13616475462913513, 0.1086222231388092, 0.012392212636768818, 0.08532963693141937, -0.0741390809416771, -0.046169914305210114, 0.08674146980047226, 0.008594079874455929, 0.05612225458025932, -0.19253039360046387, 0.02367875538766384, -0.04497293755412102, 0.09439337253570557, 0.05601382628083229, 0.09271447360515594, -0.09106342494487762, 0.023017391562461853, 0.037549376487731934, -0.02724355272948742, 0.014333419501781464, 0.010564260184764862, -0.052916474640369415, 0.13908633589744568, 0.09794557094573975, 0.06610090285539627, 0.0250370092689991, 0.05215830355882645, -0.2151542752981186, 0.02083924226462841, 0.07858803123235703, 0.012409904040396214, 0.0507478229701519, 0.1143261045217514, -0.028729988262057304, 0.03676997497677803, 0.050990037620067596, 0.012610065750777721, 0.08197509497404099, -0.05309190973639488, -0.13709983229637146, -0.10104522109031677, -0.0022284837905317545, 0.0915130004286766, 0.07015896588563919, -0.050101179629564285, 0.01228363811969757, -0.04015498235821724, 0.04456997662782669, 0.14599661529064178, -0.20045077800750732, -0.002599733415991068, 0.05236027389764786, 0.0606568306684494, 0.07375992834568024, -0.06812217831611633, 0.012437785975635052, 0.060166891664266586, 0.017651287838816643, -0.038804009556770325, -0.017862331122159958, -0.07017643749713898, -0.013326595537364483, -0.10078363120555878, -0.05044673755764961, 0.12591874599456787, -0.019530553370714188, -0.09171365201473236, -0.14744506776332855, -0.026523197069764137, -0.05212806165218353, 0.035282813012599945, -0.03458234667778015, 0.023860681802034378, 0.03455174341797829, 0.03156223148107529, -0.0481184758245945, -0.11374667286872864, -0.09251513332128525, -0.04639662057161331, 0.015080224722623825, 0.024281339719891548, -0.017892230302095413, -0.007641270756721497, 0.13974346220493317, -0.07105651497840881, -0.01723979227244854, -0.04200717434287071, 0.05892442911863327, -0.13975298404693604, -0.039690423756837845, -0.04698871076107025, -0.2089148908853531, -0.005612308159470558, 0.08919178694486618, 0.05226987600326538, 0.05137602239847183, -0.04955087602138519, 0.006729769986122847, 0.0872100368142128, 0.20047244429588318, -0.06771735846996307, 0.0016403411282226443, 0.004685736261308193, 0.012405156157910824, -0.014923355542123318, -0.0247386172413826, -0.05408535152673721, -0.025401758030056953, 0.08835447579622269, 0.1237473115324974, 0.00014161586295813322, -0.01975920796394348, -0.047947198152542114, -0.03437579795718193, 0.11625654250383377, -0.14217649400234222, 0.03709441423416138, 0.052901022136211395, 0.03481140360236168, 0.03828228637576103, 0.04737463966012001, -0.010819591581821442, -0.07191582024097443, 0.05772824212908745, -0.003701755777001381, -0.009700905531644821, -0.04984490945935249, -0.06683140248060226, 0.03271028399467468, -0.0044480194337666035, -0.08070385456085205, -0.04665708914399147, -0.06458276510238647, -0.11443209648132324, -0.03576626628637314, -0.024454738944768906, -0.0146416537463665, -0.03794635087251663, 0.021027645096182823, -0.022900162264704704, -0.045341458171606064, 0.039737001061439514, -0.027580078691244125, 0.023809397593140602, -0.033045489341020584, -0.0017568512121215463, 0.08440306037664413, 0.04526509717106819, -0.04701831936836243, 0.03622988238930702, -0.08026415854692459, 0.15065546333789825, -0.05598016828298569, -0.06153148040175438, -0.07732385396957397, -0.0024680299684405327, 
-0.051769960671663284, 0.042500708252191544, 0.0247153602540493, 0.03863333910703659, -0.28029513359069824, -0.07389592379331589, 0.10730418562889099, -0.11051643639802933, 0.0298802237957716, 0.14239472150802612, -0.005931858904659748, 0.003798498073592782, 0.11293186247348785, 0.13908016681671143, 0.1884826421737671, -0.11754666268825531, -0.07938635349273682, -0.030730770900845528, -0.021921886131167412, 0.10155351459980011, 0.044686347246170044, -0.1154346615076065, -0.000021297455532476306, 0.016443364322185516, -0.0627397671341896, -0.05059358850121498, 0.016106948256492615, -0.02639484778046608, 0.006828549783676863, -0.0021051315125077963, -0.021103855222463608, -0.0037374382372945547, -0.07872675359249115, -0.005715550854802132, -0.07419972121715546, 0.0834464579820633, 0.12865154445171356, -0.09060308337211609, 0.06047835573554039, -0.09476338326931, 0.08959995955228806, -0.08045454323291779, -0.0032734961714595556, -0.18235516548156738, 0.040030963718891144, 0.05703718587756157, -0.10232415795326233, 0.07565657049417496, 0.08432105928659439, -0.007293547037988901, 0.024059101939201355, -0.0502692349255085, -0.010755864903330803, -0.011710438877344131, -0.03301689028739929, -0.0661771297454834, -0.0610470175743103, -0.08533433824777603, -0.028236079961061478, 0.10468028485774994, -0.07554268836975098, -0.02151455357670784, 0.104004867374897, 0.07365503162145615, -0.004751360509544611, -0.05350260064005852, 0.04918917268514633, 0.03542251139879227, -0.010458982549607754, -0.04203328862786293, -0.024409107863903046, 0.019847629591822624, -0.051250383257865906, 0.10736239701509476, -0.20614735782146454, -0.1708686202764511, 0.0366988405585289, 0.09367956221103668, -0.049588363617658615, 0.023564359173178673, -0.020956799387931824, -0.011732528917491436, -0.12802833318710327, -0.0759563148021698, 0.2707977890968323, 0.04204822704195976, 0.10206075757741928, -0.11104422062635422, -0.04343345761299133, -0.02534007653594017, -0.015065385960042477, -0.014860602095723152, 0.06239848956465721, 0.005971982143819332, 0.020601807162165642, 0.00639521237462759, -0.09397415071725845, 0.017835143953561783, 0.20051243901252747, 0.019174646586179733, -0.09266244620084763, -0.038166821002960205, -0.00454338826239109, 0.02895962819457054, 0.04822104051709175, -0.04579244181513786, 0.0072686318308115005, 0.026049703359603882, 0.06181785836815834, 0.06278957426548004, -0.13359244167804718, 0.07873565703630447, 0.039469778537750244, -0.0934673473238945, -0.0537288598716259, 0.006485472898930311, -0.00915870163589716, 0.05302494019269943, -0.02513228915631771, 0.060810547322034836, -0.04469221457839012, -0.060863740742206573, -0.14777255058288574, 0.11814907938241959, -0.09969772398471832, -0.17054565250873566, -0.22262226045131683, 0.07185836881399155, -0.01081108395010233, 0.022282680496573448, 0.05155123397707939, -0.06091918796300888, -0.04308373108506203, -0.07360228151082993, 0.0763816237449646, -0.0030601825565099716, -0.036883991211652756, -0.06521249562501907, -0.007771323435008526, 0.05278422683477402, -0.12854516506195068, 0.02305801399052143, -0.00965027790516615, -0.09402135014533997, -0.0522741936147213, 0.017180252820253372, 0.02921823039650917, 0.18159931898117065, 0.029815547168254852, -0.027645336464047432, -0.021541588008403778, 0.12526221573352814, -0.12479498982429504, 0.03960910066962242, 0.08506809920072556, -0.00968389492481947, 0.01128807570785284, 0.07001572102308273, 0.0074665904976427555, -0.05344627425074577, -0.019798459485173225, 0.07657419145107269, -0.020536702126264572, 
-0.30614331364631653, -0.06783409416675568, -0.04197654873132706, 0.0033420256804674864, 0.006798848044127226, 0.03958670794963837, 0.033419590443372726, -0.021942811086773872, -0.10387034714221954, -0.043470192700624466, 0.05708981305360794, 0.011222286149859428, 0.11528674513101578, -0.018313195556402206, 0.027321571484208107, -0.06770280748605728, 0.02483394742012024, 0.08274632692337036, 0.09943509101867676, 0.10772296041250229, 0.019488893449306488, 0.16376101970672607, 0.07429777830839157, 0.04842822998762131, 0.056360211223363876, 0.030684873461723328, -0.023524297401309013, 0.028607605025172234, 0.014104665257036686, -0.043984223157167435, -0.0044429972767829895, 0.004387622699141502, 0.1320851743221283, -0.03507678955793381, -0.030612733215093613, -0.0545252226293087, 0.04671485349535942, 0.2873092591762543, 0.018892236053943634, -0.09163351356983185, -0.0927497074007988, -0.005792859010398388, -0.10527154058218002, -0.06018320471048355, -0.0073143234476447105, 0.16456881165504456, -0.13912291824817657, 0.0557195246219635, -0.00019470357801765203, 0.06571081280708313, -0.11132234334945679, -0.012964612804353237, 0.00835091806948185, 0.10203444212675095, -0.012197020463645458, 0.09518534690141678, -0.11252234131097794, 0.07189151644706726, 0.021779902279376984, 0.0862765833735466, -0.019207926467061043, 0.03815104812383652, -0.007414350286126137, 0.030250171199440956, 0.07889969646930695, 0.018215369433164597, -0.10494618117809296, -0.05739763006567955, -0.07760472595691681, 0.02126961760222912, 0.03925611451268196, -0.07052303105592728, 0.07212855666875839, -0.03041982278227806, -0.031455911695957184, -0.03390304744243622, -0.0836823433637619, -0.050092704594135284, -0.13951672613620758, 0.05616443604230881, 0.03418375924229622, 0.05618201568722725, -0.053336258977651596, -0.05155821144580841, -0.08648431301116943, 0.11345592141151428, -0.1683247983455658, -0.04760365188121796, -0.06253910064697266, -0.059267137199640274, 0.1679716557264328, -0.059021759778261185, 0.005943978205323219, 0.036017581820487976, 0.19340845942497253, -0.0470772311091423, -0.03205999732017517, 0.00266048195771873, -0.06000898778438568, -0.17966248095035553, -0.02318684756755829, 0.1966818869113922, 0.021364837884902954, 0.08276902884244919, 0.0025583093520253897, 0.020506976172327995, 0.006344794295728207, -0.051880575716495514, 0.020560551434755325, 0.1620432436466217, -0.13026326894760132, 0.07746435701847076, -0.02075849287211895, -0.14419491589069366, -0.1078656017780304, -0.06933432072401047, 0.0753476545214653, 0.08431576192378998, -0.057963717728853226, 0.1285446733236313, 0.19385024905204773, -0.1465252935886383, -0.1608123630285263, -0.03492375835776329, 0.04728094860911369, 0.07796334475278854, 0.0014092657947912812, -0.1934082955121994, 0.034614045172929764, 0.08803965896368027, -0.009287802502512932, -0.03595297038555145, -0.29945918917655945, -0.1510400027036667, 0.042546652257442474, -0.07242653518915176, -0.07682298123836517, -0.017378823831677437, -0.07092099636793137, -0.060127437114715576, 0.018893230706453323, -0.028000161051750183, 0.007134560029953718, 0.11085401475429535, 0.016793156042695045, 0.025855209678411484, 0.05548549070954323, -0.022344985976815224, 0.10588761419057846, 0.04339100793004036, 0.0533122792840004, 0.01999383047223091, 0.012303637340664864, -0.024856945499777794, -0.020816395059227943, 0.10702294856309891, -0.03531166911125183, 0.029544556513428688, -0.03343531861901283, -0.07905875891447067, -0.030755721032619476, 0.03137728571891785, -0.015523700974881649, 
0.030925579369068146, -0.04948481544852257, 0.022213440388441086, 0.017926029860973358, -0.01358791347593069, -0.08306634426116943, -0.1310860514640808, 0.0059615233913064, 0.10912704467773438, 0.19117268919944763, -0.044490523636341095, -0.12911689281463623, -0.01242432277649641, -0.00841507688164711, 0.09413561969995499, -0.022854628041386604, 0.05407397821545601, 0.08605141937732697, 0.026793107390403748, 0.13977551460266113, -0.01491023600101471, -0.18994417786598206, 0.04753268510103226, 0.02537478320300579, -0.05306192487478256, -0.19648946821689606, -0.010900578461587429, 0.025334838777780533, -0.10593809187412262, 0.006372531875967979, 0.11562085151672363, -0.010550271719694138, -0.054588232189416885, 0.01872393861413002, 0.04979651793837547, -0.02469777688384056, 0.16829779744148254, -0.018229974433779716, 0.05820829048752785, -0.06271650642156601, 0.11544337123632431, 0.09529183059930801, -0.033461011946201324, 0.03391459584236145, 0.013654850423336029, -0.0466630719602108, -0.04877438396215439, -0.10296476632356644, 0.10233607888221741, 0.014885311014950275, -0.08832155168056488, -0.048195257782936096, -0.1009727194905281, 0.0027785005513578653, 0.10877630859613419, -0.015117601491510868, 0.09039968997240067, -0.061029549688100815, -0.017072094604372978, -0.08527839928865433, 0.056197699159383774, 0.04594317451119423, 0.04835784062743187, -0.06129664555191994, 0.15678445994853973, 0.03212692216038704, -0.006276881787925959, 0.012776677496731281, -0.058485373854637146, 0.026221588253974915, 0.028162304311990738, -0.07117666304111481, -0.016984976828098297, -0.016104668378829956, 0.0026717849541455507, 0.026981499046087265, 0.004322289489209652, 0.015992967411875725, 0.054383229464292526, -0.043170612305402756, -0.02734648436307907, -0.06594342738389969, 0.07984736561775208, -0.10958611965179443, 0.056791190057992935, 0.0565514974296093, -0.0773930624127388, 0.07446134835481644, 0.039030369371175766, -0.01673361100256443, 0.05122298747301102, -0.12749363481998444, -0.0663495659828186, 0.004327037837356329, 0.0724962130188942, -0.0193951278924942, -0.13680046796798706, 0.0053341747261583805, 0.05254882946610451, -0.015343254432082176, -0.004752405919134617, 0.03659042343497276, -0.09068646281957626, -0.00017365170060656965, 0.005038696341216564, -0.011474521830677986, -0.0669807717204094, 0.054301828145980835, 0.06564363837242126, 0.06768783926963806, 0.1051010861992836, -0.0845566838979721, 0.08439572155475616, -0.1649903506040573, -0.001816552598029375, -0.029831836000084877, 0.024794261902570724, -0.07432208210229874, -0.03631315752863884, 0.0693499892950058, 0.00020691979443654418, 0.10333946347236633, -0.011401650495827198, 0.05871053785085678, 0.003542003221809864, 0.004677007906138897, -0.08404338359832764, 0.0209614597260952, 0.05823313817381859, -0.007759693544358015, 0.0016207611188292503, 0.029744239524006844, -0.031872332096099854, -0.0466960147023201, 0.09327486902475357, 0.015195176936686039, 0.07631916552782059, 0.12542854249477386, 0.03755316510796547, 0.1057039350271225, -0.04981369152665138, -0.06143733859062195, 0.03586734086275101, -0.1525467187166214, 0.036381419748067856, -0.033142704516649246, 0.072243832051754, 0.0905773937702179, -0.14394254982471466, 0.15476401150226593, 0.003237262135371566, -0.057391732931137085, -0.11073461920022964, -0.16582165658473969, -0.05301254242658615, -0.07343130558729172, -0.020547732710838318, -0.09039375185966492, 0.036141689866781235, 0.0504038967192173, -0.004543708637356758, -0.026446769014000893, 0.13965359330177307, 
-0.11662736535072327, -0.0918196588754654, 0.04894561693072319, -0.0039508156478405, 0.07174738496541977, -0.0014339351328089833, 0.042630523443222046, 0.1531689465045929, 0.039052121341228485, 0.07341606169939041, 0.04095275700092316, 0.0841352641582489, 0.05515363812446594, -0.03768580034375191, -0.057221345603466034, 0.015636611729860306, -0.05714847892522812, 0.013555511832237244, 0.13716553151607513, 0.09997876733541489, -0.025196202099323273, 0.010979736223816872, 0.12237942218780518, -0.03183024376630783, -0.05298961326479912, -0.17294739186763763, 0.17647168040275574, -0.001116940169595182, 0.012018660083413124, 0.021000457927584648, -0.09095945209264755, 0.0611429363489151, 0.1548469513654709, 0.09943079948425293, 0.05594845488667488, 0.006950770504772663, 0.014421752654016018, -0.006375137250870466, -0.010159309022128582, 0.017092537134885788, 0.02328510582447052, 0.18223875761032104, 0.02621222287416458, 0.11667004972696304, -0.01808987185359001, -0.032993581146001816, 0.019449178129434586, 0.0875818133354187, -0.13844755291938782, -0.01628625951707363, -0.02370333857834339, 0.10258205235004425, 0.054040633141994476, -0.29232579469680786, -0.017050765454769135, -0.06498132646083832, -0.08415713161230087, -0.000053173302148934454, 0.08351124823093414, 0.038730915635824203, 0.04317111521959305, -0.023525185883045197, -0.0025991389993578196, 0.25455889105796814, -0.007152267266064882, -0.027676451951265335, -0.09924907237291336, 0.04438280314207077, -0.10656405985355377, 0.1642845720052719, 0.026014234870672226, 0.04562622681260109, 0.04944424331188202, 0.051857635378837585, -0.0871177539229393, 0.1110156700015068, 0.003241444006562233, -0.0843077078461647, 0.04751823469996452, 0.15029440820217133, 0.005595056340098381, 0.14732038974761963, 0.02435190975666046, -0.01797095313668251, 0.06578710675239563, -0.10910239815711975, 0.0068321567960083485, -0.09321920573711395, 0.07871699333190918, -0.05479830875992775, 0.11611848324537277, 0.12789025902748108, -0.038565993309020996, -0.030779169872403145, -0.018915565684437752, -0.010122689418494701, -0.007555079646408558, 0.1007361114025116, -0.021150333806872368, -0.15110021829605103, 0.02905348502099514, -0.0855385884642601, 0.04360397160053253, -0.14945046603679657, -0.05436897277832031, 0.09213271737098694, -0.09972630441188812, 0.022227680310606956, 0.07585293054580688, 0.028936592862010002, 0.03569284453988075, -0.03596925362944603, -0.000980354263447225, 0.03689710050821304, 0.09621049463748932, -0.10427583009004593, -0.05727989971637726 ]
null
null
transformers
# Wav2Vec2-Large-XLSR-53-greek

Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Greek using the [Common Voice](https://huggingface.co/datasets/common_voice) and [CSS10](https://github.com/Kyubyong/css10) datasets.
When using this model, make sure that your speech input is sampled at 16kHz.

## Usage

The model can be used directly (without a language model) as follows:

```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

test_dataset = load_dataset("common_voice", "el", split="test")

processor = Wav2Vec2Processor.from_pretrained("PereLluis13/wav2vec2-large-xlsr-53-greek")
model = Wav2Vec2ForCTC.from_pretrained("PereLluis13/wav2vec2-large-xlsr-53-greek")

resampler = torchaudio.transforms.Resample(48_000, 16_000)

# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch

test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)

with torch.no_grad():
    logits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits

predicted_ids = torch.argmax(logits, dim=-1)

print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```

## Evaluation

The model can be evaluated as follows on the Greek test data of Common Voice.

```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re

test_dataset = load_dataset("common_voice", "el", split="test")
wer = load_metric("wer")

processor = Wav2Vec2Processor.from_pretrained("PereLluis13/wav2vec2-large-xlsr-53-greek")
model = Wav2Vec2ForCTC.from_pretrained("PereLluis13/wav2vec2-large-xlsr-53-greek")
model.to("cuda")

chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“\%\‘\”\�]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)

# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
    batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
    speech_array, sampling_rate = torchaudio.load(batch["path"])
    batch["speech"] = resampler(speech_array).squeeze().numpy()
    return batch

test_dataset = test_dataset.map(speech_file_to_array_fn)

# Run batched inference and decode the greedy (argmax) predictions.
def evaluate(batch):
    inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)

    with torch.no_grad():
        logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits

    pred_ids = torch.argmax(logits, dim=-1)
    batch["pred_strings"] = processor.batch_decode(pred_ids)
    return batch

result = test_dataset.map(evaluate, batched=True, batch_size=8)

print("WER: {:.2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```

**Test Result**: 20.89 %

## Training

The Common Voice `train` and `validation` splits, together with the CSS10 dataset, were used for training; CSS10 was added as an `extra` split to the dataset.
The sampling rate and format of the CSS10 files are different, hence the function `speech_file_to_array_fn` was changed to:

```python
import librosa
import soundfile as sf

def speech_file_to_array_fn(batch):
    try:
        # Clips that already have a converted WAV copy are read directly.
        speech_array, sampling_rate = sf.read(batch["path"] + ".wav")
    except:
        # Otherwise, load the original file at 16 kHz and cache a WAV copy for later use.
        speech_array, sampling_rate = librosa.load(batch["path"], sr=16000, res_type='zero_order_hold')
        sf.write(batch["path"] + ".wav", speech_array, sampling_rate, subtype='PCM_24')
    batch["speech"] = speech_array
    batch["sampling_rate"] = sampling_rate
    batch["target_text"] = batch["text"]
    return batch
```

This change was suggested by [Florian Zimmermeister](https://github.com/flozi00).

The script used for training can be found in [run_common_voice.py](examples/research_projects/wav2vec2/run_common_voice.py), still pending as a PR. The only changes are to `speech_file_to_array_fn`. The batch size was kept at 32 (using `gradient_accumulation_steps`) on one of the [OVH](https://www.ovh.com/) machines, with a V100 GPU (thank you very much, [OVH](https://www.ovh.com/)). The model was trained for 40 epochs: the first 20 used the `train+validation` splits, and the `extra` split with the CSS10 data was added at the 20th epoch.
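The card does not show how the CSS10 data was attached as the `extra` split, so the sketch below illustrates one way to do it with the `datasets` library. This is not the original training code: the CSS10 records are stubbed out with a one-row placeholder built via `Dataset.from_dict`, and the shared column names (`path`, `sentence`) are assumptions that mirror the Common Voice schema.

```python
from datasets import Dataset, concatenate_datasets, load_dataset

# Keep only the columns shared between Common Voice and the CSS10 records.
keep = ["path", "sentence"]
cv = load_dataset("common_voice", "el")
cv = cv.remove_columns([c for c in cv["train"].column_names if c not in keep])

# Placeholder for the CSS10 Greek data; in practice this would be built from the
# downloaded CSS10 audio files and their transcripts (column names are assumed).
css10_extra = Dataset.from_dict({
    "path": ["css10_el/clip_0001.wav"],
    "sentence": ["παράδειγμα πρότασης"],
})
cv["extra"] = css10_extra

# First 20 epochs: Common Voice train + validation only.
first_stage = concatenate_datasets([cv["train"], cv["validation"]])

# From the 20th epoch onwards, the CSS10 "extra" split is concatenated as well.
second_stage = concatenate_datasets([cv["train"], cv["validation"], cv["extra"]])
```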
{"language": "el", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice", "CSS10"], "metrics": ["wer"], "model-index": [{"name": "Greek XLSR Wav2Vec2 Large 53 - CV + CSS10", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Common Voice el", "type": "common_voice", "args": "el"}, "metrics": [{"type": "wer", "value": 20.89, "name": "Test WER"}]}]}]}
automatic-speech-recognition
PereLluis13/wav2vec2-large-xlsr-53-greek
[ "transformers", "pytorch", "jax", "wav2vec2", "automatic-speech-recognition", "audio", "speech", "xlsr-fine-tuning-week", "el", "dataset:common_voice", "dataset:CSS10", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "el" ]
TAGS #transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #el #dataset-common_voice #dataset-CSS10 #license-apache-2.0 #model-index #endpoints_compatible #region-us
# Wav2Vec2-Large-XLSR-53-greek Fine-tuned facebook/wav2vec2-large-xlsr-53 on greek using the Common Voice and CSS10 datasets. When using this model, make sure that your speech input is sampled at 16kHz. ## Usage The model can be used directly (without a language model) as follows: ## Evaluation The model can be evaluated as follows on the greek test data of Common Voice. Test Result: 20.89 % ## Training The Common Voice 'train', 'validation', and CSS10 datasets were used for training, added as 'extra' split to the dataset. The sampling rate and format of the CSS10 files is different, hence the function 'speech_file_to_array_fn' was changed to: As suggested by Florian Zimmermeister. The script used for training can be found in run_common_voice.py, still pending of PR. The only changes are to 'speech_file_to_array_fn'. Batch size was kept at 32 (using 'gradient_accumulation_steps') using one of the OVH machines, with a V100 GPU (thank you very much OVH). The model trained for 40 epochs, the first 20 with the 'train+validation' splits, and then 'extra' split was added with the data from CSS10 at the 20th epoch.
[ "# Wav2Vec2-Large-XLSR-53-greek\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on greek using the Common Voice and CSS10 datasets.\nWhen using this model, make sure that your speech input is sampled at 16kHz.", "## Usage\n\nThe model can be used directly (without a language model) as follows:", "## Evaluation\n\nThe model can be evaluated as follows on the greek test data of Common Voice. \n\n\n\nTest Result: 20.89 %", "## Training\n\nThe Common Voice 'train', 'validation', and CSS10 datasets were used for training, added as 'extra' split to the dataset. The sampling rate and format of the CSS10 files is different, hence the function 'speech_file_to_array_fn' was changed to:\n \n\nAs suggested by Florian Zimmermeister.\n\nThe script used for training can be found in run_common_voice.py, still pending of PR. The only changes are to 'speech_file_to_array_fn'. Batch size was kept at 32 (using 'gradient_accumulation_steps') using one of the OVH machines, with a V100 GPU (thank you very much OVH). The model trained for 40 epochs, the first 20 with the 'train+validation' splits, and then 'extra' split was added with the data from CSS10 at the 20th epoch." ]
[ "TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #el #dataset-common_voice #dataset-CSS10 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n", "# Wav2Vec2-Large-XLSR-53-greek\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on greek using the Common Voice and CSS10 datasets.\nWhen using this model, make sure that your speech input is sampled at 16kHz.", "## Usage\n\nThe model can be used directly (without a language model) as follows:", "## Evaluation\n\nThe model can be evaluated as follows on the greek test data of Common Voice. \n\n\n\nTest Result: 20.89 %", "## Training\n\nThe Common Voice 'train', 'validation', and CSS10 datasets were used for training, added as 'extra' split to the dataset. The sampling rate and format of the CSS10 files is different, hence the function 'speech_file_to_array_fn' was changed to:\n \n\nAs suggested by Florian Zimmermeister.\n\nThe script used for training can be found in run_common_voice.py, still pending of PR. The only changes are to 'speech_file_to_array_fn'. Batch size was kept at 32 (using 'gradient_accumulation_steps') using one of the OVH machines, with a V100 GPU (thank you very much OVH). The model trained for 40 epochs, the first 20 with the 'train+validation' splits, and then 'extra' split was added with the data from CSS10 at the 20th epoch." ]
[ 86, 69, 20, 28, 221 ]
[ "passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #el #dataset-common_voice #dataset-CSS10 #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-greek\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on greek using the Common Voice and CSS10 datasets.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the greek test data of Common Voice. \n\n\n\nTest Result: 20.89 %## Training\n\nThe Common Voice 'train', 'validation', and CSS10 datasets were used for training, added as 'extra' split to the dataset. The sampling rate and format of the CSS10 files is different, hence the function 'speech_file_to_array_fn' was changed to:\n \n\nAs suggested by Florian Zimmermeister.\n\nThe script used for training can be found in run_common_voice.py, still pending of PR. The only changes are to 'speech_file_to_array_fn'. Batch size was kept at 32 (using 'gradient_accumulation_steps') using one of the OVH machines, with a V100 GPU (thank you very much OVH). The model trained for 40 epochs, the first 20 with the 'train+validation' splits, and then 'extra' split was added with the data from CSS10 at the 20th epoch." ]
[ -0.14027132093906403, 0.044699110090732574, -0.0032655098475515842, 0.02266632206737995, 0.12088022381067276, -0.006748388521373272, 0.12982778251171112, 0.06362845003604889, -0.05631123483181, 0.05692042410373688, -0.013782304711639881, -0.005541657097637653, 0.09737831354141235, 0.18117642402648926, 0.0041785407811403275, -0.15612739324569702, 0.04971257597208023, -0.01786847971379757, 0.01758646033704281, 0.07240177690982819, 0.08248837292194366, -0.11568738520145416, 0.04670443758368492, 0.06535132229328156, -0.07261136919260025, 0.000442543881945312, 0.03563297912478447, -0.07581949979066849, 0.10444130748510361, 0.10772953182458878, 0.03794450685381889, 0.06298917531967163, 0.10782398283481598, -0.19035285711288452, 0.027360327541828156, 0.08013199269771576, 0.009726845659315586, 0.05396944284439087, 0.0852038562297821, 0.0217686016112566, 0.08406975120306015, 0.028041083365678787, 0.01437822263687849, 0.07376278936862946, -0.07489250600337982, -0.15195998549461365, -0.11906243860721588, -0.0065723443403840065, 0.07069510221481323, 0.11634408682584763, -0.05565204471349716, 0.028856756165623665, -0.0721929743885994, 0.07489671558141708, 0.12770500779151917, -0.1729673147201538, 0.018519967794418335, 0.01438292395323515, 0.03740951418876648, 0.011346393264830112, -0.09685903787612915, -0.004713986534625292, 0.014341852627694607, 0.0023032999597489834, 0.05456719174981117, -0.01912260428071022, -0.11993850767612457, -0.013609667308628559, -0.11594291776418686, -0.037166014313697815, 0.2156706601381302, 0.008679677732288837, -0.09679285436868668, -0.14287656545639038, -0.03381776437163353, -0.07726237922906876, -0.03269152715802193, -0.029844677075743675, 0.0017051785252988338, 0.0312490351498127, 0.006259776186197996, -0.05786711350083351, -0.1142214685678482, -0.05585572496056557, -0.025300906971096992, 0.05205536261200905, 0.0034701579716056585, 0.01839390955865383, -0.07724892348051071, 0.1189466267824173, -0.05977405235171318, -0.06567034870386124, -0.04550807178020477, -0.02229091338813305, -0.1240314245223999, -0.02955407090485096, -0.09170279651880264, -0.14864397048950195, -0.004802546929568052, 0.2002556025981903, 0.07147546112537384, 0.0492328442633152, -0.062086205929517746, -0.0081844013184309, 0.029874293133616447, 0.14174191653728485, -0.05025963485240936, -0.008950122632086277, 0.03220762312412262, 0.07763455808162689, -0.06571803241968155, -0.07280422002077103, -0.052602287381887436, -0.020716024562716484, 0.0521184504032135, 0.041736554354429245, 0.00659010699018836, 0.07776396721601486, -0.027687227353453636, -0.03954660892486572, 0.12603989243507385, -0.1728053241968155, -0.0072316923178732395, 0.07734570652246475, -0.041788216680288315, 0.07117956131696701, 0.0808519795536995, 0.02157342992722988, -0.06914062798023224, 0.07983406633138657, -0.041228003799915314, 0.009262092411518097, -0.0650651752948761, -0.06283241510391235, 0.011983443982899189, -0.03471604362130165, -0.033850543200969696, -0.07023399323225021, -0.10261282324790955, -0.07261913269758224, -0.023906318470835686, 0.05109279975295067, 0.06595263630151749, -0.05347617715597153, -0.02680162899196148, -0.0044042267836630344, -0.028439510613679886, 0.09354259818792343, -0.04023495316505432, 0.03809268772602081, 0.014905220828950405, 0.014295984990894794, 0.004103948827832937, 0.05559047311544418, -0.07891378551721573, -0.005416239611804485, -0.022611262276768684, 0.1265804022550583, 0.018650716170668602, -0.09081857651472092, -0.016841251403093338, -0.03316317871212959, -0.0806201919913292, 
0.017167383804917336, 0.02179701067507267, 0.10580641031265259, -0.2786247134208679, -0.059357840567827225, 0.1855422556400299, -0.12687164545059204, -0.0264910701662302, 0.10257157683372498, -0.0336633175611496, -0.05747951194643974, 0.09309475123882294, 0.11460217088460922, 0.1506192535161972, -0.1646910160779953, -0.039321448653936386, -0.03564110025763512, -0.02750297635793686, 0.07000542432069778, 0.09427281469106674, -0.049747616052627563, 0.08396382629871368, 0.05671257898211479, -0.08971428126096725, 0.024740399792790413, -0.010529911145567894, -0.03244399279356003, -0.0018076744163408875, -0.015578006394207478, 0.046769898384809494, -0.020386632531881332, -0.058478184044361115, 0.014910317026078701, -0.05760607868432999, -0.01491967961192131, 0.12358956784009933, -0.1382770985364914, 0.04428665339946747, -0.13452363014221191, 0.08356136828660965, 0.024858765304088593, 0.03263690695166588, -0.21768690645694733, -0.013302321545779705, 0.004374042619019747, -0.009085850790143013, 0.02524031698703766, 0.13688085973262787, -0.003182600485160947, 0.052379705011844635, -0.015304329805076122, 0.004113589879125357, -0.05605287849903107, -0.055539727210998535, -0.015304253436625004, -0.05453495308756828, -0.08981066197156906, -0.05627685785293579, 0.15971601009368896, -0.1717475950717926, 0.030459508299827576, 0.055866044014692307, 0.05096098780632019, 0.02210339903831482, -0.07778900861740112, -0.027024099603295326, -0.001194332609884441, -0.03386055305600166, -0.03236675634980202, -0.019184347242116928, 0.04408856853842735, -0.018484201282262802, 0.015061247162520885, -0.24643148481845856, -0.09581267833709717, 0.07382896542549133, 0.05911676213145256, -0.026309331879019737, -0.050455328077077866, -0.036287203431129456, 0.006690740119665861, -0.0864182561635971, -0.06005933880805969, 0.2137359082698822, 0.049574434757232666, 0.14475052058696747, -0.10130640864372253, -0.023450355976819992, 0.028382642194628716, -0.02814752236008644, 0.0029149146284908056, -0.016743168234825134, -0.02004799246788025, 0.032207783311605453, 0.027006974443793297, -0.030926154926419258, 0.05982248857617378, 0.21457235515117645, -0.03341667354106903, -0.0922282338142395, -0.019261671230196953, -0.02580111473798752, -0.023327909409999847, 0.03351956605911255, -0.04867167770862579, 0.0059666880406439304, 0.0626002848148346, 0.04098755121231079, 0.03585760295391083, -0.11645323783159256, 0.04668852686882019, 0.04103007912635803, -0.09645898640155792, -0.05414022505283356, 0.03930271416902542, 0.0032099841628223658, 0.04721546545624733, -0.05047379434108734, 0.04966156557202339, -0.01660754904150963, -0.05142198130488396, -0.15367616713047028, 0.1277507096529007, -0.1183023452758789, -0.2240811288356781, -0.19000791013240814, 0.007566559128463268, -0.06310145556926727, 0.05590175837278366, 0.029356414452195168, -0.02931222692131996, -0.03468264639377594, -0.08682726323604584, 0.07688518613576889, 0.007304739207029343, -0.030888592824339867, 0.02486628107726574, 0.018132051452994347, 0.05060911178588867, -0.10895407944917679, -0.00865194108337164, -0.03998349979519844, -0.035739678889513016, -0.009314224123954773, 0.05715937167406082, 0.04701618105173111, 0.1273116171360016, -0.005821474362164736, 0.005540447775274515, 0.005775392521172762, 0.1624123454093933, -0.07150711864233017, 0.035871438682079315, 0.2822582423686981, -0.065650075674057, 0.025650091469287872, 0.032010387629270554, -0.0119866244494915, -0.056341517716646194, 0.0025077376049011946, 0.017533021047711372, -0.05983218550682068, -0.22668591141700745, 
-0.07756071537733078, -0.05712054297327995, -0.03116910718381405, 0.047106992453336716, -0.014812196604907513, 0.05198957026004791, 0.03345933556556702, -0.09095024317502975, -0.03154711425304413, 0.087398961186409, 0.003087948774918914, 0.1086764857172966, 0.015516867861151695, 0.11420036107301712, -0.018676018342375755, 0.0186258926987648, 0.04871983453631401, -0.042042069137096405, 0.13589352369308472, 0.007834410294890404, 0.05556309223175049, 0.0491902157664299, 0.10971913486719131, 0.03620276227593422, 0.10858876258134842, 0.007466346025466919, -0.028707372024655342, 0.018945351243019104, -0.0683254599571228, -0.0940294936299324, 0.014547113329172134, 0.053751297295093536, -0.055045437067747116, -0.11762803792953491, 0.041734497994184494, 0.010639773681759834, 0.16408133506774902, 0.013128346763551235, -0.22416339814662933, -0.07370002567768097, -0.06838114559650421, -0.005660890135914087, -0.05562445521354675, -0.02475839853286743, 0.1440531313419342, -0.1394948810338974, 0.050362154841423035, -0.060484349727630615, 0.07268557697534561, -0.06788971275091171, 0.04268193989992142, 0.025710683315992355, 0.09284014254808426, 0.014343506656587124, 0.08231902867555618, -0.15899990499019623, 0.042568858712911606, 0.029022766277194023, 0.09923049807548523, -0.0796521008014679, 0.05591168999671936, -0.005476966965943575, 0.07938428223133087, 0.09483949840068817, -0.012614555656909943, -0.10359004139900208, -0.15735571086406708, -0.029846925288438797, 0.019251737743616104, 0.1223897635936737, -0.05398472025990486, 0.08175023645162582, -0.04884347319602966, 0.02035684511065483, -0.006933168973773718, 0.010201097466051579, -0.11748089641332626, -0.18646240234375, 0.08414212614297867, 0.054891981184482574, 0.10949740558862686, -0.059173934161663055, -0.038553714752197266, -0.13569658994674683, 0.1061951071023941, -0.09444327652454376, -0.008390172384679317, -0.12784282863140106, 0.0887083187699318, 0.12668035924434662, -0.07395152747631073, 0.04247688129544258, 0.0070909918285906315, 0.18816030025482178, -0.05018424242734909, -0.10781467705965042, -0.010648096911609173, -0.07868438959121704, -0.17078694701194763, -0.02076535113155842, 0.1671539694070816, 0.07778077572584152, 0.007937886752188206, 0.054960064589977264, -0.0010489484993740916, 0.032203830778598785, -0.0823739692568779, -0.04199962317943573, 0.08838579803705215, 0.04697370156645775, 0.01881074160337448, -0.08457738161087036, -0.14771312475204468, -0.09086383879184723, -0.0033170836977660656, 0.0872141495347023, 0.1613796502351761, -0.061693571507930756, 0.11696228384971619, 0.11379146575927734, -0.06016894429922104, -0.18123255670070648, -0.028360234573483467, 0.07889774441719055, 0.0654710903763771, 0.06669552624225616, -0.1979193240404129, 0.049169838428497314, 0.04628901183605194, -0.0015674952883273363, 0.009181605651974678, -0.25000515580177307, -0.11576730012893677, 0.06601843982934952, -0.06646248698234558, -0.030143337324261665, -0.013033274561166763, -0.015316330827772617, -0.05913056805729866, -0.049662765115499496, 0.06605908274650574, -0.07481761276721954, 0.09336856752634048, 0.03276817128062248, -0.01235164888203144, 0.06857804208993912, -0.04505576193332672, 0.08583315461874008, 0.0736403688788414, 0.06113773584365845, -0.023823797702789307, 0.02847040630877018, 0.08244244009256363, -0.030088499188423157, 0.12758859992027283, -0.055205248296260834, 0.06687366217374802, -0.03477802127599716, -0.07606080919504166, -0.039710450917482376, 0.07565530389547348, -0.004523043520748615, -0.014115545898675919, 
-0.04103110358119011, 0.0011573723750188947, 0.04851004108786583, -0.005554022267460823, -0.036431510001420975, -0.0810120552778244, -0.017863178625702858, 0.1601644605398178, 0.10430198162794113, -0.016440536826848984, -0.06843477487564087, -0.022240057587623596, 0.03385980427265167, 0.05238699913024902, -0.0495896190404892, 0.08631787449121475, 0.08369605243206024, -0.002380346180871129, 0.15758448839187622, 0.05106968432664871, -0.11514420062303543, 0.009866738691926003, 0.045338068157434464, -0.11933863162994385, -0.07678253948688507, -0.04370661452412605, -0.048689015209674835, -0.14615604281425476, 0.02552434243261814, 0.1343681514263153, 0.018162785097956657, 0.006045926827937365, -0.023115631192922592, 0.015609286725521088, -0.04139208793640137, 0.15567852556705475, -0.002263367176055908, 0.05708548426628113, -0.0889081135392189, 0.07658122479915619, 0.026545729488134384, -0.042445577681064606, 0.020642712712287903, -0.05396732687950134, -0.1352359801530838, -0.05215120315551758, -0.06140928715467453, 0.10675344616174698, 0.013056071475148201, -0.07900426536798477, -0.04093801975250244, -0.0652805045247078, 0.01739838905632496, 0.02069016918540001, -0.0027621625922620296, 0.07275491207838058, -0.08630052208900452, -0.035713400691747665, -0.10302233695983887, 0.05197892710566521, 0.05976276472210884, -0.0038896414916962385, -0.06459908932447433, 0.2130020260810852, 0.029818523675203323, 0.05996187776327133, -0.016385089606046677, -0.050026655197143555, 0.015617724508047104, 0.008762755431234837, 0.004256040323525667, -0.03381962701678276, -0.04395541176199913, -0.027250710874795914, 0.0020993202924728394, 0.008096611127257347, 0.02926623262465, 0.04724719375371933, -0.05512649193406105, -0.03195035457611084, -0.05950731784105301, 0.053832363337278366, -0.12544475495815277, 0.025695838034152985, 0.019586794078350067, -0.09585945308208466, 0.08186110109090805, 0.082745261490345, -0.03156576305627823, 0.09320041537284851, -0.14650893211364746, -0.05480799078941345, 0.018492722883820534, 0.09364880621433258, -0.04430043324828148, -0.04300356283783913, 0.034371063113212585, 0.058657173067331314, -0.005592904984951019, -0.026645632460713387, 0.052395597100257874, -0.10834044963121414, -0.017264850437641144, -0.011694633401930332, 0.021291211247444153, -0.05845048278570175, 0.05775165930390358, 0.03844961151480675, 0.11916022002696991, 0.16774965822696686, -0.04046230763196945, 0.07359829545021057, -0.182401642203331, -0.01655612699687481, 0.0074966601096093655, -0.0024282855447381735, 0.008459814824163914, -0.08636577427387238, 0.07184974104166031, -0.016242874786257744, 0.11551515758037567, 0.02463177591562271, 0.10829424858093262, -0.021673081442713737, -0.0038501292001456022, -0.0033908726181834936, 0.029895247891545296, 0.17157083749771118, 0.044088199734687805, -0.007425934541970491, 0.04804768040776253, -0.0035349864047020674, 0.014373007230460644, 0.07650429010391235, 0.10444768518209457, 0.09349051117897034, -0.003580172546207905, 0.05890108644962311, 0.048799067735672, -0.0762074738740921, -0.038573142141103745, -0.015582331456243992, -0.15962433815002441, 0.037840213626623154, -0.059019651263952255, 0.11947256326675415, 0.0580216608941555, -0.1314820945262909, 0.10863886028528214, 0.05128437280654907, -0.07612746208906174, -0.11795482784509659, -0.1749771535396576, -0.032659366726875305, -0.0568845272064209, 0.031200483441352844, -0.1017097681760788, 0.07127432525157928, 0.07597874104976654, -0.01567481830716133, -0.04904162511229515, 0.1346626877784729, -0.12151776254177094, 
-0.12938013672828674, 0.045023128390312195, -0.022585313767194748, 0.008349980227649212, 0.024763161316514015, -0.020651619881391525, 0.1654195338487625, 0.0041209626942873, 0.09550993889570236, 0.012593800202012062, 0.10341738164424896, 0.0780746266245842, -0.06478799134492874, -0.05658048763871193, 0.010273034684360027, -0.047331634908914566, 0.05060499534010887, 0.11212596297264099, 0.06312531977891922, -0.0293108019977808, -0.024089667946100235, 0.13372687995433807, -0.012384504079818726, -0.09766942262649536, -0.17912210524082184, 0.003908939193934202, 0.08295200765132904, 0.05006648600101471, -0.0216898825019598, -0.09321451187133789, 0.0210184957832098, 0.22401054203510284, 0.13198040425777435, 0.00026723017799668014, 0.019466597586870193, 0.013510770164430141, -0.00520352041348815, -0.010896172374486923, 0.06980884820222855, 0.01115146279335022, 0.06491032987833023, 0.04096709191799164, 0.08869556337594986, -0.024581218138337135, -0.05027877911925316, 0.030514448881149292, 0.1148187667131424, -0.12246344238519669, -0.019998451694846153, -0.02407926693558693, 0.03866063803434372, 0.06867726147174835, -0.19931811094284058, -0.017358990386128426, 0.00818311795592308, -0.047874223440885544, 0.005780596286058426, -0.007885775528848171, 0.04790960252285004, 0.03867513686418533, -0.011291508562862873, 0.007911059074103832, 0.20490662753582, -0.02125531993806362, -0.040080174803733826, -0.04570532217621803, 0.03505107760429382, -0.12147784233093262, 0.08669498562812805, -0.013820305466651917, 0.09954256564378738, 0.03185826167464256, 0.10622725635766983, -0.047447096556425095, 0.09629126638174057, -0.0202319398522377, -0.00530749186873436, 0.04496469348669052, 0.10265173017978668, 0.00644223066046834, 0.04800714924931526, 0.004389693029224873, -0.10585450381040573, 0.08828923851251602, -0.09559904038906097, -0.01604454405605793, -0.043963756412267685, 0.06300701946020126, -0.033451180905103683, 0.14788267016410828, 0.09797677397727966, -0.018505210056900978, -0.018550526350736618, -0.03608746454119682, 0.0367988757789135, 0.024222787469625473, 0.06354635953903198, -0.04111205413937569, -0.18937592208385468, -0.005977048072963953, -0.03708475083112717, -0.011449465528130531, -0.1504049301147461, -0.018337223678827286, 0.012943802401423454, -0.10796516388654709, 0.017378658056259155, 0.1007799506187439, 0.022142605856060982, 0.013958528637886047, -0.04183470085263252, -0.0030270228162407875, 0.057023271918296814, 0.10492580384016037, -0.1432112604379654, -0.08315230160951614 ]
null
null
transformers
# wav2vec2-xls-r-1b-ca-lm

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - CA dataset, together with the [tv3_parla](https://huggingface.co/datasets/collectivat/tv3_parla) and [parlament_parla](https://huggingface.co/datasets/projecte-aina/parlament_parla) datasets.

## Model description

Please check the original [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) model card; this is just a fine-tuned version of that model.

## Intended uses & limitations

As with any model trained on crowdsourced data, this model can reflect the biases and particularities of the data and model used to train it. Moreover, since this is a speech recognition model, it may underperform for some lower-resourced dialects of the Catalan language.

## Training and evaluation data

## Training procedure

The data is preprocessed to remove characters not in the Catalan alphabet. Moreover, numbers are verbalized using code provided by [@ccoreilly](https://github.com/ccoreilly), which can be found in the text/ folder or [here](https://github.com/CollectivaT-dev/catotron-cpu/blob/master/text/numbers_ca.py).

### Training results

Check the TensorBoard tab to see the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training.

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 10.0
- mixed_precision_training: Native AMP

### Framework versions

- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0

# Thanks

We want to thank both [@ccoreilly](https://github.com/ccoreilly) and [@gullabi](https://github.com/gullabi), who have contributed their own resources and knowledge to making this model possible.
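The card above does not include a usage snippet, so the following is a minimal inference sketch rather than code from the original repository. It assumes the checkpoint ships the pyctcdecode/KenLM files expected by `Wav2Vec2ProcessorWithLM` (the `-lm` suffix suggests it does); if they are missing, the plain `Wav2Vec2Processor` with greedy argmax decoding can be used instead. The audio path is a placeholder and the clip is assumed to be mono.

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2ProcessorWithLM

model_id = "PereLluis13/wav2vec2-xls-r-1b-ca-lm"
processor = Wav2Vec2ProcessorWithLM.from_pretrained(model_id)  # assumes LM files are present in the repo
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load a local recording (placeholder path) and resample to the 16 kHz the model expects.
speech, sampling_rate = torchaudio.load("audio_ca.wav")
speech = torchaudio.functional.resample(speech, sampling_rate, 16_000).squeeze().numpy()

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# The LM-backed processor runs beam-search decoding with the n-gram LM over the logits.
transcription = processor.batch_decode(logits.numpy()).text[0]
print(transcription)
```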
{"language": ["ca"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "collectivat/tv3_parla", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_8_0", "projecte-aina/parlament_parla", "robust-speech-event"], "datasets": ["mozilla-foundation/common_voice_8_0", "collectivat/tv3_parla", "projecte-aina/parlament_parla"], "model-index": [{"name": "wav2vec2-xls-r-1b-ca-lm", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "mozilla-foundation/common_voice_8_0 ca", "type": "mozilla-foundation/common_voice_8_0", "args": "ca"}, "metrics": [{"type": "wer", "value": 6.072266995813065, "name": "Test WER"}, {"type": "cer", "value": 1.9180697705166525, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "projecte-aina/parlament_parla ca", "type": "projecte-aina/parlament_parla", "args": "clean"}, "metrics": [{"type": "wer", "value": 5.139820371024042, "name": "Test WER"}, {"type": "cer", "value": 2.0163620128164723, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "collectivat/tv3_parla ca", "type": "collectivat/tv3_parla", "args": "ca"}, "metrics": [{"type": "wer", "value": 11.207991684952074, "name": "Test WER"}, {"type": "cer", "value": 7.32119307305963, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Catalan Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "ca"}, "metrics": [{"type": "wer", "value": 22.870153690468662, "name": "Test WER"}, {"type": "cer", "value": 13.59039190897598, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "ca"}, "metrics": [{"type": "wer", "value": 15.41, "name": "Test WER"}]}]}]}
automatic-speech-recognition
PereLluis13/wav2vec2-xls-r-1b-ca-lm
[ "transformers", "pytorch", "tensorboard", "wav2vec2", "automatic-speech-recognition", "collectivat/tv3_parla", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_8_0", "projecte-aina/parlament_parla", "robust-speech-event", "ca", "dataset:mozilla-foundation/common_voice_8_0", "dataset:collectivat/tv3_parla", "dataset:projecte-aina/parlament_parla", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "ca" ]
TAGS #transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #collectivat/tv3_parla #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #projecte-aina/parlament_parla #robust-speech-event #ca #dataset-mozilla-foundation/common_voice_8_0 #dataset-collectivat/tv3_parla #dataset-projecte-aina/parlament_parla #license-apache-2.0 #model-index #endpoints_compatible #region-us
# wav2vec2-xls-r-1b-ca-lm This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - CA, the tv3_parla and parlament_parla datasets. ## Model description Please check the original facebook/wav2vec2-xls-r-1b Model card. This is just a finetuned version of that model. ## Intended uses & limitations As any model trained on crowdsourced data, this model can show the biases and particularities of the data and model used to train this model. Moreover, since this is a speech recognition model, it may underperform for some lower-resourced dialects for the catalan language. ## Training and evaluation data ## Training procedure The data is preprocessed to remove characters not on the catalan alphabet. Moreover, numbers are verbalized using code provided by @ccoreilly, which can be found on the text/ folder or here. ### Training results Check the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training. ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 8 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 2000 - num_epochs: 10.0 - mixed_precision_training: Native AMP ### Framework versions - Transformers 4.17.0.dev0 - Pytorch 1.10.2+cu102 - Datasets 1.18.3 - Tokenizers 0.11.0 # Thanks Want to thank both @ccoreilly and @gullabi who have contributed with their own resources and knowledge into making this model possible.
[ "# wav2vec2-xls-r-1b-ca-lm\n\nThis model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - CA, the tv3_parla and parlament_parla datasets.", "## Model description\n\nPlease check the original facebook/wav2vec2-xls-r-1b Model card. This is just a finetuned version of that model.", "## Intended uses & limitations\n\nAs any model trained on crowdsourced data, this model can show the biases and particularities of the data and model used to train this model. Moreover, since this is a speech recognition model, it may underperform for some lower-resourced dialects for the catalan language.", "## Training and evaluation data", "## Training procedure\n\nThe data is preprocessed to remove characters not on the catalan alphabet. Moreover, numbers are verbalized using code provided by @ccoreilly, which can be found on the text/ folder or here.", "### Training results\n\nCheck the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training.", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 2000\n- num_epochs: 10.0\n- mixed_precision_training: Native AMP", "### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.18.3\n- Tokenizers 0.11.0", "# Thanks\n\nWant to thank both @ccoreilly and @gullabi who have contributed with their own resources and knowledge into making this model possible." ]
[ "TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #collectivat/tv3_parla #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #projecte-aina/parlament_parla #robust-speech-event #ca #dataset-mozilla-foundation/common_voice_8_0 #dataset-collectivat/tv3_parla #dataset-projecte-aina/parlament_parla #license-apache-2.0 #model-index #endpoints_compatible #region-us \n", "# wav2vec2-xls-r-1b-ca-lm\n\nThis model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - CA, the tv3_parla and parlament_parla datasets.", "## Model description\n\nPlease check the original facebook/wav2vec2-xls-r-1b Model card. This is just a finetuned version of that model.", "## Intended uses & limitations\n\nAs any model trained on crowdsourced data, this model can show the biases and particularities of the data and model used to train this model. Moreover, since this is a speech recognition model, it may underperform for some lower-resourced dialects for the catalan language.", "## Training and evaluation data", "## Training procedure\n\nThe data is preprocessed to remove characters not on the catalan alphabet. Moreover, numbers are verbalized using code provided by @ccoreilly, which can be found on the text/ folder or here.", "### Training results\n\nCheck the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training.", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 2000\n- num_epochs: 10.0\n- mixed_precision_training: Native AMP", "### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.18.3\n- Tokenizers 0.11.0", "# Thanks\n\nWant to thank both @ccoreilly and @gullabi who have contributed with their own resources and knowledge into making this model possible." ]
[ 161, 76, 35, 73, 5, 48, 42, 142, 38, 30 ]
[ "passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #collectivat/tv3_parla #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #projecte-aina/parlament_parla #robust-speech-event #ca #dataset-mozilla-foundation/common_voice_8_0 #dataset-collectivat/tv3_parla #dataset-projecte-aina/parlament_parla #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# wav2vec2-xls-r-1b-ca-lm\n\nThis model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - CA, the tv3_parla and parlament_parla datasets.## Model description\n\nPlease check the original facebook/wav2vec2-xls-r-1b Model card. This is just a finetuned version of that model.## Intended uses & limitations\n\nAs any model trained on crowdsourced data, this model can show the biases and particularities of the data and model used to train this model. Moreover, since this is a speech recognition model, it may underperform for some lower-resourced dialects for the catalan language.## Training and evaluation data## Training procedure\n\nThe data is preprocessed to remove characters not on the catalan alphabet. Moreover, numbers are verbalized using code provided by @ccoreilly, which can be found on the text/ folder or here.### Training results\n\nCheck the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training." ]
[ -0.1081007719039917, 0.2297286093235016, -0.004889249801635742, 0.04178765416145325, 0.08236654847860336, -0.030856266617774963, 0.08974505215883255, 0.10244735330343246, -0.040440868586301804, 0.10602381080389023, -0.03761132061481476, 0.008470072411000729, 0.09485920518636703, 0.08141308277845383, 0.042462632060050964, -0.1573040783405304, 0.04378592222929001, -0.04978107288479805, 0.09736783802509308, 0.09232482314109802, 0.0903247594833374, -0.0936928391456604, 0.04867416247725487, 0.04140601307153702, -0.02815086394548416, 0.020315084606409073, 0.023577429354190826, -0.1196194440126419, 0.0823102742433548, 0.06682363897562027, 0.03680912405252457, 0.02990161068737507, 0.05106833949685097, -0.16092753410339355, 0.032491330057382584, 0.10926496237516403, 0.0024988180957734585, 0.053869277238845825, 0.15221689641475677, -0.06149394437670708, 0.11330079287290573, -0.00906057097017765, 0.0614641010761261, 0.08375993371009827, -0.10734336078166962, -0.13868306577205658, -0.11895550787448883, 0.02906988561153412, 0.07738478481769562, 0.11255655437707901, -0.06604595482349396, 0.08409052342176437, -0.04264490306377411, 0.0362226739525795, 0.17489126324653625, -0.1615660935640335, -0.032318975776433945, -0.05266464874148369, -0.004103704821318388, 0.004486084450036287, -0.06719718873500824, 0.059223663061857224, 0.024851398542523384, -0.004803601186722517, 0.03532496839761734, -0.0372224859893322, -0.04227432981133461, -0.06995514035224915, -0.11127063632011414, -0.04762302711606026, 0.06482328474521637, 0.011823528446257114, -0.05151816084980965, -0.1627388447523117, 0.0010435060830786824, 0.06409548223018646, 0.005024792160838842, 0.027034875005483627, -0.00044060961226932704, -0.009738459251821041, -0.013399937190115452, -0.062779501080513, -0.0923391580581665, -0.06440234184265137, -0.07142308354377747, 0.07928461581468582, 0.013510314747691154, 0.021651385352015495, -0.049764398485422134, 0.09408637881278992, 0.026914989575743675, -0.0881769061088562, -0.0021729054860770702, -0.010738849639892578, -0.14735059440135956, -0.06079582870006561, -0.010835637338459492, -0.138358011841774, 0.02092663384974003, 0.14580579102039337, -0.009005758911371231, 0.048998989164829254, -0.07568663358688354, 0.009029144421219826, 0.09455818682909012, 0.1270158290863037, -0.11595293134450912, 0.0030763898976147175, -0.00037601435906253755, 0.02460525743663311, -0.006106202956289053, -0.012831645086407661, -0.017312509939074516, -0.024159202352166176, 0.06407823413610458, 0.09023409336805344, 0.09948155283927917, -0.0005543786683119833, -0.08791830390691757, -0.04230337217450142, 0.08037468791007996, -0.14962229132652283, 0.054431624710559845, 0.06559917330741882, -0.04282468929886818, 0.07131759822368622, -0.015528964810073376, -0.00318882311694324, -0.08438684046268463, -0.01532053854316473, -0.045763831585645676, 0.012270384468138218, -0.037056535482406616, -0.057449135929346085, 0.04372907057404518, -0.11046291142702103, -0.06623081862926483, -0.10972598940134048, -0.10683828592300415, -0.10609053075313568, -0.005904431454837322, -0.09205029904842377, 0.04157624393701553, -0.04161159694194794, 0.02201182022690773, -0.02231971174478531, -0.00855441577732563, 0.09482963383197784, -0.033081624656915665, 0.03575929254293442, -0.018069008365273476, 0.04527539759874344, 0.08344193547964096, 0.05445137247443199, -0.07018478214740753, 0.03750582039356232, -0.14159651100635529, 0.14497430622577667, -0.07392066717147827, -0.039879895746707916, -0.12749959528446198, 0.004986278712749481, -0.004286237992346287, 
0.01988709717988968, 0.054598234593868256, 0.13014183938503265, -0.2351091057062149, -0.06401427090167999, 0.1748843789100647, -0.1262446641921997, -0.023327471688389778, 0.11364973336458206, -0.04736414924263954, 0.018216902390122414, 0.11018002778291702, 0.14986251294612885, 0.10300963371992111, -0.14460092782974243, -0.07214241474866867, -0.05508608743548393, -0.010692148469388485, 0.10026685893535614, 0.08398580551147461, -0.08914029598236084, 0.08767413347959518, -0.004995525348931551, -0.06456608325242996, -0.01739586889743805, 0.007795524783432484, -0.05110558494925499, 0.04833769053220749, -0.0334777869284153, 0.017264986410737038, -0.010991469025611877, -0.03259800747036934, -0.0010514381574466825, -0.07631108164787292, 0.022926833480596542, 0.10513417422771454, -0.05467182397842407, 0.023094426840543747, -0.11797848343849182, 0.13747219741344452, -0.06290756165981293, -0.022710183635354042, -0.18842512369155884, 0.014287939295172691, 0.016252649948000908, -0.12898868322372437, 0.04758168011903763, 0.007675632368773222, 0.023239629343152046, -0.008326920680701733, -0.005581502337008715, 0.019721703603863716, -0.011414438486099243, -0.004846738651394844, -0.009754394181072712, -0.13136903941631317, -0.04046279564499855, -0.04900825768709183, 0.211078479886055, -0.15165776014328003, -0.008157443255186081, 0.11809583008289337, 0.15311798453330994, 0.014500852674245834, -0.06386739760637283, 0.05991442874073982, 0.014535530470311642, 0.024722909554839134, -0.05747867375612259, 0.015014594420790672, 0.02107306383550167, -0.01563938707113266, 0.07672378420829773, -0.1354362666606903, -0.1468048095703125, 0.08044382929801941, -0.007643524091690779, -0.08533185720443726, 0.024622581899166107, -0.0012566664954647422, -0.0003141449997201562, -0.14081218838691711, -0.05431903526186943, 0.23311005532741547, 0.06267573684453964, 0.08233765512704849, -0.10968664288520813, -0.05711742118000984, 0.016379499807953835, -0.030826598405838013, -0.052029214799404144, 0.06012001261115074, 0.029857920482754707, -0.03886200115084648, 0.04544824734330177, -0.03564164787530899, 0.004666507709771395, 0.1359621286392212, 0.01936947926878929, -0.1070394217967987, -0.02902376651763916, -0.017309924587607384, 0.024086246266961098, 0.0629613846540451, -0.0579056590795517, -0.013610709458589554, 0.023536041378974915, -0.0003064561460632831, 0.056317783892154694, -0.09503714740276337, 0.03606795519590378, 0.03557408228516579, -0.02585916966199875, -0.01632893644273281, -0.013175906613469124, 0.01644563116133213, 0.09025269001722336, 0.031672582030296326, 0.049383338540792465, -0.05042283609509468, -0.04058462381362915, -0.13644039630889893, 0.11167309433221817, -0.05725933983922005, -0.28025224804878235, -0.14360082149505615, 0.042180225253105164, -0.024462873116135597, 0.015507908537983894, 0.009924354963004589, -0.04679497331380844, -0.045631490647792816, -0.08595311641693115, -0.01798713579773903, 0.010428793728351593, -0.036021970212459564, 0.03488863632082939, 0.027108440175652504, 0.04818703234195709, -0.07094744592905045, 0.007570828776806593, 0.000772874045651406, -0.08535642921924591, -0.03230152651667595, 0.037860263139009476, 0.08040688931941986, 0.1015755757689476, 0.019721394404768944, 0.017031950876116753, -0.014538024552166462, 0.14533457159996033, -0.13266843557357788, 0.055658936500549316, 0.1472567766904831, -0.005988627672195435, -0.0008854123298078775, 0.07479169219732285, 0.0007955220062285662, -0.04407260939478874, -0.010290228761732578, 0.08322695642709732, -0.031445473432540894, 
-0.2113054245710373, -0.05914386361837387, -0.009734396822750568, -0.081288181245327, 0.12059029191732407, 0.04082644358277321, 0.017342770472168922, 0.052874356508255005, -0.02966080792248249, -0.032706815749406815, 0.062004636973142624, 0.0440719872713089, 0.040890514850616455, 0.020181845873594284, 0.06568551808595657, -0.05402199923992157, 0.04149031639099121, 0.12553386390209198, 0.03217696398496628, 0.1453889161348343, -0.032045911997556686, 0.18581198155879974, 0.08768045157194138, 0.01781529188156128, -0.003387665143236518, 0.007278288248926401, 0.014674210920929909, 0.03772749379277229, 0.04084647446870804, -0.08265683799982071, -0.016263321042060852, -0.0021763211116194725, 0.11315613240003586, -0.07988748699426651, -0.053603991866111755, -0.0706905871629715, 0.07451243698596954, 0.2277994155883789, 0.03372502326965332, -0.19179993867874146, -0.018208835273981094, 0.016909612342715263, -0.05078139901161194, -0.04991738870739937, -0.004519774578511715, 0.09860333055257797, -0.16392749547958374, 0.09844570606946945, 0.017647389322519302, 0.09431988000869751, -0.17705294489860535, -0.04893741011619568, 0.0020701561588793993, 0.04373998939990997, -0.0013815750135108829, 0.07831334322690964, -0.19147756695747375, 0.10857250541448593, 0.02225346304476261, 0.10131034255027771, -0.06727723777294159, -0.002222148235887289, 0.011595282703638077, -0.033639274537563324, 0.1570483148097992, 0.008560665883123875, -0.03531574085354805, -0.06018223613500595, -0.08910157531499863, 0.007887925021350384, 0.0016872626729309559, -0.07344361394643784, 0.08761707693338394, 0.009562673047184944, -0.01993422582745552, -0.0621807761490345, -0.06939175724983215, -0.12978559732437134, -0.12542502582073212, 0.004525584634393454, 0.029728136956691742, -0.010335779748857021, -0.07194031774997711, -0.05876903980970383, -0.13785238564014435, 0.09293261170387268, -0.1345812976360321, -0.06295662373304367, -0.09564702212810516, 0.01280539482831955, 0.08784934878349304, -0.05345967411994934, 0.02639160305261612, -0.0001862578501459211, 0.17778536677360535, -0.03349015489220619, -0.053090427070856094, 0.018987813964486122, -0.05884252488613129, -0.16931001842021942, -0.018266476690769196, 0.1264430582523346, 0.12419198453426361, 0.026364034041762352, 0.056835200637578964, 0.0534244142472744, 0.014822257682681084, -0.09600789099931717, -0.0139627018943429, 0.12249907851219177, -0.006349770352244377, 0.03865000978112221, 0.0351065918803215, -0.2112119048833847, -0.1029643639922142, 0.00737721286714077, 0.092008076608181, 0.15121978521347046, -0.0711960420012474, 0.10983296483755112, 0.2048291265964508, -0.09406649321317673, -0.1400693953037262, 0.025065260007977486, 0.04039241373538971, 0.0016780755249783397, 0.02181459777057171, -0.2101791650056839, 0.05231362208724022, 0.12202195078134537, -0.006483771372586489, 0.013410516083240509, -0.30094558000564575, -0.12775975465774536, 0.08999583125114441, 0.033710695803165436, -0.10037025809288025, -0.05144565552473068, -0.06544355303049088, -0.04858976975083351, -0.0716724544763565, 0.08427410572767258, -0.17870667576789856, 0.0692383348941803, 0.022874481976032257, 0.01106971688568592, 0.01839584857225418, -0.04718897119164467, 0.1402014195919037, 0.03242892026901245, 0.010726431384682655, -0.046799492090940475, -0.016517706215381622, 0.12500876188278198, -0.0353718139231205, 0.06144426763057709, 0.025820119306445122, 0.028540991246700287, -0.1306189000606537, -0.07059893012046814, -0.09795704483985901, 0.04296491667628288, -0.05181794986128807, 0.009562603197991848, 
-0.024119993671774864, 0.07772023975849152, 0.06297489255666733, 0.007938665337860584, -0.08478090912103653, -0.11660172045230865, 0.02729801833629608, 0.1586051732301712, 0.12036964297294617, 0.07285923510789871, -0.08948153257369995, -0.028729567304253578, -0.015081499703228474, 0.059361040592193604, 0.00599254434928298, 0.06225688382983208, 0.0769062265753746, 0.04868975654244423, 0.11069950461387634, -0.022258685901761055, -0.1853533536195755, 0.07922057807445526, 0.032406941056251526, -0.06703606992959976, -0.11843059957027435, -0.001057324348948896, 0.013040830381214619, -0.08976298570632935, -0.02629646472632885, 0.11975942552089691, 0.022823220118880272, -0.05389510840177536, 0.008293592371046543, 0.032396573573350906, -0.06188863143324852, 0.14828269183635712, 0.006323876790702343, 0.04710967466235161, -0.08501844108104706, 0.12937219440937042, 0.10052631050348282, -0.041118886321783066, 0.03554960712790489, 0.06991544365882874, -0.07783710211515427, -0.0688071995973587, -0.08209949731826782, 0.09671401977539062, -0.0020459850784391165, -0.09168561547994614, -0.03962180018424988, -0.06594741344451904, 0.05325871333479881, 0.10606913268566132, 0.0113716721534729, 0.050918083637952805, 0.011305154301226139, -0.02878868393599987, -0.0588272325694561, 0.015653301030397415, 0.06645441800355911, -0.009343326091766357, -0.0414593331515789, 0.055957768112421036, 0.037905365228652954, -0.009728247299790382, -0.026597993448376656, -0.08390502631664276, -0.07064371556043625, 0.053103502839803696, -0.14804863929748535, 0.032217249274253845, -0.15328389406204224, -0.008008445613086224, -0.012737498618662357, 0.004941272549331188, -0.010712943971157074, 0.04262269288301468, -0.03809298947453499, -0.027596134692430496, -0.04442162066698074, 0.07373980432748795, -0.19532693922519684, 0.05712777003645897, 0.012370827607810497, -0.06524639576673508, 0.0881626084446907, 0.02860993519425392, 0.001587109756655991, 0.010270366445183754, -0.11392565816640854, -0.0419403612613678, -0.03266170620918274, 0.01576283387839794, 0.03542191907763481, -0.17916245758533478, 0.04080060124397278, 0.021610336378216743, -0.02067672647535801, 0.009401705116033554, 0.047126688063144684, -0.0811554417014122, 0.06592603772878647, 0.05654421076178551, -0.023030593991279602, -0.07623603194952011, 0.1124892458319664, 0.09475373476743698, 0.040232185274362564, 0.10172524303197861, -0.0675535723567009, 0.0443451888859272, -0.12078281491994858, -0.0065705375745892525, -0.01934237778186798, 0.03341997042298317, 0.07817797362804413, 0.004385687410831451, 0.07207183539867401, -0.015833642333745956, 0.1513911783695221, 0.01934463530778885, 0.04324942082166672, 0.03935900703072548, 0.013188907876610756, -0.07874423265457153, 0.05511743575334549, 0.04882466420531273, 0.0192668829113245, -0.023314038291573524, 0.016540007665753365, -0.029014209285378456, -0.02970169298350811, 0.06836389750242233, 0.07333486527204514, 0.17447923123836517, 0.06500794738531113, 0.021763581782579422, 0.09427079558372498, -0.030970899388194084, -0.06659109890460968, 0.07360727339982986, -0.12768463790416718, 0.0306607186794281, -0.07992541044950485, -0.004533345345407724, 0.1519906371831894, -0.1552775502204895, 0.08801928162574768, 0.04872880503535271, -0.050173722207546234, -0.08813254535198212, -0.27962300181388855, -0.04424109309911728, 0.028854062780737877, 0.0036096551921218634, -0.054411277174949646, 0.0324755534529686, 0.045518577098846436, 0.04591912403702736, -0.0580194815993309, 0.14759506285190582, -0.0441121943295002, -0.12852051854133606, 
0.10453953593969345, 0.0019309120252728462, 0.030876774340867996, -0.011997873894870281, 0.046348050236701965, 0.08278543502092361, 0.082940973341465, 0.08340846747159958, 0.04895344749093056, 0.02489352785050869, 0.04885638505220413, -0.03141650930047035, -0.0737856924533844, 0.027620811015367508, -0.04952319711446762, 0.02259596809744835, 0.13995224237442017, 0.09419062733650208, -0.0054191830568015575, 0.004754404537379742, 0.17483219504356384, -0.028905058279633522, -0.03638315200805664, -0.1860436499118805, 0.08018501847982407, -0.0696452334523201, -0.025447439402341843, 0.000518993241712451, -0.11804019659757614, 0.0257159061729908, 0.19756940007209778, 0.13315868377685547, -0.0043927691876888275, 0.026422321796417236, -0.02470691129565239, 0.0035716462880373, 0.0007223053253255785, 0.05155842751264572, -0.004683896899223328, 0.12950722873210907, -0.03928215801715851, 0.05516050383448601, -0.009417436085641384, -0.04985969886183739, -0.029488496482372284, 0.04817255958914757, -0.04639130085706711, 0.02444065734744072, -0.07387320697307587, 0.1400904506444931, -0.044373758137226105, -0.2338612675666809, -0.011426608078181744, -0.006614790763705969, -0.1345980316400528, 0.012030684389173985, -0.06299404799938202, 0.023927435278892517, 0.05420370399951935, 0.03566300868988037, 0.02320931665599346, 0.13824675977230072, 0.03397452086210251, -0.0009958073496818542, -0.09598647803068161, 0.08509574085474014, -0.05564863234758377, 0.21562449634075165, -0.03660096228122711, 0.04529392346739769, 0.09589371830224991, 0.025985367596149445, -0.13295310735702515, 0.06459478288888931, 0.029447395354509354, -0.04474528133869171, 0.047403253614902496, 0.16117647290229797, 0.005942084826529026, 0.06409268826246262, 0.08030277490615845, 0.007462262641638517, 0.07709997892379761, -0.11791932582855225, 0.05118854343891144, -0.14354464411735535, 0.09424321353435516, -0.07951407879590988, 0.13343903422355652, 0.07198276370763779, -0.03661558777093887, 0.015764888375997543, -0.050834108144044876, 0.02143879234790802, 0.007622146978974342, 0.05454839766025543, -0.039774853736162186, -0.17349615693092346, 0.03034062497317791, -0.039975740015506744, 0.06584908068180084, -0.20602700114250183, 0.016024252399802208, 0.026714615523815155, -0.10163969546556473, 0.051850661635398865, 0.07141708582639694, -0.0035452498123049736, 0.05279092863202095, -0.03032245673239231, -0.12824475765228271, 0.04108984395861626, 0.11457372456789017, -0.11388979852199554, -0.025966638699173927 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# wav2vec2-xls-r-1b-ca

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - CA, the [tv3_parla](https://huggingface.co/datasets/collectivat/tv3_parla) and [parlament_parla](https://huggingface.co/datasets/projecte-aina/parlament_parla) datasets.

## Model description

Please check the original [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) Model card. This is just a fine-tuned version of that model.

## Intended uses & limitations

As with any model trained on crowdsourced data, this model can show the biases and particularities of the data and model used to train it. Moreover, since this is a speech recognition model, it may underperform for some lower-resourced dialects of the Catalan language.

## Training and evaluation data

## Training procedure

The data is preprocessed to remove characters that are not in the Catalan alphabet. Moreover, numbers are verbalized using code provided by [@ccoreilly](https://github.com/ccoreilly), which can be found in the text/ folder or [here](https://github.com/CollectivaT-dev/catotron-cpu/blob/master/text/numbers_ca.py).

### Training results

Check the Tensorboard tab for the training profile and evaluation results throughout training. The model was evaluated on the test splits for each of the datasets used during training.

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 10.0
- mixed_precision_training: Native AMP

### Framework versions

- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0

# Thanks

We want to thank both [@ccoreilly](https://github.com/ccoreilly) and [@gullabi](https://github.com/gullabi), who have contributed their own resources and knowledge to making this model possible.
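The card above describes a CTC-style wav2vec2 checkpoint for Catalan speech recognition. As an illustration only (it is not part of the original card), a minimal transcription sketch with the `transformers` pipeline could look like the following; the hub id is the one listed in this record, while the audio file name `sample_ca.wav` and its mono 16 kHz format are assumptions for the example (decoding a file path also requires ffmpeg to be installed):

```python
# Minimal ASR sketch: load the checkpoint through the generic
# automatic-speech-recognition pipeline and transcribe a local clip.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="PereLluis13/wav2vec2-xls-r-1b-ca",
)

# "sample_ca.wav" is a hypothetical local file with Catalan speech;
# chunk_length_s splits long recordings into manageable windows.
result = asr("sample_ca.wav", chunk_length_s=30)
print(result["text"])
```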
{"language": ["ca"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "collectivat/tv3_parla", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_8_0", "projecte-aina/parlament_parla", "robust-speech-event"], "datasets": ["mozilla-foundation/common_voice_8_0", "collectivat/tv3_parla", "projecte-aina/parlament_parla"], "model-index": [{"name": "wav2vec2-xls-r-1b-ca", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "mozilla-foundation/common_voice_8_0 ca", "type": "mozilla-foundation/common_voice_8_0", "args": "ca"}, "metrics": [{"type": "wer", "value": 11.030639657300515, "name": "Test WER"}, {"type": "cer", "value": 2.8405630530040633, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "projecte-aina/parlament_parla ca", "type": "projecte-aina/parlament_parla", "args": "clean"}, "metrics": [{"type": "wer", "value": 6.483115660665961, "name": "Test WER"}, {"type": "cer", "value": 2.0212863746191827, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "collectivat/tv3_parla ca", "type": "collectivat/tv3_parla", "args": "ca"}, "metrics": [{"type": "wer", "value": 17.917773414943987, "name": "Test WER"}, {"type": "cer", "value": 8.872589572206396, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Catalan Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "ca"}, "metrics": [{"type": "wer", "value": 27.126683954209096, "name": "Test WER"}, {"type": "cer", "value": 14.213308815078726, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "ca"}, "metrics": [{"type": "wer", "value": 18.7, "name": "Test WER"}]}]}]}
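The metadata record above stores the per-dataset WER/CER results in Hugging Face's `model-index` format. As a small illustration (not part of the dataset itself), those numbers can be pulled out of the JSON string like this; the literal below is a trimmed copy of one entry from the record, and the full metadata has the same shape with more dataset/metric entries:

```python
import json

# Trimmed example of the model-index metadata stored on this record.
metadata_json = """
{"model-index": [{"name": "wav2vec2-xls-r-1b-ca",
  "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"},
               "dataset": {"name": "mozilla-foundation/common_voice_8_0 ca",
                           "type": "mozilla-foundation/common_voice_8_0", "args": "ca"},
               "metrics": [{"type": "wer", "value": 11.030639657300515, "name": "Test WER"},
                           {"type": "cer", "value": 2.8405630530040633, "name": "Test CER"}]}]}]}
"""

card_meta = json.loads(metadata_json)
for entry in card_meta["model-index"]:
    for result in entry["results"]:
        # Collect the metrics of each evaluation dataset as a small dict.
        metrics = {m["name"]: round(m["value"], 2) for m in result["metrics"]}
        print(result["dataset"]["name"], metrics)
```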
automatic-speech-recognition
PereLluis13/wav2vec2-xls-r-1b-ca
[ "transformers", "pytorch", "tensorboard", "wav2vec2", "automatic-speech-recognition", "collectivat/tv3_parla", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_8_0", "projecte-aina/parlament_parla", "robust-speech-event", "ca", "dataset:mozilla-foundation/common_voice_8_0", "dataset:collectivat/tv3_parla", "dataset:projecte-aina/parlament_parla", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "ca" ]
TAGS #transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #collectivat/tv3_parla #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #projecte-aina/parlament_parla #robust-speech-event #ca #dataset-mozilla-foundation/common_voice_8_0 #dataset-collectivat/tv3_parla #dataset-projecte-aina/parlament_parla #license-apache-2.0 #model-index #endpoints_compatible #region-us
# wav2vec2-xls-r-1b-ca

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - CA, the tv3_parla and parlament_parla datasets.

## Model description

Please check the original facebook/wav2vec2-xls-r-1b Model card. This is just a finetuned version of that model.

## Intended uses & limitations

As any model trained on crowdsourced data, this model can show the biases and particularities of the data and model used to train this model. Moreover, since this is a speech recognition model, it may underperform for some lower-resourced dialects for the catalan language.

## Training and evaluation data

## Training procedure

The data is preprocessed to remove characters not on the catalan alphabet. Moreover, numbers are verbalized using code provided by @ccoreilly, which can be found on the text/ folder or here.

### Training results

Check the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training.

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 10.0
- mixed_precision_training: Native AMP

### Framework versions

- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0

# Thanks

Want to thank both @ccoreilly and @gullabi who have contributed with their own resources and knowledge into making this model possible.
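The preprocessing step described above (dropping characters outside the Catalan alphabet before training, plus number verbalization) is only sketched here; the actual character inventory and the verbalization code live in the linked catotron-cpu repository. The allowed-character set in this snippet is an assumption made for illustration and may differ from the one used to train the model:

```python
import re

# Assumed working alphabet for the illustration: lowercase Latin letters plus the
# Catalan accented vowels, ç, the ela geminada middle dot, apostrophe, hyphen and space.
ALLOWED_CHARS = "abcdefghijklmnopqrstuvwxyzàéèíïóòúüç·'- "
_DROP_PATTERN = re.compile(f"[^{re.escape(ALLOWED_CHARS)}]")

def clean_transcript(text: str) -> str:
    """Lowercase a transcript and strip characters outside the allowed set."""
    text = text.lower()
    text = _DROP_PATTERN.sub("", text)
    return re.sub(r"\s+", " ", text).strip()

print(clean_transcript("Bon dia! Això és una prova: col·legi, àvia…"))
# -> "bon dia això és una prova col·legi àvia"
```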
[ "# wav2vec2-xls-r-1b-ca\n\nThis model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - CA, the tv3_parla and parlament_parla datasets.", "## Model description\n\nPlease check the original facebook/wav2vec2-xls-r-1b Model card. This is just a finetuned version of that model.", "## Intended uses & limitations\n\nAs any model trained on crowdsourced data, this model can show the biases and particularities of the data and model used to train this model. Moreover, since this is a speech recognition model, it may underperform for some lower-resourced dialects for the catalan language.", "## Training and evaluation data", "## Training procedure\n\nThe data is preprocessed to remove characters not on the catalan alphabet. Moreover, numbers are verbalized using code provided by @ccoreilly, which can be found on the text/ folder or here.", "### Training results\n\nCheck the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training.", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 2000\n- num_epochs: 10.0\n- mixed_precision_training: Native AMP", "### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.18.3\n- Tokenizers 0.11.0", "# Thanks\n\nWant to thank both @ccoreilly and @gullabi who have contributed with their own resources and knowledge into making this model possible." ]
[ "TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #collectivat/tv3_parla #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #projecte-aina/parlament_parla #robust-speech-event #ca #dataset-mozilla-foundation/common_voice_8_0 #dataset-collectivat/tv3_parla #dataset-projecte-aina/parlament_parla #license-apache-2.0 #model-index #endpoints_compatible #region-us \n", "# wav2vec2-xls-r-1b-ca\n\nThis model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - CA, the tv3_parla and parlament_parla datasets.", "## Model description\n\nPlease check the original facebook/wav2vec2-xls-r-1b Model card. This is just a finetuned version of that model.", "## Intended uses & limitations\n\nAs any model trained on crowdsourced data, this model can show the biases and particularities of the data and model used to train this model. Moreover, since this is a speech recognition model, it may underperform for some lower-resourced dialects for the catalan language.", "## Training and evaluation data", "## Training procedure\n\nThe data is preprocessed to remove characters not on the catalan alphabet. Moreover, numbers are verbalized using code provided by @ccoreilly, which can be found on the text/ folder or here.", "### Training results\n\nCheck the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training.", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 2000\n- num_epochs: 10.0\n- mixed_precision_training: Native AMP", "### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.18.3\n- Tokenizers 0.11.0", "# Thanks\n\nWant to thank both @ccoreilly and @gullabi who have contributed with their own resources and knowledge into making this model possible." ]
[ 161, 74, 35, 73, 5, 48, 42, 142, 38, 30 ]
[ "passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #collectivat/tv3_parla #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #projecte-aina/parlament_parla #robust-speech-event #ca #dataset-mozilla-foundation/common_voice_8_0 #dataset-collectivat/tv3_parla #dataset-projecte-aina/parlament_parla #license-apache-2.0 #model-index #endpoints_compatible #region-us \n# wav2vec2-xls-r-1b-ca\n\nThis model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - CA, the tv3_parla and parlament_parla datasets.## Model description\n\nPlease check the original facebook/wav2vec2-xls-r-1b Model card. This is just a finetuned version of that model.## Intended uses & limitations\n\nAs any model trained on crowdsourced data, this model can show the biases and particularities of the data and model used to train this model. Moreover, since this is a speech recognition model, it may underperform for some lower-resourced dialects for the catalan language.## Training and evaluation data## Training procedure\n\nThe data is preprocessed to remove characters not on the catalan alphabet. Moreover, numbers are verbalized using code provided by @ccoreilly, which can be found on the text/ folder or here.### Training results\n\nCheck the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training." ]
[ -0.10467194765806198, 0.2354826033115387, -0.004875562619417906, 0.044299203902482986, 0.08584043383598328, -0.030604368075728416, 0.0975818783044815, 0.10234085470438004, -0.04564531147480011, 0.09180548042058945, -0.017188681289553642, 0.010978713631629944, 0.09392837435007095, 0.08590158075094223, 0.05014185979962349, -0.1643776148557663, 0.044321492314338684, -0.04452788829803467, 0.12153790146112442, 0.0979892835021019, 0.08910061419010162, -0.08954225480556488, 0.05317184329032898, 0.04631244018673897, -0.03908465802669525, 0.029475286602973938, 0.019263548776507378, -0.12013322114944458, 0.07961348444223404, 0.06305129081010818, 0.04341832175850868, 0.02546735666692257, 0.05789729207754135, -0.15976087749004364, 0.03003571182489395, 0.11155276000499725, -0.004398665856570005, 0.05705887824296951, 0.15659886598587036, -0.07147656381130219, 0.1101142093539238, -0.006813102401793003, 0.08210527896881104, 0.07598847895860672, -0.11940702050924301, -0.1325797140598297, -0.10282204300165176, 0.03423706814646721, 0.0734989196062088, 0.1132754236459732, -0.06622383743524551, 0.08942960202693939, -0.0587807260453701, 0.032612960785627365, 0.18036434054374695, -0.1586192548274994, -0.037926167249679565, -0.04040022939443588, -0.007122113835066557, 0.005516746547073126, -0.05867519974708557, 0.04086381569504738, 0.027845922857522964, 0.000964942155405879, 0.04855244606733322, -0.03136267885565758, -0.034700699150562286, -0.07042169570922852, -0.12117882072925568, -0.05912262946367264, 0.060000982135534286, -0.0010760821169242263, -0.04908573254942894, -0.16194696724414825, 0.008041560649871826, 0.06506004929542542, 0.0086319325491786, 0.03520089015364647, -0.006104596424847841, -0.007297338452190161, -0.017632760107517242, -0.0491703562438488, -0.09099751710891724, -0.07115975022315979, -0.0588025264441967, 0.0723477303981781, 0.025636201724410057, 0.02550358511507511, -0.05427378788590431, 0.10616331547498703, 0.010630577802658081, -0.09186127781867981, 0.0009572422713972628, -0.02179012820124626, -0.13389214873313904, -0.05494202300906181, -0.018471533432602882, -0.11710965633392334, 0.011528820730745792, 0.14535045623779297, -0.005042331293225288, 0.04984583333134651, -0.0570768304169178, 0.014747567474842072, 0.10114607959985733, 0.11930996179580688, -0.12158588320016861, 0.007773781195282936, -0.003890564199537039, 0.031111400574445724, -0.009587387554347515, -0.01040100771933794, -0.020220665261149406, -0.023414751514792442, 0.05484892427921295, 0.08527661114931107, 0.09735492616891861, 0.0021981589961797, -0.08557556569576263, -0.039506323635578156, 0.09253876656293869, -0.14507749676704407, 0.04832541570067406, 0.06369375437498093, -0.05040200799703598, 0.061009783297777176, -0.015482690185308456, -0.005633549764752388, -0.09184437990188599, -0.01631837524473667, -0.047352395951747894, 0.013479686342179775, -0.05349652096629143, -0.06003841012716293, 0.0353238619863987, -0.11156230419874191, -0.06648949533700943, -0.10547012090682983, -0.11432278901338577, -0.10218396037817001, -0.005185467656701803, -0.09402614086866379, 0.04089384526014328, -0.03920653834939003, 0.025027336552739143, -0.022420091554522514, -0.007501727901399136, 0.10451351851224899, -0.034790314733982086, 0.03885631263256073, -0.026424666866660118, 0.0490410141646862, 0.08467535674571991, 0.05490488186478615, -0.08189469575881958, 0.03530137240886688, -0.13550063967704773, 0.1488502472639084, -0.0625973641872406, -0.059575311839580536, -0.11915463209152222, -0.009015916846692562, -0.004832498263567686, 
0.019943954423069954, 0.058169957250356674, 0.1376836597919464, -0.24033141136169434, -0.056888844817876816, 0.1889788657426834, -0.13143303990364075, -0.01748570427298546, 0.11064324527978897, -0.055206049233675, 0.01839977689087391, 0.1129486933350563, 0.15083758533000946, 0.10161957144737244, -0.1291418820619583, -0.07090456038713455, -0.0507972277700901, -0.013024195097386837, 0.09611973911523819, 0.09123948216438293, -0.07566571235656738, 0.08721929788589478, -0.004885611589998007, -0.07000633329153061, -0.022686317563056946, -0.0004788558871950954, -0.05254141241312027, 0.04471753165125847, -0.03500230610370636, 0.016090231016278267, -0.000790469057392329, -0.033925436437129974, -0.005471118725836277, -0.07972747832536697, 0.026619501411914825, 0.10304776579141617, -0.061137277632951736, 0.016257394105196, -0.11756633222103119, 0.1351308822631836, -0.07636374235153198, -0.024662017822265625, -0.1960584968328476, 0.01936551183462143, 0.012772799469530582, -0.1154974102973938, 0.04828955978155136, 0.005931918043643236, 0.027788633480668068, -0.0019003375200554729, -0.0044085062108933926, 0.01826995238661766, -0.01694214902818203, -0.0057430267333984375, -0.0061605945229530334, -0.13212059438228607, -0.04388357326388359, -0.05219733342528343, 0.19477638602256775, -0.15343931317329407, -0.005245584528893232, 0.10636117309331894, 0.14599400758743286, 0.014740427024662495, -0.07288854569196701, 0.053111620247364044, 0.021520953625440598, 0.02676508203148842, -0.05704499036073685, 0.02637750469148159, 0.022728707641363144, -0.008364677429199219, 0.07161559164524078, -0.13022342324256897, -0.12736812233924866, 0.08160912990570068, -0.011969120241701603, -0.09418079257011414, 0.016864502802491188, -0.00593396183103323, 0.001023873221129179, -0.14765281975269318, -0.051297083497047424, 0.25312870740890503, 0.05959397181868553, 0.07631701231002808, -0.11851374804973602, -0.05506143346428871, 0.01956658996641636, -0.034431420266628265, -0.042934756726026535, 0.07122918963432312, 0.05443276837468147, -0.04015263915061951, 0.04751327261328697, -0.04468313604593277, 0.009613942354917526, 0.13786527514457703, 0.019342822954058647, -0.1123470887541771, -0.019227195531129837, -0.019641105085611343, 0.021457865834236145, 0.06671244651079178, -0.054603997617959976, -0.012124559842050076, 0.025439471006393433, 0.0017356481403112411, 0.05560004338622093, -0.11048714816570282, 0.025968633592128754, 0.028374413028359413, -0.028064055368304253, -0.04203806817531586, -0.02071538008749485, 0.02029440365731716, 0.09256219118833542, 0.030622953549027443, 0.040180642157793045, -0.04437904804944992, -0.04280754178762436, -0.14015589654445648, 0.12199835479259491, -0.04711152985692024, -0.27754947543144226, -0.13525433838367462, 0.04768579825758934, -0.016474386677145958, 0.01780320331454277, 0.008838296867907047, -0.04934849217534065, -0.041357360780239105, -0.09071847051382065, -0.022710079327225685, 0.009319264441728592, -0.047863658517599106, 0.01400324609130621, 0.030199648812413216, 0.046688053756952286, -0.07772878557443619, 0.007271824404597282, -0.0038405165541917086, -0.088791623711586, -0.02684658393263817, 0.037392254918813705, 0.07289250940084457, 0.10389377176761627, 0.025443030521273613, 0.020335160195827484, -0.01894054375588894, 0.16777364909648895, -0.12798583507537842, 0.04674840345978737, 0.15315626561641693, -0.012617876753211021, 0.011709282174706459, 0.07374517619609833, 0.005968946497887373, -0.048588983714580536, -0.011239081621170044, 0.07176180928945541, -0.044705040752887726, 
-0.20462194085121155, -0.055607203394174576, -0.004446570295840502, -0.08214348554611206, 0.13409490883350372, 0.031099488958716393, -0.002454268280416727, 0.06347156316041946, -0.031994663178920746, -0.032068416476249695, 0.05784787982702255, 0.05057563632726669, 0.028893502429127693, 0.020932937040925026, 0.07491883635520935, -0.057103481143713, 0.032090477645397186, 0.11713433265686035, 0.0298837348818779, 0.16179734468460083, -0.04583929851651192, 0.1587454080581665, 0.09410509467124939, 0.027506273239850998, 0.0009316251380369067, 0.009414195083081722, 0.00025295783416368067, 0.030825022608041763, 0.031093280762434006, -0.08315419405698776, -0.021721752360463142, -0.008373737335205078, 0.10465091466903687, -0.08191284537315369, -0.050862789154052734, -0.08232641220092773, 0.05823195353150368, 0.22009404003620148, 0.04075431078672409, -0.2120041698217392, -0.00886018667370081, 0.014089825563132763, -0.047182511538267136, -0.04911592975258827, -0.0016261417185887694, 0.1039244681596756, -0.17043505609035492, 0.09209207445383072, 0.022646596655249596, 0.09782615303993225, -0.17599628865718842, -0.050181180238723755, 0.002639483893290162, 0.03691621124744415, 0.0002885927679017186, 0.082598976790905, -0.18160070478916168, 0.1122845932841301, 0.011172305792570114, 0.10612736642360687, -0.0786869078874588, -0.0031715049408376217, 0.01089561264961958, -0.0299613606184721, 0.15236063301563263, 0.009702582843601704, -0.0078037758357822895, -0.06355135887861252, -0.09240726381540298, 0.011479593813419342, -0.006241315510123968, -0.07602641731500626, 0.08271393179893494, 0.014024392701685429, -0.007631510961800814, -0.06025607883930206, -0.06620375066995621, -0.12752152979373932, -0.13406457006931305, 0.0020783557556569576, 0.014353184960782528, -0.015899505466222763, -0.06677555292844772, -0.0700201541185379, -0.13151177763938904, 0.09139173477888107, -0.11445562541484833, -0.07298986613750458, -0.09131178259849548, 0.004212518222630024, 0.08918322622776031, -0.05393880233168602, 0.02503347583115101, 0.007553385104984045, 0.17856907844543457, -0.02556333690881729, -0.055912379175424576, 0.017870046198368073, -0.06274786591529846, -0.16300468146800995, -0.017387263476848602, 0.125823974609375, 0.12943223118782043, 0.028017830103635788, 0.059358514845371246, 0.05309227481484413, 0.014020728878676891, -0.10207460075616837, -0.014379906468093395, 0.12067730724811554, 0.01365814357995987, 0.040742259472608566, 0.04287685453891754, -0.2088562697172165, -0.09636249393224716, 0.011349489912390709, 0.08910297602415085, 0.15167991816997528, -0.07001712173223495, 0.10934523493051529, 0.195436492562294, -0.09077278524637222, -0.13948826491832733, 0.02360180765390396, 0.04857156425714493, -0.003146004630252719, 0.030620988458395004, -0.21502052247524261, 0.04731491208076477, 0.1192193329334259, -0.006370055489242077, 0.010434573516249657, -0.3179107904434204, -0.11909866333007812, 0.08954102545976639, 0.05208941549062729, -0.0825439989566803, -0.05243602395057678, -0.06030851975083351, -0.043286483734846115, -0.07724706828594208, 0.09753143042325974, -0.1698591560125351, 0.07250604778528214, 0.018552599474787712, 0.01184101216495037, 0.018715376034379005, -0.0529651939868927, 0.13549555838108063, 0.02918507345020771, 0.010500697419047356, -0.05361089110374451, -0.011202160269021988, 0.11177211999893188, -0.028524227440357208, 0.06263376027345657, 0.029953019693493843, 0.031827591359615326, -0.13137565553188324, -0.07010527700185776, -0.09757550060749054, 0.03166110813617706, -0.050863370299339294, 
0.0017016136553138494, -0.029098335653543472, 0.07622745633125305, 0.0596495196223259, 0.0085159195587039, -0.09687800705432892, -0.12284396588802338, 0.02475696988403797, 0.14022321999073029, 0.11840502172708511, 0.07486935704946518, -0.10749512910842896, -0.035283610224723816, -0.016895782202482224, 0.06112728640437126, 0.0003992921847384423, 0.05530579760670662, 0.07244933396577835, 0.0439145565032959, 0.11174657940864563, -0.01418579462915659, -0.18059980869293213, 0.07711886614561081, 0.033482618629932404, -0.0697149708867073, -0.10325219482183456, 0.005843853112310171, 0.014663469046354294, -0.09431855380535126, -0.027493523433804512, 0.1106511652469635, 0.01111547090113163, -0.05607074499130249, 0.0011453682091087103, 0.03266093134880066, -0.05764983966946602, 0.14660100638866425, 0.014336648397147655, 0.04161404073238373, -0.08904993534088135, 0.1209603101015091, 0.09915224462747574, -0.03176308423280716, 0.03786149248480797, 0.06331433355808258, -0.07956545799970627, -0.061813414096832275, -0.0746380016207695, 0.1100560873746872, -0.008498964831233025, -0.09268347918987274, -0.04810476303100586, -0.06239332631230354, 0.06538118422031403, 0.09582804143428802, 0.02004961669445038, 0.04555676877498627, 0.006529808044433594, -0.032271385192871094, -0.06042294204235077, 0.012877775356173515, 0.07049906998872757, -0.0075360569171607494, -0.03645111992955208, 0.06440757215023041, 0.04887087270617485, -0.014035943895578384, -0.027447188273072243, -0.08436273038387299, -0.08059900999069214, 0.05647957697510719, -0.14794525504112244, 0.035718806087970734, -0.1468355506658554, -0.0006985630607232451, -0.016001177951693535, 0.006503773387521505, -0.014036851935088634, 0.04110686480998993, -0.03434518352150917, -0.021620407700538635, -0.03922273591160774, 0.07456183433532715, -0.19485507905483246, 0.059846583753824234, 0.01625641994178295, -0.06493155658245087, 0.08249901235103607, 0.022657381370663643, -0.005167438182979822, 0.002806853735819459, -0.1057385727763176, -0.03688424453139305, -0.03327500820159912, 0.01323346234858036, 0.030212197452783585, -0.16997165977954865, 0.04527449980378151, 0.023054076358675957, -0.02300156094133854, 0.015744809061288834, 0.05152091383934021, -0.08444911986589432, 0.0823502242565155, 0.06315259635448456, -0.03801881894469261, -0.07247559726238251, 0.11484784632921219, 0.09146131575107574, 0.04633817821741104, 0.1043834313750267, -0.06305649131536484, 0.05042337626218796, -0.1259109526872635, -0.012053881771862507, -0.01263207197189331, 0.0380612388253212, 0.08273021131753922, 0.010138191282749176, 0.06983867287635803, -0.013219454325735569, 0.16748127341270447, 0.025062665343284607, 0.034441858530044556, 0.03494250774383545, 0.01585146225988865, -0.06382729858160019, 0.05375975742936134, 0.04720131680369377, 0.019282570108771324, -0.024676496163010597, 0.008607601746916771, -0.025071119889616966, -0.032965559512376785, 0.06427545100450516, 0.07692217081785202, 0.16274705529212952, 0.0712779089808464, 0.016780102625489235, 0.0860895961523056, -0.02267674170434475, -0.06594467163085938, 0.06294221431016922, -0.10417693108320236, 0.02204788289964199, -0.07494587451219559, 0.006566735915839672, 0.14861167967319489, -0.1541776806116104, 0.08644264936447144, 0.045596953481435776, -0.05184518173336983, -0.08575370907783508, -0.28335514664649963, -0.03920828178524971, 0.027279648929834366, 0.00884904433041811, -0.06259345263242722, 0.03428894281387329, 0.06381083279848099, 0.041786108165979385, -0.05540617182850838, 0.14976784586906433, -0.04396096616983414, 
-0.1244373768568039, 0.09267350286245346, -0.0010563801042735577, 0.04160871356725693, -0.018052469938993454, 0.04655845835804939, 0.08223120123147964, 0.07587597519159317, 0.08805285394191742, 0.04706990346312523, 0.022189849987626076, 0.036452990025281906, -0.02837485447525978, -0.0699467882514, 0.02194872871041298, -0.049471136182546616, 0.03441115468740463, 0.14738410711288452, 0.09327682107686996, 0.001607316080480814, 0.003403709502890706, 0.18135544657707214, -0.02360762096941471, -0.03634140267968178, -0.189087375998497, 0.0861758291721344, -0.06348857283592224, -0.02930624596774578, 0.010271534323692322, -0.1139969527721405, 0.021106522530317307, 0.20166918635368347, 0.14266587793827057, -0.010485022328794003, 0.018340442329645157, -0.02425544336438179, 0.00274942209944129, 0.002602605614811182, 0.06506973505020142, -0.005632190499454737, 0.14304177463054657, -0.05257020518183708, 0.05491963401436806, -0.014957649633288383, -0.037075139582157135, -0.037193529307842255, 0.039275430142879486, -0.03885690122842789, 0.024633167311549187, -0.08728300780057907, 0.14269565045833588, -0.0457882322371006, -0.21649618446826935, -0.017699630931019783, -0.007108019664883614, -0.14217637479305267, 0.016527576372027397, -0.06584449857473373, 0.0173483993858099, 0.06430330872535706, 0.03386766090989113, 0.02794826216995716, 0.11429527401924133, 0.04141467809677124, 0.0012293029576539993, -0.10802509635686874, 0.08974133431911469, -0.04357520118355751, 0.22978714108467102, -0.035930246114730835, 0.06236289069056511, 0.09264135360717773, 0.02864212915301323, -0.13509954512119293, 0.05776945874094963, 0.027882542461156845, -0.026231709867715836, 0.03529466688632965, 0.15556931495666504, 0.006271732971072197, 0.053998786956071854, 0.07709626108407974, 0.0071855224668979645, 0.075510673224926, -0.11294802278280258, 0.06675887107849121, -0.13603360950946808, 0.09238291531801224, -0.08144951611757278, 0.13466374576091766, 0.07665825635194778, -0.03359824791550636, 0.010339084081351757, -0.05144602060317993, 0.026656070724129677, 0.00539744645357132, 0.04176274687051773, -0.03206281363964081, -0.17443342506885529, 0.023463193327188492, -0.03716697171330452, 0.06081339716911316, -0.21227434277534485, 0.007499842904508114, 0.025111882016062737, -0.10215378552675247, 0.042841363698244095, 0.0655643418431282, -0.010420216247439384, 0.05844516679644585, -0.03483168035745621, -0.1221543475985527, 0.032570645213127136, 0.10864922404289246, -0.1096661165356636, -0.031283702701330185 ]
null
null
transformers
# wav2vec2-xls-r-300m-ca-lm

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - CA, the [tv3_parla](https://huggingface.co/datasets/collectivat/tv3_parla) and [parlament_parla](https://huggingface.co/datasets/projecte-aina/parlament_parla) datasets.
It achieves the following results on the evaluation set (for the three datasets and without the LM):
- Loss: 0.2472
- Wer: 0.1499

## Model description

Please check the original [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) Model card. This is just a fine-tuned version of that model.

## Intended uses & limitations

As with any model trained on crowdsourced data, this model can show the biases and particularities of the data and model used to train it. Moreover, since this is a speech recognition model, it may underperform for some lower-resourced dialects of the Catalan language.

## Training and evaluation data

More information needed

## Training procedure

The data is preprocessed to remove characters that are not in the Catalan alphabet. Moreover, numbers are verbalized using code provided by [@ccoreilly](https://github.com/ccoreilly), which can be found in the text/ folder or [here](https://github.com/CollectivaT-dev/catotron-cpu/blob/master/text/numbers_ca.py).

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 18.0
- mixed_precision_training: Native AMP

### Training results

Check the Tensorboard tab for the training profile and evaluation results throughout training. The model was evaluated on the test splits for each of the datasets used during training.

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 6.2099 | 0.09 | 500 | 3.4125 | 1.0 |
| 2.9961 | 0.18 | 1000 | 2.9224 | 1.0 |
| 2.2147 | 0.26 | 1500 | 0.6521 | 0.5568 |
| 1.3017 | 0.35 | 2000 | 0.3153 | 0.2761 |
| 1.1196 | 0.44 | 2500 | 0.2444 | 0.2367 |
| 1.0712 | 0.53 | 3000 | 0.2324 | 0.2132 |
| 1.052 | 0.62 | 3500 | 0.2173 | 0.2032 |
| 1.2813 | 2.13 | 4000 | 0.3326 | 0.2099 |
| 1.2365 | 2.4 | 4500 | 0.3224 | 0.2003 |
| 1.2193 | 2.66 | 5000 | 0.3198 | 0.1957 |
| 1.2072 | 2.93 | 5500 | 0.3063 | 0.1933 |
| 1.213 | 3.2 | 6000 | 0.3051 | 0.1980 |
| 1.2074 | 3.46 | 6500 | 0.3012 | 0.1879 |
| 1.1918 | 3.73 | 7000 | 0.2947 | 0.1829 |
| 1.1893 | 4.0 | 7500 | 0.2895 | 0.1807 |
| 1.1751 | 4.26 | 8000 | 0.2878 | 0.1776 |
| 1.1628 | 4.53 | 8500 | 0.2835 | 0.1731 |
| 1.1577 | 4.79 | 9000 | 0.2816 | 0.1761 |
| 1.1448 | 5.06 | 9500 | 0.2757 | 0.1740 |
| 1.1407 | 5.33 | 10000 | 0.2768 | 0.1798 |
| 1.1401 | 5.59 | 10500 | 0.2780 | 0.1816 |
| 1.1333 | 5.86 | 11000 | 0.2748 | 0.1750 |
| 1.1571 | 6.13 | 11500 | 0.2808 | 0.1708 |
| 1.1505 | 6.39 | 12000 | 0.2726 | 0.1692 |
| 1.1519 | 6.66 | 12500 | 0.2749 | 0.1654 |
| 1.136 | 6.93 | 13000 | 0.2765 | 0.1643 |
| 1.1326 | 7.19 | 13500 | 0.2706 | 0.1668 |
| 1.1342 | 7.46 | 14000 | 0.2665 | 0.1638 |
| 1.1286 | 7.72 | 14500 | 0.2669 | 0.1636 |
| 1.1243 | 7.99 | 15000 | 0.2619 | 0.1623 |
| 1.1173 | 8.26 | 15500 | 0.2652 | 0.1604 |
| 1.1129 | 8.52 | 16000 | 0.2610 | 0.1598 |
| 1.1091 | 8.79 | 16500 | 0.2608 | 0.1584 |
| 1.1053 | 9.06 | 17000 | 0.2633 | 0.1664 |
| 1.1004 | 9.32 | 17500 | 0.2594 | 0.1662 |
| 1.0995 | 9.59 | 18000 | 0.2623 | 0.1569 |
| 1.0964 | 9.86 | 18500 | 0.2624 | 0.1597 |
| 1.09 | 10.12 | 19000 | 0.2577 | 0.1578 |
| 1.089 | 10.39 | 19500 | 0.2574 | 0.1531 |
| 1.0864 | 10.66 | 20000 | 0.2556 | 0.1546 |
| 1.0806 | 10.92 | 20500 | 0.2548 | 0.1583 |
| 1.0842 | 11.19 | 21000 | 0.2550 | 0.1542 |
| 1.0805 | 11.45 | 21500 | 0.2561 | 0.1524 |
| 1.0722 | 11.72 | 22000 | 0.2540 | 0.1566 |
| 1.0763 | 11.99 | 22500 | 0.2549 | 0.1572 |
| 1.0835 | 12.25 | 23000 | 0.2586 | 0.1521 |
| 1.0883 | 12.52 | 23500 | 0.2583 | 0.1519 |
| 1.0888 | 12.79 | 24000 | 0.2551 | 0.1582 |
| 1.0933 | 13.05 | 24500 | 0.2628 | 0.1537 |
| 1.0799 | 13.32 | 25000 | 0.2600 | 0.1508 |
| 1.0804 | 13.59 | 25500 | 0.2620 | 0.1475 |
| 1.0814 | 13.85 | 26000 | 0.2537 | 0.1517 |
| 1.0693 | 14.12 | 26500 | 0.2560 | 0.1542 |
| 1.0724 | 14.38 | 27000 | 0.2540 | 0.1574 |
| 1.0704 | 14.65 | 27500 | 0.2548 | 0.1626 |
| 1.0729 | 14.92 | 28000 | 0.2548 | 0.1601 |
| 1.0724 | 15.18 | 28500 | 0.2511 | 0.1512 |
| 1.0655 | 15.45 | 29000 | 0.2498 | 0.1490 |
| 1.0608 | 15.98 | 30000 | 0.2487 | 0.1481 |
| 1.0541 | 16.52 | 31000 | 0.2468 | 0.1504 |
| 1.0584 | 17.05 | 32000 | 0.2467 | 0.1493 |
| 1.0507 | 17.58 | 33000 | 0.2481 | 0.1517 |

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0

# Thanks

We want to thank both [@ccoreilly](https://github.com/ccoreilly) and [@gullabi](https://github.com/gullabi), who have contributed their own resources and knowledge to making this model possible.
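The hyperparameters listed in the card above map fairly directly onto `transformers.TrainingArguments`. The sketch below is not the author's training script; it only shows one plausible way those settings could be expressed, with the output directory as a placeholder and all data/model setup omitted:

```python
from transformers import TrainingArguments

# Illustrative mapping of the card's hyperparameters onto TrainingArguments;
# "wav2vec2-xls-r-300m-ca-lm" as output_dir is just a placeholder name.
training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-ca-lm",
    learning_rate=7.5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # 32 x 4 = 128 effective train batch size on one device
    num_train_epochs=18.0,
    lr_scheduler_type="linear",
    warmup_steps=2000,
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
)
```

The Adam betas and epsilon listed in the card match the optimizer defaults, so they are not set explicitly here.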
{"language": ["ca"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "collectivat/tv3_parla", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_8_0", "projecte-aina/parlament_parla", "robust-speech-event"], "datasets": ["mozilla-foundation/common_voice_8_0", "collectivat/tv3_parla", "projecte-aina/parlament_parla"], "model-index": [{"name": "wav2vec2-xls-r-300m-ca-lm", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "mozilla-foundation/common_voice_8_0 ca", "type": "mozilla-foundation/common_voice_8_0", "args": "ca"}, "metrics": [{"type": "wer", "value": 6.771703090587865, "name": "Test WER"}, {"type": "cer", "value": 2.100777784371229, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "projecte-aina/parlament_parla ca", "type": "projecte-aina/parlament_parla", "args": "clean"}, "metrics": [{"type": "wer", "value": 5.565360630662431, "name": "Test WER"}, {"type": "cer", "value": 1.8594390167034354, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "collectivat/tv3_parla ca", "type": "collectivat/tv3_parla", "args": "ca"}, "metrics": [{"type": "wer", "value": 13.53312545713516, "name": "Test WER"}, {"type": "cer", "value": 8.684635913340555, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Catalan Dev Data", "type": "speech-recognition-community-v2/dev_data", "args": "ca"}, "metrics": [{"type": "wer", "value": 26.04515843400164, "name": "Test WER"}, {"type": "cer", "value": 15.056890012642224, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "ca"}, "metrics": [{"type": "wer", "value": 17.68, "name": "Test WER"}]}]}]}
automatic-speech-recognition
PereLluis13/wav2vec2-xls-r-300m-ca-lm
[ "transformers", "pytorch", "tensorboard", "wav2vec2", "automatic-speech-recognition", "collectivat/tv3_parla", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_8_0", "projecte-aina/parlament_parla", "robust-speech-event", "ca", "dataset:mozilla-foundation/common_voice_8_0", "dataset:collectivat/tv3_parla", "dataset:projecte-aina/parlament_parla", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "ca" ]
TAGS #transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #collectivat/tv3_parla #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #projecte-aina/parlament_parla #robust-speech-event #ca #dataset-mozilla-foundation/common_voice_8_0 #dataset-collectivat/tv3_parla #dataset-projecte-aina/parlament_parla #license-apache-2.0 #model-index #endpoints_compatible #region-us
wav2vec2-xls-r-300m-ca-lm
=========================

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - CA, the tv3\_parla and parlament\_parla datasets.
It achieves the following results on the evaluation set (for the three datasets and without the LM):

* Loss: 0.2472
* Wer: 0.1499

Model description
-----------------

Please check the original facebook/wav2vec2-xls-r-300m Model card. This is just a finetuned version of that model.

Intended uses & limitations
---------------------------

As any model trained on crowdsourced data, this model can show the biases and particularities of the data and model used to train this model. Moreover, since this is a speech recognition model, it may underperform for some lower-resourced dialects for the catalan language.

Training and evaluation data
----------------------------

More information needed

Training procedure
------------------

The data is preprocessed to remove characters not on the catalan alphabet. Moreover, numbers are verbalized using code provided by @ccoreilly, which can be found on the text/ folder or here.

### Training hyperparameters

The following hyperparameters were used during training:

* learning\_rate: 7.5e-05
* train\_batch\_size: 32
* eval\_batch\_size: 32
* seed: 42
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 128
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 2000
* num\_epochs: 18.0
* mixed\_precision\_training: Native AMP

### Training results

Check the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training.

### Framework versions

* Transformers 4.16.0.dev0
* Pytorch 1.10.1+cu102
* Datasets 1.18.3
* Tokenizers 0.11.0

Thanks
======

Want to thank both @ccoreilly and @gullabi who have contributed with their own resources and knowledge into making this model possible.
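The WER and CER figures reported for this checkpoint were computed on the test splits of the training datasets. The snippet below only illustrates how such metrics are typically computed with the `evaluate` library; the toy sentence pairs are made up and stand in for pipeline transcriptions and reference transcripts:

```python
import evaluate  # pip install evaluate jiwer

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Toy example pairs; in practice these would come from a dataset's test split.
predictions = ["bon dia a tothom", "aquesta es una prova"]
references = ["bon dia a tothom", "aquesta és una prova"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```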
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 18.0\n* mixed\\_precision\\_training: Native AMP", "### Training results\n\n\nCheck the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training.", "### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0\n\n\nThanks\n======\n\n\nWant to thank both @ccoreilly and @gullabi who have contributed with their own resources and knowledge into making this model possible." ]
[ "TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #collectivat/tv3_parla #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #projecte-aina/parlament_parla #robust-speech-event #ca #dataset-mozilla-foundation/common_voice_8_0 #dataset-collectivat/tv3_parla #dataset-projecte-aina/parlament_parla #license-apache-2.0 #model-index #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 18.0\n* mixed\\_precision\\_training: Native AMP", "### Training results\n\n\nCheck the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training.", "### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0\n\n\nThanks\n======\n\n\nWant to thank both @ccoreilly and @gullabi who have contributed with their own resources and knowledge into making this model possible." ]
[ 161, 160, 42, 72 ]
[ "passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #collectivat/tv3_parla #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #projecte-aina/parlament_parla #robust-speech-event #ca #dataset-mozilla-foundation/common_voice_8_0 #dataset-collectivat/tv3_parla #dataset-projecte-aina/parlament_parla #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 18.0\n* mixed\\_precision\\_training: Native AMP### Training results\n\n\nCheck the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training.### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0\n\n\nThanks\n======\n\n\nWant to thank both @ccoreilly and @gullabi who have contributed with their own resources and knowledge into making this model possible." ]
[ -0.1037735566496849, 0.16864806413650513, -0.006274207029491663, 0.0739114060997963, 0.07064073532819748, 0.02520143613219261, 0.048301294445991516, 0.16067075729370117, 0.021686352789402008, 0.14008595049381256, 0.042390450835227966, 0.09102736413478851, 0.10175461322069168, 0.08147802948951721, -0.0032629056368023157, -0.22798281908035278, -0.014917992986738682, -0.04647151753306389, -0.07841093838214874, 0.11563113331794739, 0.09429129958152771, -0.08326639235019684, 0.05235009267926216, 0.010703658685088158, -0.016478702425956726, -0.023271268233656883, -0.03477642685174942, -0.023938022553920746, 0.08857075124979019, 0.023340700194239616, 0.04142114520072937, 0.059318140149116516, 0.07734952867031097, -0.2841726243495941, 0.023432571440935135, 0.08140309154987335, 0.009846126660704613, 0.07630540430545807, 0.12538671493530273, -0.07743306457996368, 0.11052867770195007, -0.09166122227907181, 0.0622323676943779, 0.06987899541854858, -0.12712980806827545, -0.2262381613254547, -0.1176464855670929, 0.04549189656972885, 0.15374243259429932, 0.044634219259023666, -0.03906483203172684, 0.08223122358322144, -0.053697820752859116, 0.04435895010828972, 0.22738732397556305, -0.23370958864688873, -0.058610379695892334, -0.03278898447751999, 0.01859361305832863, 0.030601156875491142, -0.07927264273166656, 0.02124006487429142, -0.0077434382401406765, -0.03152646869421005, 0.09173772484064102, 0.0009907508501783013, 0.1383359581232071, 0.02519870735704899, -0.13558217883110046, -0.05948638170957565, 0.06279811263084412, 0.09619410336017609, -0.028502123430371284, -0.1418440341949463, -0.02061101421713829, -0.10782653093338013, -0.003120948327705264, -0.01671871542930603, 0.015443445183336735, -0.025222277268767357, -0.05002092942595482, -0.0019157306523993611, -0.05147544667124748, -0.06656701862812042, 0.053592514246702194, 0.1160762682557106, 0.044392723590135574, -0.028129464015364647, 0.03302881494164467, 0.1106027364730835, 0.07617854326963425, -0.13451224565505981, -0.012636873871088028, -0.0023055437486618757, -0.18923524022102356, -0.01718713901937008, 0.006655509117990732, -0.023967750370502472, 0.03169115260243416, 0.2022426426410675, 0.06581616401672363, 0.04840861260890961, 0.021893374621868134, -0.007485712878406048, -0.00968709122389555, 0.12538571655750275, -0.06242893636226654, -0.12589794397354126, -0.054172586649656296, 0.08517132699489594, -0.005700606852769852, -0.018194526433944702, -0.02363266982138157, 0.04737813025712967, 0.0809108316898346, 0.0974913015961647, 0.07515319436788559, -0.00008871898899087682, -0.11314931511878967, -0.05153752863407135, 0.08309019356966019, -0.11357467621564865, 0.07052124291658401, 0.09306212514638901, -0.048795923590660095, -0.017980774864554405, -0.03888951241970062, 0.040754351764917374, -0.030793456360697746, 0.08988336473703384, -0.054060615599155426, -0.0032116989605128765, -0.04280116781592369, -0.02742222137749195, 0.022580144926905632, -0.07066353410482407, -0.04411972686648369, -0.06980374455451965, -0.03218000382184982, -0.08981412649154663, 0.10101059079170227, -0.11307580024003983, -0.06621026247739792, -0.03120395541191101, -0.017321154475212097, 0.05024297535419464, -0.0073570432141423225, 0.14763231575489044, -0.045142725110054016, 0.06377005577087402, -0.0256356168538332, 0.047550227493047714, 0.14116781949996948, 0.06370050460100174, -0.029823821038007736, 0.04029688239097595, -0.15838158130645752, 0.10048888623714447, -0.0857972577214241, 0.016865292564034462, -0.18229219317436218, -0.0738324448466301, 0.0074002500623464584, 
-0.052819330245256424, 0.06644676625728607, 0.12306834757328033, -0.15476743876934052, -0.059142034500837326, 0.1071125790476799, -0.056235358119010925, -0.0705515593290329, 0.12458710372447968, -0.024841217324137688, -0.03682229667901993, 0.01186264120042324, 0.14229074120521545, 0.08413641899824142, -0.12560416758060455, 0.010001154616475105, -0.11594059318304062, 0.05507475882768631, 0.11092077940702438, 0.11962639540433884, -0.06973262131214142, 0.05830381438136101, -0.03792016953229904, -0.08394226431846619, -0.010652774944901466, -0.04193231463432312, -0.07937134802341461, 0.009899750351905823, -0.06375589966773987, 0.016175754368305206, 0.03851356729865074, -0.025647955015301704, -0.05070928856730461, -0.15485018491744995, -0.08783133327960968, 0.08434189110994339, -0.06520882993936539, 0.032787274569272995, -0.09728936851024628, 0.08390790969133377, 0.010849304497241974, -0.018817054107785225, -0.1286519467830658, -0.03137411177158356, 0.052457042038440704, -0.13082924485206604, 0.00006482993921963498, -0.019114378839731216, 0.04792342334985733, 0.019580421969294548, -0.03623092919588089, -0.04132910817861557, -0.012654502876102924, -0.012068366631865501, -0.03157830983400345, -0.21863172948360443, -0.051772795617580414, -0.005114186555147171, 0.2272590547800064, -0.14612150192260742, -0.007595043163746595, 0.08446627855300903, 0.17969627678394318, 0.032401520758867264, -0.0852268636226654, 0.029477283358573914, 0.003987407311797142, -0.006279671564698219, -0.053447891026735306, 0.04203778877854347, -0.013161235488951206, -0.11725752800703049, 0.05977805703878403, -0.1207672581076622, -0.10807882249355316, 0.048952504992485046, 0.05699366331100464, -0.05610073730349541, -0.053303852677345276, -0.0528145357966423, -0.03684483468532562, -0.04308580979704857, -0.0038710501976311207, 0.22111773490905762, 0.05773710086941719, 0.05785353109240532, -0.08613526821136475, -0.10213851183652878, -0.005613066256046295, 0.012081719934940338, -0.02518763206899166, 0.1347590982913971, 0.06716052442789078, -0.08770186454057693, 0.06284619122743607, 0.06496598571538925, 0.03218849003314972, 0.0460532121360302, -0.015572791919112206, -0.08652149140834808, -0.06421443074941635, 0.03580741956830025, 0.04154277592897415, 0.10352703928947449, -0.0971742570400238, 0.011966770514845848, 0.053324874490499496, 0.012160008773207664, -0.010708327405154705, -0.13099297881126404, 0.007369286846369505, 0.03236635401844978, -0.06726839393377304, 0.009255144745111465, -0.0014201393350958824, -0.009005631320178509, 0.07575973123311996, 0.02694130875170231, -0.011632419191300869, -0.038809727877378464, -0.05752044916152954, -0.08424147963523865, 0.10908413678407669, -0.04683532565832138, -0.17859424650669098, -0.10767985880374908, 0.016504360362887383, -0.02318461239337921, -0.012125837616622448, 0.0336092934012413, -0.07754109054803848, -0.05078224092721939, -0.0742151215672493, -0.020925382152199745, -0.004795278422534466, -0.006304334383457899, 0.01089237816631794, 0.02338317781686783, 0.05612659826874733, -0.0779053345322609, 0.012741001322865486, -0.009301038458943367, -0.030841657891869545, 0.032058969140052795, 0.07376722246408463, 0.09881925582885742, 0.12593990564346313, 0.08283448964357376, 0.023564990609884262, -0.020854147151112556, 0.12293267250061035, -0.15980038046836853, 0.04422811418771744, 0.08611489087343216, -0.016439054161310196, 0.07509159296751022, 0.15979307889938354, 0.018044330179691315, -0.08371995389461517, 0.032410528510808945, 0.06594788283109665, -0.01277302484959364, 
-0.22683848440647125, -0.033268000930547714, -0.03800027817487717, -0.04583568125963211, 0.11345073580741882, 0.042781371623277664, -0.03899125009775162, 0.032627206295728683, -0.033155206590890884, -0.10447148233652115, 0.04309487342834473, 0.07006379216909409, 0.006818100344389677, 0.05282839015126228, 0.10040473192930222, 0.009512493386864662, -0.01282824669033289, 0.08404310792684555, 0.0016330337384715676, 0.20058460533618927, -0.018322598189115524, 0.21219831705093384, 0.0412852019071579, 0.13381201028823853, -0.04606933146715164, -0.007794125936925411, 0.00872649997472763, 0.02775581181049347, 0.008231568150222301, -0.046199969947338104, -0.015940474346280098, 0.03814186528325081, 0.14522212743759155, -0.03812575712800026, -0.05820472165942192, -0.01695915497839451, 0.06387512385845184, 0.34107705950737, 0.1078038364648819, -0.19515341520309448, -0.016179760918021202, 0.040330514311790466, -0.11077722907066345, -0.023868128657341003, -0.033615998923778534, 0.06096278131008148, -0.13404923677444458, 0.08488953113555908, -0.04589870944619179, 0.09054394066333771, -0.13077232241630554, -0.03441154584288597, 0.03324839472770691, 0.039933472871780396, 0.005080641712993383, 0.057716816663742065, -0.1614614874124527, 0.24165336787700653, -0.016767248511314392, 0.034677401185035706, -0.05257319658994675, 0.03153754398226738, 0.005695999134331942, -0.04722769930958748, 0.13252554833889008, 0.01580914855003357, -0.0638098493218422, -0.0703091099858284, -0.13339556753635406, 0.004097762983292341, 0.12205084413290024, -0.18869701027870178, 0.11301221698522568, -0.04039296880364418, -0.037011612206697464, -0.021460644900798798, -0.030690139159560204, -0.10396551340818405, -0.10635758191347122, 0.056769825518131256, -0.06205292046070099, 0.07923262566328049, -0.058313608169555664, -0.05374126136302948, -0.1030394434928894, 0.16206246614456177, -0.08910522609949112, -0.0291118323802948, -0.13747306168079376, 0.01201529148966074, 0.14026303589344025, -0.07793635129928589, 0.005563262850046158, 0.00030049169436097145, 0.12210515886545181, 0.012558715417981148, 0.0186610110104084, 0.08173921704292297, -0.05857912823557854, -0.22260616719722748, -0.06321018189191818, 0.18732969462871552, 0.06446363031864166, 0.057001885026693344, 0.010108201764523983, 0.07691198587417603, -0.029842987656593323, -0.07719600945711136, 0.07627016305923462, 0.010266748256981373, 0.04417702183127403, 0.04358507692813873, 0.024301210418343544, -0.006248853635042906, -0.12517999112606049, -0.06714849174022675, 0.09437934309244156, 0.30073633790016174, -0.09541210532188416, 0.12692569196224213, 0.02804385870695114, -0.051959406584501266, -0.1145988330245018, -0.050353217869997025, 0.08737572282552719, 0.00089839386055246, 0.022498885169625282, -0.16795065999031067, 0.026915322989225388, 0.08120974898338318, -0.034629903733730316, 0.008947945199906826, -0.2522139251232147, -0.11002529412508011, 0.022987913340330124, 0.0546751469373703, -0.1294708102941513, -0.15754830837249756, -0.08585385233163834, -0.02931196801364422, -0.11883798241615295, 0.0612686350941658, 0.00933105032891035, 0.08602786809206009, -0.002161954063922167, -0.00648876465857029, 0.0278810765594244, -0.046112172305583954, 0.17446163296699524, 0.03910207003355026, -0.00010683396249078214, -0.0531030036509037, 0.06312199681997299, -0.010451851412653923, -0.06326824426651001, 0.021749885752797127, -0.061847250908613205, 0.01417572796344757, -0.15765126049518585, -0.026499370113015175, -0.05975566431879997, 0.020700445398688316, -0.045029811561107635, 
-0.037001632153987885, -0.038282718509435654, 0.06639116257429123, 0.1269611418247223, 0.010135170072317123, 0.03014686517417431, -0.061250656843185425, 0.06414075940847397, 0.09507571905851364, 0.13412617146968842, 0.021006163209676743, -0.1307886838912964, -0.020505908876657486, 0.014300811104476452, -0.005891993176192045, -0.09789733588695526, 0.08454427123069763, 0.10091211646795273, 0.03181257098913193, 0.13401402533054352, -0.0037844551261514425, -0.10825872421264648, 0.01145614217966795, 0.08051120489835739, -0.08774115890264511, -0.19364818930625916, 0.011077910661697388, -0.038415879011154175, -0.1252160370349884, -0.039575643837451935, 0.11915716528892517, 0.04265039414167404, -0.026191528886556625, 0.029159249737858772, 0.07397449016571045, 0.012369467876851559, 0.12125011533498764, 0.010600686073303223, 0.0646013468503952, -0.06872910261154175, 0.090727798640728, 0.09062209725379944, -0.11545879393815994, 0.0072171944193542, 0.10679168999195099, -0.06225115433335304, -0.04949899762868881, -0.005539840552955866, 0.020448457449674606, 0.004718017298728228, -0.008242818526923656, -0.08638839423656464, -0.06007027253508568, 0.06441199779510498, 0.01768014207482338, 0.03571541607379913, -0.012249572202563286, 0.05595355108380318, 0.03528624773025513, -0.06014937907457352, 0.09129700064659119, 0.09976962208747864, 0.046319399029016495, -0.07413683086633682, 0.028790554031729698, -0.002517393324524164, -0.0017810394056141376, 0.0055516622960567474, -0.0014562199357897043, -0.07427453249692917, 0.001165010267868638, -0.06695609539747238, -0.025593798607587814, -0.08594843000173569, -0.00017209538782481104, 0.018597284331917763, -0.08322828263044357, -0.03588820993900299, 0.00963593740016222, -0.09216680377721786, -0.07888474315404892, -0.030009128153324127, 0.09175607562065125, -0.15849962830543518, -0.011609422974288464, 0.0980994701385498, -0.10477672517299652, 0.14764109253883362, 0.024595048278570175, 0.030894892290234566, -0.005670323967933655, -0.07396750897169113, 0.009143197908997536, -0.0364244282245636, 0.012715349905192852, 0.039512019604444504, -0.2686346173286438, -0.006503353826701641, -0.03956270590424538, -0.010907962918281555, 0.017218777909874916, 0.05687251314520836, -0.12143019586801529, 0.02527550794184208, -0.014200860634446144, -0.07826241105794907, -0.04651736468076706, 0.018122494220733643, 0.014597469009459019, 0.04382829740643501, 0.14108806848526, -0.03326354920864105, 0.08885668963193893, -0.2012549489736557, 0.0029777928721159697, 0.006015460006892681, 0.01725202053785324, 0.02214118465781212, -0.005781327374279499, 0.08237896859645844, -0.03507865220308304, 0.07526031136512756, -0.03688937425613403, 0.03487466648221016, 0.04275752604007721, -0.0226192194968462, 0.04620000347495079, 0.03209170699119568, 0.04570130258798599, 0.04982151463627815, -0.027659203857183456, 0.061179183423519135, -0.03211992233991623, -0.0094761298969388, 0.02450355514883995, 0.14197686314582825, 0.17730924487113953, 0.08595968782901764, 0.014321083202958107, 0.047303151339292526, -0.10972163081169128, -0.1244816929101944, 0.1427362710237503, -0.09140641987323761, 0.07929010689258575, -0.021130608394742012, 0.22376947104930878, 0.10239353775978088, -0.220474973320961, 0.04937064275145531, 0.003076436696574092, -0.06259892880916595, -0.09908021241426468, -0.14527860283851624, -0.07637634873390198, -0.13463257253170013, 0.031547028571367264, -0.051767174154520035, 0.06293699890375137, 0.03647635877132416, 0.02828483283519745, 0.0313953272998333, 0.0490998737514019, 
0.02089742012321949, -0.01753290370106697, 0.07412134855985641, 0.043226856738328934, -0.02604430355131626, 0.022935248911380768, -0.02297813817858696, -0.006877281703054905, 0.009432089515030384, 0.08085082471370697, -0.007605849299579859, -0.07093200832605362, 0.047002363950014114, 0.007837863638997078, -0.11019586026668549, -0.0014805443352088332, -0.03353166580200195, 0.006409007124602795, 0.11798317730426788, 0.06516832113265991, 0.059387557208538055, -0.027697879821062088, 0.21915143728256226, -0.05787147581577301, -0.005077925976365805, -0.16043104231357574, 0.18394167721271515, -0.04527275636792183, 0.00383192696608603, 0.03914598748087883, -0.11363817006349564, 0.007015299517661333, 0.1485559344291687, 0.12184401601552963, -0.03784231096506119, 0.0055660284124314785, 0.01453680731356144, 0.007465932052582502, 0.010846360586583614, 0.01657312735915184, 0.07400219142436981, 0.029011765494942665, -0.06801223009824753, 0.007952908053994179, -0.040343862026929855, -0.05872790142893791, -0.04160796478390694, 0.1043061688542366, -0.013276549056172371, -0.006956192199140787, -0.03965931385755539, 0.14815092086791992, -0.014929206110537052, -0.227726012468338, 0.056314267218112946, -0.13387347757816315, -0.16326281428337097, -0.024023374542593956, 0.04072747007012367, -0.013822467066347599, 0.08412577211856842, 0.056540198624134064, -0.04322347790002823, 0.16466383635997772, 0.049073923379182816, -0.05548549443483353, -0.06084765866398811, 0.08411964029073715, -0.13493363559246063, 0.20906688272953033, -0.019771745428442955, 0.05448786914348602, 0.12906098365783691, 0.01837257109582424, -0.1285134255886078, 0.011099978350102901, 0.10914571583271027, -0.138834148645401, 0.06188136339187622, 0.2144378423690796, -0.024484850466251373, 0.08354028314352036, 0.09465969353914261, -0.01991436257958412, 0.04010035842657089, -0.07882250845432281, 0.01164520438760519, -0.08685888350009918, 0.054136402904987335, -0.0783240795135498, 0.11096581816673279, 0.17572879791259766, -0.05903157964348793, 0.02007278800010681, -0.06207792088389397, -0.015252471901476383, 0.007621477823704481, 0.14016729593276978, -0.014469738118350506, -0.2216859757900238, 0.04670659452676773, 0.0003154845326207578, 0.07404039055109024, -0.1703401505947113, -0.0827217549085617, 0.032664548605680466, -0.049601852893829346, -0.02063128538429737, 0.13120324909687042, 0.023465555161237717, 0.0028198817744851112, -0.03903503343462944, -0.12922753393650055, -0.00580858439207077, 0.1362031251192093, -0.16378355026245117, -0.01610167697072029 ]
null
null
transformers
# wav2vec2-xls-r-300m-ca

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - CA, the [tv3_parla](https://huggingface.co/datasets/collectivat/tv3_parla) and [parlament_parla](https://huggingface.co/datasets/projecte-aina/parlament_parla) datasets.
It achieves the following results on the evaluation set (for the three datasets):
- Loss: 0.2472
- Wer: 0.1499

## Model description

Please check the original [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) model card; this is just a fine-tuned version of that model.

## Intended uses & limitations

As with any model trained on crowdsourced data, this model can reflect the biases and particularities of the data and of the model used to train it. Moreover, since this is a speech recognition model, it may underperform for some lower-resourced dialects of the Catalan language.

## Training and evaluation data

More information needed

## Training procedure

The data is preprocessed to remove characters not in the Catalan alphabet. Moreover, numbers are verbalized using code provided by [@ccoreilly](https://github.com/ccoreilly), which can be found in the text/ folder or [here](https://github.com/CollectivaT-dev/catotron-cpu/blob/master/text/numbers_ca.py).

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- num_epochs: 18.0
- mixed_precision_training: Native AMP

### Training results

Check the Tensorboard tab for the training profile and evaluation results over the course of training. The model was evaluated on the test splits of each of the datasets used during training.
| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 6.2099        | 0.09  | 500   | 3.4125          | 1.0    |
| 2.9961        | 0.18  | 1000  | 2.9224          | 1.0    |
| 2.2147        | 0.26  | 1500  | 0.6521          | 0.5568 |
| 1.3017        | 0.35  | 2000  | 0.3153          | 0.2761 |
| 1.1196        | 0.44  | 2500  | 0.2444          | 0.2367 |
| 1.0712        | 0.53  | 3000  | 0.2324          | 0.2132 |
| 1.052         | 0.62  | 3500  | 0.2173          | 0.2032 |
| 1.2813        | 2.13  | 4000  | 0.3326          | 0.2099 |
| 1.2365        | 2.4   | 4500  | 0.3224          | 0.2003 |
| 1.2193        | 2.66  | 5000  | 0.3198          | 0.1957 |
| 1.2072        | 2.93  | 5500  | 0.3063          | 0.1933 |
| 1.213         | 3.2   | 6000  | 0.3051          | 0.1980 |
| 1.2074        | 3.46  | 6500  | 0.3012          | 0.1879 |
| 1.1918        | 3.73  | 7000  | 0.2947          | 0.1829 |
| 1.1893        | 4.0   | 7500  | 0.2895          | 0.1807 |
| 1.1751        | 4.26  | 8000  | 0.2878          | 0.1776 |
| 1.1628        | 4.53  | 8500  | 0.2835          | 0.1731 |
| 1.1577        | 4.79  | 9000  | 0.2816          | 0.1761 |
| 1.1448        | 5.06  | 9500  | 0.2757          | 0.1740 |
| 1.1407        | 5.33  | 10000 | 0.2768          | 0.1798 |
| 1.1401        | 5.59  | 10500 | 0.2780          | 0.1816 |
| 1.1333        | 5.86  | 11000 | 0.2748          | 0.1750 |
| 1.1571        | 6.13  | 11500 | 0.2808          | 0.1708 |
| 1.1505        | 6.39  | 12000 | 0.2726          | 0.1692 |
| 1.1519        | 6.66  | 12500 | 0.2749          | 0.1654 |
| 1.136         | 6.93  | 13000 | 0.2765          | 0.1643 |
| 1.1326        | 7.19  | 13500 | 0.2706          | 0.1668 |
| 1.1342        | 7.46  | 14000 | 0.2665          | 0.1638 |
| 1.1286        | 7.72  | 14500 | 0.2669          | 0.1636 |
| 1.1243        | 7.99  | 15000 | 0.2619          | 0.1623 |
| 1.1173        | 8.26  | 15500 | 0.2652          | 0.1604 |
| 1.1129        | 8.52  | 16000 | 0.2610          | 0.1598 |
| 1.1091        | 8.79  | 16500 | 0.2608          | 0.1584 |
| 1.1053        | 9.06  | 17000 | 0.2633          | 0.1664 |
| 1.1004        | 9.32  | 17500 | 0.2594          | 0.1662 |
| 1.0995        | 9.59  | 18000 | 0.2623          | 0.1569 |
| 1.0964        | 9.86  | 18500 | 0.2624          | 0.1597 |
| 1.09          | 10.12 | 19000 | 0.2577          | 0.1578 |
| 1.089         | 10.39 | 19500 | 0.2574          | 0.1531 |
| 1.0864        | 10.66 | 20000 | 0.2556          | 0.1546 |
| 1.0806        | 10.92 | 20500 | 0.2548          | 0.1583 |
| 1.0842        | 11.19 | 21000 | 0.2550          | 0.1542 |
| 1.0805        | 11.45 | 21500 | 0.2561          | 0.1524 |
| 1.0722        | 11.72 | 22000 | 0.2540          | 0.1566 |
| 1.0763        | 11.99 | 22500 | 0.2549          | 0.1572 |
| 1.0835        | 12.25 | 23000 | 0.2586          | 0.1521 |
| 1.0883        | 12.52 | 23500 | 0.2583          | 0.1519 |
| 1.0888        | 12.79 | 24000 | 0.2551          | 0.1582 |
| 1.0933        | 13.05 | 24500 | 0.2628          | 0.1537 |
| 1.0799        | 13.32 | 25000 | 0.2600          | 0.1508 |
| 1.0804        | 13.59 | 25500 | 0.2620          | 0.1475 |
| 1.0814        | 13.85 | 26000 | 0.2537          | 0.1517 |
| 1.0693        | 14.12 | 26500 | 0.2560          | 0.1542 |
| 1.0724        | 14.38 | 27000 | 0.2540          | 0.1574 |
| 1.0704        | 14.65 | 27500 | 0.2548          | 0.1626 |
| 1.0729        | 14.92 | 28000 | 0.2548          | 0.1601 |
| 1.0724        | 15.18 | 28500 | 0.2511          | 0.1512 |
| 1.0655        | 15.45 | 29000 | 0.2498          | 0.1490 |
| 1.0608        | 15.98 | 30000 | 0.2487          | 0.1481 |
| 1.0541        | 16.52 | 31000 | 0.2468          | 0.1504 |
| 1.0584        | 17.05 | 32000 | 0.2467          | 0.1493 |
| 1.0507        | 17.58 | 33000 | 0.2481          | 0.1517 |

### Framework versions

- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.18.3
- Tokenizers 0.11.0

# Thanks

We want to thank both [@ccoreilly](https://github.com/ccoreilly) and [@gullabi](https://github.com/gullabi), who have contributed their own resources and knowledge to making this model possible.
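For readers who want to try the checkpoint, a minimal inference sketch with the `transformers` pipeline is shown below. It assumes the repository id listed for this record (`PereLluis13/wav2vec2-xls-r-300m-ca`); the audio path `audio.wav` is only a placeholder for a local 16 kHz mono recording, and transcription quality will depend on the text normalization described above.

```
from transformers import pipeline

# Load the fine-tuned Catalan ASR checkpoint by its repository id.
asr = pipeline(
    "automatic-speech-recognition",
    model="PereLluis13/wav2vec2-xls-r-300m-ca",
)

# "audio.wav" is a placeholder for a local 16 kHz mono audio file.
result = asr("audio.wav")
print(result["text"])
```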
{"language": ["ca"], "license": "apache-2.0", "tags": ["automatic-speech-recognition", "collectivat/tv3_parla", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_8_0", "projecte-aina/parlament_parla", "robust-speech-event"], "datasets": ["mozilla-foundation/common_voice_8_0", "collectivat/tv3_parla", "projecte-aina/parlament_parla"], "model-index": [{"name": "wav2vec2-xls-r-300m-ca", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "mozilla-foundation/common_voice_8_0 ca", "type": "mozilla-foundation/common_voice_8_0", "args": "ca"}, "metrics": [{"type": "wer", "value": 13.170091241317552, "name": "Test WER"}, {"type": "cer", "value": 3.356726205534543, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "projecte-aina/parlament_parla ca", "type": "projecte-aina/parlament_parla", "args": "clean"}, "metrics": [{"type": "wer", "value": 8.048005647723262, "name": "Test WER"}, {"type": "cer", "value": 2.240912911020065, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "collectivat/tv3_parla ca", "type": "collectivat/tv3_parla", "args": "ca"}, "metrics": [{"type": "wer", "value": 23.320629787889285, "name": "Test WER"}, {"type": "cer", "value": 10.43921620208999, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Speech Recognition"}, "dataset": {"name": "speech-recognition-community-v2/dev_data ca", "type": "speech-recognition-community-v2/dev_data", "args": "ca"}, "metrics": [{"type": "wer", "value": 31.99671115046487, "name": "Test WER"}, {"type": "cer", "value": 15.820020687277324, "name": "Test CER"}]}, {"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "Robust Speech Event - Test Data", "type": "speech-recognition-community-v2/eval_data", "args": "ca"}, "metrics": [{"type": "wer", "value": 22.04, "name": "Test WER"}]}]}]}
automatic-speech-recognition
PereLluis13/wav2vec2-xls-r-300m-ca
[ "transformers", "pytorch", "tensorboard", "wav2vec2", "automatic-speech-recognition", "collectivat/tv3_parla", "generated_from_trainer", "hf-asr-leaderboard", "mozilla-foundation/common_voice_8_0", "projecte-aina/parlament_parla", "robust-speech-event", "ca", "dataset:mozilla-foundation/common_voice_8_0", "dataset:collectivat/tv3_parla", "dataset:projecte-aina/parlament_parla", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "ca" ]
TAGS #transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #collectivat/tv3_parla #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #projecte-aina/parlament_parla #robust-speech-event #ca #dataset-mozilla-foundation/common_voice_8_0 #dataset-collectivat/tv3_parla #dataset-projecte-aina/parlament_parla #license-apache-2.0 #model-index #endpoints_compatible #region-us
wav2vec2-xls-r-300m-ca ====================== This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the MOZILLA-FOUNDATION/COMMON\_VOICE\_8\_0 - CA, the tv3\_parla and parlament\_parla datasets. It achieves the following results on the evaluation set (for the three datasets): * Loss: 0.2472 * Wer: 0.1499 Model description ----------------- Please check the original facebook/wav2vec2-xls-r-1b Model card. This is just a finetuned version of that model. Intended uses & limitations --------------------------- As any model trained on crowdsourced data, this model can show the biases and particularities of the data and model used to train this model. Moreover, since this is a speech recognition model, it may underperform for some lower-resourced dialects for the catalan language. Training and evaluation data ---------------------------- More information needed Training procedure ------------------ The data is preprocessed to remove characters not on the catalan alphabet. Moreover, numbers are verbalized using code provided by @ccoreilly, which can be found on the text/ folder or here. ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 7.5e-05 * train\_batch\_size: 32 * eval\_batch\_size: 32 * seed: 42 * gradient\_accumulation\_steps: 4 * total\_train\_batch\_size: 128 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_steps: 2000 * num\_epochs: 18.0 * mixed\_precision\_training: Native AMP ### Training results Check the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training. ### Framework versions * Transformers 4.16.0.dev0 * Pytorch 1.10.1+cu102 * Datasets 1.18.3 * Tokenizers 0.11.0 Thanks ====== Want to thank both @ccoreilly and @gullabi who have contributed with their own resources and knowledge into making this model possible.
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 18.0\n* mixed\\_precision\\_training: Native AMP", "### Training results\n\n\nCheck the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training.", "### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0\n\n\nThanks\n======\n\n\nWant to thank both @ccoreilly and @gullabi who have contributed with their own resources and knowledge into making this model possible." ]
[ "TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #collectivat/tv3_parla #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #projecte-aina/parlament_parla #robust-speech-event #ca #dataset-mozilla-foundation/common_voice_8_0 #dataset-collectivat/tv3_parla #dataset-projecte-aina/parlament_parla #license-apache-2.0 #model-index #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 18.0\n* mixed\\_precision\\_training: Native AMP", "### Training results\n\n\nCheck the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training.", "### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0\n\n\nThanks\n======\n\n\nWant to thank both @ccoreilly and @gullabi who have contributed with their own resources and knowledge into making this model possible." ]
[ 161, 160, 42, 72 ]
[ "passage: TAGS\n#transformers #pytorch #tensorboard #wav2vec2 #automatic-speech-recognition #collectivat/tv3_parla #generated_from_trainer #hf-asr-leaderboard #mozilla-foundation/common_voice_8_0 #projecte-aina/parlament_parla #robust-speech-event #ca #dataset-mozilla-foundation/common_voice_8_0 #dataset-collectivat/tv3_parla #dataset-projecte-aina/parlament_parla #license-apache-2.0 #model-index #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 7.5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2000\n* num\\_epochs: 18.0\n* mixed\\_precision\\_training: Native AMP### Training results\n\n\nCheck the Tensorboard tab to check the training profile and evaluation results along training. The model was evaluated on the test splits for each of the datasets used during training.### Framework versions\n\n\n* Transformers 4.16.0.dev0\n* Pytorch 1.10.1+cu102\n* Datasets 1.18.3\n* Tokenizers 0.11.0\n\n\nThanks\n======\n\n\nWant to thank both @ccoreilly and @gullabi who have contributed with their own resources and knowledge into making this model possible." ]
[ -0.1037735566496849, 0.16864806413650513, -0.006274207029491663, 0.0739114060997963, 0.07064073532819748, 0.02520143613219261, 0.048301294445991516, 0.16067075729370117, 0.021686352789402008, 0.14008595049381256, 0.042390450835227966, 0.09102736413478851, 0.10175461322069168, 0.08147802948951721, -0.0032629056368023157, -0.22798281908035278, -0.014917992986738682, -0.04647151753306389, -0.07841093838214874, 0.11563113331794739, 0.09429129958152771, -0.08326639235019684, 0.05235009267926216, 0.010703658685088158, -0.016478702425956726, -0.023271268233656883, -0.03477642685174942, -0.023938022553920746, 0.08857075124979019, 0.023340700194239616, 0.04142114520072937, 0.059318140149116516, 0.07734952867031097, -0.2841726243495941, 0.023432571440935135, 0.08140309154987335, 0.009846126660704613, 0.07630540430545807, 0.12538671493530273, -0.07743306457996368, 0.11052867770195007, -0.09166122227907181, 0.0622323676943779, 0.06987899541854858, -0.12712980806827545, -0.2262381613254547, -0.1176464855670929, 0.04549189656972885, 0.15374243259429932, 0.044634219259023666, -0.03906483203172684, 0.08223122358322144, -0.053697820752859116, 0.04435895010828972, 0.22738732397556305, -0.23370958864688873, -0.058610379695892334, -0.03278898447751999, 0.01859361305832863, 0.030601156875491142, -0.07927264273166656, 0.02124006487429142, -0.0077434382401406765, -0.03152646869421005, 0.09173772484064102, 0.0009907508501783013, 0.1383359581232071, 0.02519870735704899, -0.13558217883110046, -0.05948638170957565, 0.06279811263084412, 0.09619410336017609, -0.028502123430371284, -0.1418440341949463, -0.02061101421713829, -0.10782653093338013, -0.003120948327705264, -0.01671871542930603, 0.015443445183336735, -0.025222277268767357, -0.05002092942595482, -0.0019157306523993611, -0.05147544667124748, -0.06656701862812042, 0.053592514246702194, 0.1160762682557106, 0.044392723590135574, -0.028129464015364647, 0.03302881494164467, 0.1106027364730835, 0.07617854326963425, -0.13451224565505981, -0.012636873871088028, -0.0023055437486618757, -0.18923524022102356, -0.01718713901937008, 0.006655509117990732, -0.023967750370502472, 0.03169115260243416, 0.2022426426410675, 0.06581616401672363, 0.04840861260890961, 0.021893374621868134, -0.007485712878406048, -0.00968709122389555, 0.12538571655750275, -0.06242893636226654, -0.12589794397354126, -0.054172586649656296, 0.08517132699489594, -0.005700606852769852, -0.018194526433944702, -0.02363266982138157, 0.04737813025712967, 0.0809108316898346, 0.0974913015961647, 0.07515319436788559, -0.00008871898899087682, -0.11314931511878967, -0.05153752863407135, 0.08309019356966019, -0.11357467621564865, 0.07052124291658401, 0.09306212514638901, -0.048795923590660095, -0.017980774864554405, -0.03888951241970062, 0.040754351764917374, -0.030793456360697746, 0.08988336473703384, -0.054060615599155426, -0.0032116989605128765, -0.04280116781592369, -0.02742222137749195, 0.022580144926905632, -0.07066353410482407, -0.04411972686648369, -0.06980374455451965, -0.03218000382184982, -0.08981412649154663, 0.10101059079170227, -0.11307580024003983, -0.06621026247739792, -0.03120395541191101, -0.017321154475212097, 0.05024297535419464, -0.0073570432141423225, 0.14763231575489044, -0.045142725110054016, 0.06377005577087402, -0.0256356168538332, 0.047550227493047714, 0.14116781949996948, 0.06370050460100174, -0.029823821038007736, 0.04029688239097595, -0.15838158130645752, 0.10048888623714447, -0.0857972577214241, 0.016865292564034462, -0.18229219317436218, -0.0738324448466301, 0.0074002500623464584, 
-0.052819330245256424, 0.06644676625728607, 0.12306834757328033, -0.15476743876934052, -0.059142034500837326, 0.1071125790476799, -0.056235358119010925, -0.0705515593290329, 0.12458710372447968, -0.024841217324137688, -0.03682229667901993, 0.01186264120042324, 0.14229074120521545, 0.08413641899824142, -0.12560416758060455, 0.010001154616475105, -0.11594059318304062, 0.05507475882768631, 0.11092077940702438, 0.11962639540433884, -0.06973262131214142, 0.05830381438136101, -0.03792016953229904, -0.08394226431846619, -0.010652774944901466, -0.04193231463432312, -0.07937134802341461, 0.009899750351905823, -0.06375589966773987, 0.016175754368305206, 0.03851356729865074, -0.025647955015301704, -0.05070928856730461, -0.15485018491744995, -0.08783133327960968, 0.08434189110994339, -0.06520882993936539, 0.032787274569272995, -0.09728936851024628, 0.08390790969133377, 0.010849304497241974, -0.018817054107785225, -0.1286519467830658, -0.03137411177158356, 0.052457042038440704, -0.13082924485206604, 0.00006482993921963498, -0.019114378839731216, 0.04792342334985733, 0.019580421969294548, -0.03623092919588089, -0.04132910817861557, -0.012654502876102924, -0.012068366631865501, -0.03157830983400345, -0.21863172948360443, -0.051772795617580414, -0.005114186555147171, 0.2272590547800064, -0.14612150192260742, -0.007595043163746595, 0.08446627855300903, 0.17969627678394318, 0.032401520758867264, -0.0852268636226654, 0.029477283358573914, 0.003987407311797142, -0.006279671564698219, -0.053447891026735306, 0.04203778877854347, -0.013161235488951206, -0.11725752800703049, 0.05977805703878403, -0.1207672581076622, -0.10807882249355316, 0.048952504992485046, 0.05699366331100464, -0.05610073730349541, -0.053303852677345276, -0.0528145357966423, -0.03684483468532562, -0.04308580979704857, -0.0038710501976311207, 0.22111773490905762, 0.05773710086941719, 0.05785353109240532, -0.08613526821136475, -0.10213851183652878, -0.005613066256046295, 0.012081719934940338, -0.02518763206899166, 0.1347590982913971, 0.06716052442789078, -0.08770186454057693, 0.06284619122743607, 0.06496598571538925, 0.03218849003314972, 0.0460532121360302, -0.015572791919112206, -0.08652149140834808, -0.06421443074941635, 0.03580741956830025, 0.04154277592897415, 0.10352703928947449, -0.0971742570400238, 0.011966770514845848, 0.053324874490499496, 0.012160008773207664, -0.010708327405154705, -0.13099297881126404, 0.007369286846369505, 0.03236635401844978, -0.06726839393377304, 0.009255144745111465, -0.0014201393350958824, -0.009005631320178509, 0.07575973123311996, 0.02694130875170231, -0.011632419191300869, -0.038809727877378464, -0.05752044916152954, -0.08424147963523865, 0.10908413678407669, -0.04683532565832138, -0.17859424650669098, -0.10767985880374908, 0.016504360362887383, -0.02318461239337921, -0.012125837616622448, 0.0336092934012413, -0.07754109054803848, -0.05078224092721939, -0.0742151215672493, -0.020925382152199745, -0.004795278422534466, -0.006304334383457899, 0.01089237816631794, 0.02338317781686783, 0.05612659826874733, -0.0779053345322609, 0.012741001322865486, -0.009301038458943367, -0.030841657891869545, 0.032058969140052795, 0.07376722246408463, 0.09881925582885742, 0.12593990564346313, 0.08283448964357376, 0.023564990609884262, -0.020854147151112556, 0.12293267250061035, -0.15980038046836853, 0.04422811418771744, 0.08611489087343216, -0.016439054161310196, 0.07509159296751022, 0.15979307889938354, 0.018044330179691315, -0.08371995389461517, 0.032410528510808945, 0.06594788283109665, -0.01277302484959364, 
-0.22683848440647125, -0.033268000930547714, -0.03800027817487717, -0.04583568125963211, 0.11345073580741882, 0.042781371623277664, -0.03899125009775162, 0.032627206295728683, -0.033155206590890884, -0.10447148233652115, 0.04309487342834473, 0.07006379216909409, 0.006818100344389677, 0.05282839015126228, 0.10040473192930222, 0.009512493386864662, -0.01282824669033289, 0.08404310792684555, 0.0016330337384715676, 0.20058460533618927, -0.018322598189115524, 0.21219831705093384, 0.0412852019071579, 0.13381201028823853, -0.04606933146715164, -0.007794125936925411, 0.00872649997472763, 0.02775581181049347, 0.008231568150222301, -0.046199969947338104, -0.015940474346280098, 0.03814186528325081, 0.14522212743759155, -0.03812575712800026, -0.05820472165942192, -0.01695915497839451, 0.06387512385845184, 0.34107705950737, 0.1078038364648819, -0.19515341520309448, -0.016179760918021202, 0.040330514311790466, -0.11077722907066345, -0.023868128657341003, -0.033615998923778534, 0.06096278131008148, -0.13404923677444458, 0.08488953113555908, -0.04589870944619179, 0.09054394066333771, -0.13077232241630554, -0.03441154584288597, 0.03324839472770691, 0.039933472871780396, 0.005080641712993383, 0.057716816663742065, -0.1614614874124527, 0.24165336787700653, -0.016767248511314392, 0.034677401185035706, -0.05257319658994675, 0.03153754398226738, 0.005695999134331942, -0.04722769930958748, 0.13252554833889008, 0.01580914855003357, -0.0638098493218422, -0.0703091099858284, -0.13339556753635406, 0.004097762983292341, 0.12205084413290024, -0.18869701027870178, 0.11301221698522568, -0.04039296880364418, -0.037011612206697464, -0.021460644900798798, -0.030690139159560204, -0.10396551340818405, -0.10635758191347122, 0.056769825518131256, -0.06205292046070099, 0.07923262566328049, -0.058313608169555664, -0.05374126136302948, -0.1030394434928894, 0.16206246614456177, -0.08910522609949112, -0.0291118323802948, -0.13747306168079376, 0.01201529148966074, 0.14026303589344025, -0.07793635129928589, 0.005563262850046158, 0.00030049169436097145, 0.12210515886545181, 0.012558715417981148, 0.0186610110104084, 0.08173921704292297, -0.05857912823557854, -0.22260616719722748, -0.06321018189191818, 0.18732969462871552, 0.06446363031864166, 0.057001885026693344, 0.010108201764523983, 0.07691198587417603, -0.029842987656593323, -0.07719600945711136, 0.07627016305923462, 0.010266748256981373, 0.04417702183127403, 0.04358507692813873, 0.024301210418343544, -0.006248853635042906, -0.12517999112606049, -0.06714849174022675, 0.09437934309244156, 0.30073633790016174, -0.09541210532188416, 0.12692569196224213, 0.02804385870695114, -0.051959406584501266, -0.1145988330245018, -0.050353217869997025, 0.08737572282552719, 0.00089839386055246, 0.022498885169625282, -0.16795065999031067, 0.026915322989225388, 0.08120974898338318, -0.034629903733730316, 0.008947945199906826, -0.2522139251232147, -0.11002529412508011, 0.022987913340330124, 0.0546751469373703, -0.1294708102941513, -0.15754830837249756, -0.08585385233163834, -0.02931196801364422, -0.11883798241615295, 0.0612686350941658, 0.00933105032891035, 0.08602786809206009, -0.002161954063922167, -0.00648876465857029, 0.0278810765594244, -0.046112172305583954, 0.17446163296699524, 0.03910207003355026, -0.00010683396249078214, -0.0531030036509037, 0.06312199681997299, -0.010451851412653923, -0.06326824426651001, 0.021749885752797127, -0.061847250908613205, 0.01417572796344757, -0.15765126049518585, -0.026499370113015175, -0.05975566431879997, 0.020700445398688316, -0.045029811561107635, 
-0.037001632153987885, -0.038282718509435654, 0.06639116257429123, 0.1269611418247223, 0.010135170072317123, 0.03014686517417431, -0.061250656843185425, 0.06414075940847397, 0.09507571905851364, 0.13412617146968842, 0.021006163209676743, -0.1307886838912964, -0.020505908876657486, 0.014300811104476452, -0.005891993176192045, -0.09789733588695526, 0.08454427123069763, 0.10091211646795273, 0.03181257098913193, 0.13401402533054352, -0.0037844551261514425, -0.10825872421264648, 0.01145614217966795, 0.08051120489835739, -0.08774115890264511, -0.19364818930625916, 0.011077910661697388, -0.038415879011154175, -0.1252160370349884, -0.039575643837451935, 0.11915716528892517, 0.04265039414167404, -0.026191528886556625, 0.029159249737858772, 0.07397449016571045, 0.012369467876851559, 0.12125011533498764, 0.010600686073303223, 0.0646013468503952, -0.06872910261154175, 0.090727798640728, 0.09062209725379944, -0.11545879393815994, 0.0072171944193542, 0.10679168999195099, -0.06225115433335304, -0.04949899762868881, -0.005539840552955866, 0.020448457449674606, 0.004718017298728228, -0.008242818526923656, -0.08638839423656464, -0.06007027253508568, 0.06441199779510498, 0.01768014207482338, 0.03571541607379913, -0.012249572202563286, 0.05595355108380318, 0.03528624773025513, -0.06014937907457352, 0.09129700064659119, 0.09976962208747864, 0.046319399029016495, -0.07413683086633682, 0.028790554031729698, -0.002517393324524164, -0.0017810394056141376, 0.0055516622960567474, -0.0014562199357897043, -0.07427453249692917, 0.001165010267868638, -0.06695609539747238, -0.025593798607587814, -0.08594843000173569, -0.00017209538782481104, 0.018597284331917763, -0.08322828263044357, -0.03588820993900299, 0.00963593740016222, -0.09216680377721786, -0.07888474315404892, -0.030009128153324127, 0.09175607562065125, -0.15849962830543518, -0.011609422974288464, 0.0980994701385498, -0.10477672517299652, 0.14764109253883362, 0.024595048278570175, 0.030894892290234566, -0.005670323967933655, -0.07396750897169113, 0.009143197908997536, -0.0364244282245636, 0.012715349905192852, 0.039512019604444504, -0.2686346173286438, -0.006503353826701641, -0.03956270590424538, -0.010907962918281555, 0.017218777909874916, 0.05687251314520836, -0.12143019586801529, 0.02527550794184208, -0.014200860634446144, -0.07826241105794907, -0.04651736468076706, 0.018122494220733643, 0.014597469009459019, 0.04382829740643501, 0.14108806848526, -0.03326354920864105, 0.08885668963193893, -0.2012549489736557, 0.0029777928721159697, 0.006015460006892681, 0.01725202053785324, 0.02214118465781212, -0.005781327374279499, 0.08237896859645844, -0.03507865220308304, 0.07526031136512756, -0.03688937425613403, 0.03487466648221016, 0.04275752604007721, -0.0226192194968462, 0.04620000347495079, 0.03209170699119568, 0.04570130258798599, 0.04982151463627815, -0.027659203857183456, 0.061179183423519135, -0.03211992233991623, -0.0094761298969388, 0.02450355514883995, 0.14197686314582825, 0.17730924487113953, 0.08595968782901764, 0.014321083202958107, 0.047303151339292526, -0.10972163081169128, -0.1244816929101944, 0.1427362710237503, -0.09140641987323761, 0.07929010689258575, -0.021130608394742012, 0.22376947104930878, 0.10239353775978088, -0.220474973320961, 0.04937064275145531, 0.003076436696574092, -0.06259892880916595, -0.09908021241426468, -0.14527860283851624, -0.07637634873390198, -0.13463257253170013, 0.031547028571367264, -0.051767174154520035, 0.06293699890375137, 0.03647635877132416, 0.02828483283519745, 0.0313953272998333, 0.0490998737514019, 
0.02089742012321949, -0.01753290370106697, 0.07412134855985641, 0.043226856738328934, -0.02604430355131626, 0.022935248911380768, -0.02297813817858696, -0.006877281703054905, 0.009432089515030384, 0.08085082471370697, -0.007605849299579859, -0.07093200832605362, 0.047002363950014114, 0.007837863638997078, -0.11019586026668549, -0.0014805443352088332, -0.03353166580200195, 0.006409007124602795, 0.11798317730426788, 0.06516832113265991, 0.059387557208538055, -0.027697879821062088, 0.21915143728256226, -0.05787147581577301, -0.005077925976365805, -0.16043104231357574, 0.18394167721271515, -0.04527275636792183, 0.00383192696608603, 0.03914598748087883, -0.11363817006349564, 0.007015299517661333, 0.1485559344291687, 0.12184401601552963, -0.03784231096506119, 0.0055660284124314785, 0.01453680731356144, 0.007465932052582502, 0.010846360586583614, 0.01657312735915184, 0.07400219142436981, 0.029011765494942665, -0.06801223009824753, 0.007952908053994179, -0.040343862026929855, -0.05872790142893791, -0.04160796478390694, 0.1043061688542366, -0.013276549056172371, -0.006956192199140787, -0.03965931385755539, 0.14815092086791992, -0.014929206110537052, -0.227726012468338, 0.056314267218112946, -0.13387347757816315, -0.16326281428337097, -0.024023374542593956, 0.04072747007012367, -0.013822467066347599, 0.08412577211856842, 0.056540198624134064, -0.04322347790002823, 0.16466383635997772, 0.049073923379182816, -0.05548549443483353, -0.06084765866398811, 0.08411964029073715, -0.13493363559246063, 0.20906688272953033, -0.019771745428442955, 0.05448786914348602, 0.12906098365783691, 0.01837257109582424, -0.1285134255886078, 0.011099978350102901, 0.10914571583271027, -0.138834148645401, 0.06188136339187622, 0.2144378423690796, -0.024484850466251373, 0.08354028314352036, 0.09465969353914261, -0.01991436257958412, 0.04010035842657089, -0.07882250845432281, 0.01164520438760519, -0.08685888350009918, 0.054136402904987335, -0.0783240795135498, 0.11096581816673279, 0.17572879791259766, -0.05903157964348793, 0.02007278800010681, -0.06207792088389397, -0.015252471901476383, 0.007621477823704481, 0.14016729593276978, -0.014469738118350506, -0.2216859757900238, 0.04670659452676773, 0.0003154845326207578, 0.07404039055109024, -0.1703401505947113, -0.0827217549085617, 0.032664548605680466, -0.049601852893829346, -0.02063128538429737, 0.13120324909687042, 0.023465555161237717, 0.0028198817744851112, -0.03903503343462944, -0.12922753393650055, -0.00580858439207077, 0.1362031251192093, -0.16378355026245117, -0.01610167697072029 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# medium

This model is a fine-tuned version of [prithivida/parrot_paraphraser_on_T5](https://huggingface.co/prithivida/parrot_paraphraser_on_T5) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6025
- Rouge1: 81.6007
- Rouge2: 75.1196
- Rougel: 81.4213
- Rougelsum: 81.4956
- Gen Len: 32.4286

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 63   | 0.5775          | 65.0748 | 58.8985 | 64.5731 | 63.6249   | 19.0    |
| No log        | 2.0   | 126  | 0.5806          | 74.3055 | 69.2025 | 73.4922 | 73.0941   | 17.8571 |
| No log        | 3.0   | 189  | 0.6025          | 71.3808 | 66.0359 | 70.1235 | 69.4614   | 18.0    |

### Framework versions

- Transformers 4.15.0
- Pytorch 1.10.1+cu113
- Datasets 1.17.0
- Tokenizers 0.10.3
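As a rough usage sketch (not part of the auto-generated card), the fine-tuned checkpoint can be loaded through the `text2text-generation` pipeline. The repository id `Peter/medium` is taken from this record; the input sentence is arbitrary, and the `paraphrase:` task prefix is an assumption carried over from the base parrot paraphraser's convention, so this fine-tune may or may not require it.

```
from transformers import pipeline

# Load the fine-tuned T5 paraphraser by its repository id.
paraphraser = pipeline("text2text-generation", model="Peter/medium")

# The "paraphrase:" prefix mirrors the base model's convention; it may not be needed here.
outputs = paraphraser("paraphrase: The weather is nice today.", max_length=64)
print(outputs[0]["generated_text"])
```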
{"tags": ["generated_from_trainer"], "metrics": ["rouge"], "model-index": [{"name": "medium", "results": []}]}
text2text-generation
Peter/medium
[ "transformers", "pytorch", "t5", "text2text-generation", "generated_from_trainer", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #t5 #text2text-generation #generated_from_trainer #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
medium ====== This model is a fine-tuned version of prithivida/parrot\_paraphraser\_on\_T5 on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 0.6025 * Rouge1: 81.6007 * Rouge2: 75.1196 * Rougel: 81.4213 * Rougelsum: 81.4956 * Gen Len: 32.4286 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 5e-05 * train\_batch\_size: 8 * eval\_batch\_size: 8 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 3.0 ### Training results ### Framework versions * Transformers 4.15.0 * Pytorch 1.10.1+cu113 * Datasets 1.17.0 * Tokenizers 0.10.3
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0", "### Training results", "### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.1+cu113\n* Datasets 1.17.0\n* Tokenizers 0.10.3" ]
[ "TAGS\n#transformers #pytorch #t5 #text2text-generation #generated_from_trainer #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0", "### Training results", "### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.1+cu113\n* Datasets 1.17.0\n* Tokenizers 0.10.3" ]
[ 55, 98, 4, 33 ]
[ "passage: TAGS\n#transformers #pytorch #t5 #text2text-generation #generated_from_trainer #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3.0### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.1+cu113\n* Datasets 1.17.0\n* Tokenizers 0.10.3" ]
[ -0.09481168538331985, 0.02117178402841091, -0.0017619547434151173, 0.11350587755441666, 0.19194161891937256, 0.032162148505449295, 0.10997337102890015, 0.12497401982545853, -0.13132509589195251, 0.008867883123457432, 0.12912076711654663, 0.18235093355178833, 0.0072020478546619415, 0.1202327311038971, -0.0614016093313694, -0.2699034512042999, -0.021079393103718758, 0.045429542660713196, -0.04664776101708412, 0.15629644691944122, 0.0959535539150238, -0.13512589037418365, 0.08791636675596237, -0.015135344117879868, -0.23860995471477509, 0.019010359421372414, 0.02101343870162964, -0.06221241131424904, 0.16022276878356934, 0.010768788866698742, 0.12635283172130585, 0.005642051342874765, 0.0754176527261734, -0.16042304039001465, 0.010048101656138897, 0.055186230689287186, 0.01928005926311016, 0.08525481075048447, 0.06718552112579346, -0.015688879415392876, 0.15114726126194, -0.08074045926332474, 0.044264573603868484, 0.023358115926384926, -0.1371631622314453, -0.1695661097764969, -0.06362754851579666, 0.0023993307258933783, 0.059491198509931564, 0.1181061863899231, -0.015961257740855217, 0.13921360671520233, -0.10421600192785263, 0.11394482851028442, 0.25076723098754883, -0.2743295133113861, -0.06608142703771591, 0.006780393421649933, 0.012303125113248825, 0.10296883434057236, -0.09480949491262436, -0.008223694749176502, 0.05744997784495354, 0.05630088970065117, 0.1309276968240738, -0.03466590866446495, -0.1298322230577469, 0.024306831881403923, -0.1462244838476181, -0.03475652262568474, 0.1472284495830536, 0.015236131846904755, -0.025096463039517403, -0.031158454716205597, -0.07884746789932251, -0.1789293885231018, -0.021738091483712196, -0.025653498247265816, 0.025615422055125237, -0.04110634699463844, -0.07849151641130447, -0.018317261710762978, -0.10012789815664291, -0.076100654900074, -0.06357624381780624, 0.15765008330345154, 0.03936849161982536, 0.005174319725483656, -0.047760382294654846, 0.12392494827508926, 0.012885167263448238, -0.12378443032503128, 0.03607542812824249, 0.019711099565029144, 0.004865070804953575, -0.048828985542058945, -0.08437072485685349, -0.0965413749217987, -0.009983901865780354, 0.07627499848604202, -0.05638772249221802, 0.03354240581393242, 0.029650980606675148, 0.02886849455535412, -0.09465264528989792, 0.18374083936214447, -0.031946975737810135, -0.026983261108398438, 0.003096754662692547, 0.0467812605202198, -0.015764299780130386, -0.01633148454129696, -0.11325499415397644, -0.02777072601020336, 0.1325262486934662, 0.019680047407746315, -0.07960744202136993, 0.08280355483293533, -0.04625368118286133, -0.036998528987169266, -0.03374866768717766, -0.10122791677713394, 0.013921771198511124, -0.02233666181564331, -0.09110786765813828, 0.000780449656303972, 0.020985206589102745, 0.01588434912264347, -0.05361289158463478, 0.11389079689979553, -0.08768787235021591, 0.03917742893099785, -0.10346159338951111, -0.11874084919691086, -0.013504557311534882, -0.08482694625854492, 0.013684607110917568, -0.10472553223371506, -0.1843046098947525, -0.017980681732296944, 0.027882901951670647, -0.022489355877041817, -0.061666786670684814, -0.07245279103517532, -0.07038649171590805, 0.008437630720436573, -0.018873995169997215, 0.16726423799991608, -0.053219959139823914, 0.1262163370847702, 0.053678225725889206, 0.07152419537305832, -0.04732795059680939, 0.06642716377973557, -0.09747033566236496, -0.01748584397137165, -0.17429541051387787, 0.08100750297307968, -0.006947696208953857, 0.06832083314657211, -0.07338175922632217, -0.11296746879816055, -0.008125371299684048, 
0.00406234385445714, 0.08459498733282089, 0.1185075119137764, -0.16024529933929443, -0.08518385142087936, 0.17544114589691162, -0.052708134055137634, -0.1128145083785057, 0.1195659264922142, -0.05778488144278526, 0.06824950873851776, 0.08946157246828079, 0.1653648316860199, 0.05688529089093208, -0.05888305976986885, 0.053451403975486755, -0.014801138080656528, 0.04019962623715401, -0.05165080353617668, 0.055340830236673355, 0.0023012133315205574, -0.029109260067343712, 0.032555077224969864, 0.000514704268425703, 0.05984925106167793, -0.11687158793210983, -0.0859738364815712, -0.04291577637195587, -0.10582634061574936, 0.0712539479136467, 0.06350730359554291, 0.10273315757513046, -0.10966306179761887, -0.0653245598077774, 0.05449528992176056, 0.062409039586782455, -0.06131346896290779, 0.03504926338791847, -0.03919823467731476, 0.05004182830452919, -0.03564603254199028, -0.01821890100836754, -0.20599789917469025, -0.014722786843776703, 0.01147691160440445, 0.07379572838544846, 0.027102092280983925, 0.017585953697562218, 0.07643108814954758, 0.06338059902191162, -0.05965515971183777, -0.02124505676329136, -0.027910461649298668, -0.01988457888364792, -0.14984802901744843, -0.16535784304141998, -0.01296755950897932, -0.007235381752252579, 0.11806824803352356, -0.19882778823375702, 0.03208914026618004, -0.034141093492507935, 0.0665055513381958, 0.004631316754966974, -0.01093627791851759, -0.041293639689683914, 0.10266686230897903, -0.038140688091516495, -0.03990226611495018, 0.08836869150400162, -0.010596075095236301, -0.08191713690757751, -0.03798247501254082, -0.10967857390642166, 0.17396396398544312, 0.13209456205368042, -0.16627243161201477, -0.0949736014008522, -0.003598491894081235, -0.06037876382470131, -0.02646547369658947, -0.041804809123277664, 0.013473873026669025, 0.20640723407268524, -0.0179616566747427, 0.16329164803028107, -0.07467532902956009, -0.04704812541604042, 0.0104299271479249, -0.027563350275158882, 0.05198204889893532, 0.12238317728042603, 0.09478756040334702, -0.07861912995576859, 0.1272115260362625, 0.132711723446846, -0.08160174638032913, 0.16032929718494415, -0.0213566143065691, -0.0871187373995781, 0.006005530711263418, -0.016024520620703697, -0.00930373277515173, 0.045694395899772644, -0.15577293932437897, -0.02054416947066784, 0.020229117944836617, 0.027847521007061005, 0.03696908801794052, -0.22060547769069672, -0.043940648436546326, 0.038365960121154785, -0.03672599047422409, -0.005681907292455435, -0.006043245550245047, 0.02350696176290512, 0.1281362622976303, 0.005372475832700729, -0.05823233351111412, 0.03926514834165573, 0.010895494371652603, -0.08709487318992615, 0.21538732945919037, -0.07861810177564621, -0.1676163673400879, -0.11938098818063736, -0.07357528060674667, -0.05620981752872467, 0.020222118124365807, 0.06209041550755501, -0.11129102855920792, -0.010597548447549343, -0.05174984410405159, 0.06664801388978958, -0.01857038401067257, 0.04377680644392967, -0.005752242635935545, -0.004779257345944643, 0.04693122208118439, -0.10611388087272644, -0.013700070790946484, -0.05641249194741249, -0.08860313147306442, 0.06783302873373032, 0.014386951923370361, 0.11798912286758423, 0.16590328514575958, -0.03672412410378456, 0.019449593499302864, -0.039324767887592316, 0.2403773069381714, -0.06930967420339584, -0.03353988751769066, 0.13763795793056488, 0.00034277187660336494, 0.05080345273017883, 0.08803645521402359, 0.04888160154223442, -0.09523891657590866, 0.03418638929724693, 0.02699834667146206, -0.03723956644535065, -0.22718949615955353, 
-0.039850834757089615, -0.05988422408699989, -0.0264629814773798, 0.0902349054813385, 0.012834153138101101, 0.06642943620681763, 0.07206984609365463, 0.03972363471984863, 0.09257897734642029, -0.04107736423611641, 0.04870772734284401, 0.10475274175405502, 0.04280359670519829, 0.13122893869876862, -0.037548039108514786, -0.08352003246545792, 0.03857198730111122, -0.04109717532992363, 0.2128467559814453, -0.018990900367498398, 0.1139647364616394, 0.04240022227168083, 0.16704285144805908, 0.005928369238972664, 0.11295972019433975, 0.00011103460565209389, -0.04150396212935448, -0.010848309844732285, -0.039271313697099686, -0.0626555010676384, 0.010724241845309734, -0.061913084238767624, 0.06096870079636574, -0.1445244401693344, -0.006219949573278427, 0.05495906248688698, 0.2545137107372284, 0.03778949752449989, -0.33761557936668396, -0.07760155200958252, -0.008506027050316334, -0.0344875231385231, -0.02908979170024395, 0.026034682989120483, 0.10219073295593262, -0.11793122440576553, 0.021932316944003105, -0.06075070798397064, 0.10055583715438843, -0.04143350198864937, 0.05755174532532692, 0.0424555204808712, 0.10031779855489731, -0.006578950211405754, 0.0799185112118721, -0.3496035635471344, 0.2640492022037506, 0.003494138130918145, 0.06547341495752335, -0.08499688655138016, -0.01234249398112297, 0.02636970765888691, 0.05912373960018158, 0.04260120168328285, -0.010134615935385227, 0.01367658656090498, -0.1918168067932129, -0.039810728281736374, 0.031591612845659256, 0.12894496321678162, -0.043000590056180954, 0.1030380055308342, -0.035317663103342056, 0.015777921304106712, 0.06887390464544296, -0.013625889085233212, -0.06795042008161545, -0.07825066149234772, -0.013335734605789185, 0.020368628203868866, -0.0087128272280097, -0.04966169223189354, -0.11079785227775574, -0.10215825587511063, 0.15412960946559906, 0.02154727838933468, -0.044799719005823135, -0.12425380945205688, 0.0868522897362709, 0.06893201917409897, -0.08681359887123108, 0.03298986330628395, 0.01911384053528309, 0.04963945224881172, 0.032767195254564285, -0.0791042149066925, 0.11310597509145737, -0.04780052974820137, -0.15962521731853485, -0.049424830824136734, 0.11775127798318863, 0.02728339098393917, 0.05981019139289856, -0.019022081047296524, 0.0018485739128664136, -0.04327719286084175, -0.0835416316986084, 0.021042121574282646, -0.020711233839392662, 0.07468231767416, 0.052323341369628906, -0.0700971856713295, 0.018044419586658478, -0.07931572198867798, -0.03436040133237839, 0.2224484235048294, 0.20575015246868134, -0.07769695669412613, 0.01299369242042303, 0.028788169845938683, -0.06822866946458817, -0.19206488132476807, 0.04958324506878853, 0.07662253081798553, 0.01590069569647312, 0.0314638577401638, -0.1830989569425583, 0.10295435041189194, 0.09418802708387375, 0.015388633124530315, 0.11325845867395401, -0.3410303294658661, -0.13259775936603546, 0.10082561522722244, 0.16026099026203156, 0.15516060590744019, -0.137332022190094, -0.011572190560400486, -0.03180493414402008, -0.11805304139852524, 0.09698802977800369, -0.07130748778581619, 0.13147273659706116, -0.038213904947042465, 0.12131280452013016, 0.01777317374944687, -0.04707927629351616, 0.09150560945272446, 0.01964927837252617, 0.09049562364816666, -0.07382499426603317, -0.028856215998530388, 0.038854796439409256, -0.03219088166952133, 0.009531461633741856, -0.06700441986322403, 0.03592539578676224, -0.1071716845035553, -0.026328183710575104, -0.09779366105794907, 0.03560638055205345, -0.026917055249214172, -0.07546509057283401, -0.01659161038696766, 
-0.004465328063815832, 0.04299411550164223, -0.017872493714094162, 0.09014017134904861, -0.007603462785482407, 0.1700720638036728, 0.09189116209745407, 0.08964252471923828, -0.06980089098215103, -0.025684582069516182, -0.014674215577542782, -0.00654322886839509, 0.053375210613012314, -0.12657855451107025, 0.020331600680947304, 0.1455584615468979, 0.020442942157387733, 0.1375977247953415, 0.09336310625076294, -0.011292417533695698, 0.012452822178602219, 0.06459634751081467, -0.1869724988937378, -0.05624253675341606, -0.04350292682647705, -0.1041007861495018, -0.09033986926078796, 0.04841326177120209, 0.09579458832740784, -0.0659259706735611, -0.01515856385231018, -0.03185243159532547, -0.014011272229254246, -0.07294955104589462, 0.2066422700881958, 0.0520397387444973, 0.051766619086265564, -0.10349788516759872, 0.0671515241265297, 0.04116910323500633, -0.05603218451142311, 0.011541900224983692, 0.10808540135622025, -0.08944372087717056, -0.03693709895014763, 0.10429093986749649, 0.18689276278018951, -0.07035931944847107, -0.019922763109207153, -0.13207541406154633, -0.1156141385436058, 0.08132845908403397, 0.17239350080490112, 0.10157082229852676, 0.015166059136390686, -0.06370856612920761, 0.010011548176407814, -0.14029797911643982, 0.0676608756184578, 0.04361489415168762, 0.06473258882761002, -0.1262138932943344, 0.2129068821668625, -0.0023154988884925842, 0.05622854828834534, -0.032706547528505325, 0.01485752034932375, -0.09636211395263672, 0.025363964959979057, -0.11750105023384094, -0.05829314887523651, 0.00550372339785099, -0.005969870835542679, -0.01697765477001667, -0.05740463733673096, -0.04936344921588898, 0.008320130407810211, -0.1236921176314354, -0.02742655761539936, 0.02781510353088379, 0.03827868402004242, -0.10171448439359665, -0.03487246856093407, 0.017410883679986, -0.05639156699180603, 0.07157368212938309, 0.07272446900606155, -0.00397211080417037, 0.07397554069757462, -0.14302252233028412, -0.0014245655620470643, 0.06810510158538818, 0.0013467124663293362, 0.06783849745988846, -0.04092089831829071, -0.00019400421297177672, -0.0006666174158453941, 0.0894797071814537, 0.030939558520913124, 0.07306297868490219, -0.13365669548511505, 0.01056858990341425, -0.017262492328882217, -0.08875098079442978, -0.0759480744600296, 0.03398287296295166, 0.06452367454767227, 0.014256913214921951, 0.17435051500797272, -0.07850529998540878, 0.053400855511426926, -0.21076060831546783, -0.003998906351625919, -0.009947346523404121, -0.11615818738937378, -0.1159333884716034, -0.08193597197532654, 0.08008071035146713, -0.0465516559779644, 0.1195061132311821, 0.037890005856752396, 0.06628413498401642, 0.016623271629214287, -0.01517210528254509, 0.012761242687702179, 0.01627812162041664, 0.2208070009946823, 0.045408159494400024, -0.05073279142379761, 0.05992203578352928, 0.07405541092157364, 0.11193659156560898, 0.13940300047397614, 0.2224476933479309, 0.12880825996398926, -0.02368859015405178, 0.08911356329917908, 0.0059840925969183445, -0.03681480512022972, -0.15476152300834656, -0.012430203147232533, -0.03753632307052612, 0.08563307672739029, -0.03138217702507973, 0.21224510669708252, 0.0715666115283966, -0.15467925369739532, 0.0458824448287487, -0.04589717462658882, -0.09304409474134445, -0.11271622031927109, -0.024510199204087257, -0.08150521665811539, -0.14637918770313263, -0.0017043891130015254, -0.12085560709238052, 0.03487730398774147, 0.10958603024482727, 0.014659814536571503, -0.032230447977781296, 0.1838097721338272, 0.04333769902586937, -0.0014148838818073273, 0.07942980527877808, 
-0.007793256547302008, -0.001440678839571774, -0.10760854929685593, -0.07167036086320877, -0.017038287594914436, -0.006172166671603918, 0.041762929409742355, -0.059143658727407455, -0.10183639079332352, 0.024307886138558388, -0.039584968239068985, -0.10471055656671524, 0.016559500247240067, 0.02525143139064312, 0.06163470819592476, 0.04203315079212189, 0.008842836134135723, -0.007609372492879629, -0.023922763764858246, 0.24543364346027374, -0.08171042054891586, -0.09485754370689392, -0.0832013487815857, 0.28547653555870056, 0.058503080159425735, 0.0029951762408018112, 0.029948681592941284, -0.05947394296526909, -0.00011488139716675505, 0.2696416676044464, 0.1996007114648819, -0.10822167247533798, -0.011048302054405212, -0.0022105444222688675, -0.004552507307380438, 0.00027193230926059186, 0.13925723731517792, 0.13621555268764496, 0.03859839215874672, -0.0941309705376625, -0.009482625871896744, -0.04882151260972023, -0.008866713382303715, -0.035473261028528214, 0.07729371637105942, 0.054773349314928055, 0.004833593964576721, -0.041704390197992325, 0.062479808926582336, -0.08464624732732773, -0.06336059421300888, 0.02046426571905613, -0.20803695917129517, -0.1532631367444992, -0.01966339722275734, 0.08709615468978882, -0.0017618100391700864, 0.07855791598558426, -0.025279490277171135, 0.0008226468344219029, 0.04883004352450371, -0.01757540926337242, -0.08788279443979263, -0.07555310428142548, 0.09612346440553665, -0.14820130169391632, 0.153781458735466, -0.04548148810863495, 0.05575745180249214, 0.11575129628181458, 0.06168179586529732, -0.0601874440908432, 0.07953494787216187, 0.027354605495929718, -0.05673285946249962, 0.03921723738312721, 0.12012418359518051, -0.031347405165433884, 0.02379903756082058, 0.033149704337120056, -0.14218832552433014, 0.02787954919040203, -0.0635097324848175, -0.033344775438308716, -0.028578000143170357, -0.05791750177741051, -0.05122306942939758, 0.12336182594299316, 0.24016666412353516, -0.02336074411869049, 0.03678346797823906, -0.09382587671279907, -0.003732639132067561, 0.04556947574019432, 0.03595653921365738, -0.08449915796518326, -0.239898681640625, -0.020331205800175667, 0.10976365953683853, -0.037680208683013916, -0.2640017569065094, -0.0746028944849968, -0.00925681833177805, -0.07084798812866211, -0.10927806049585342, 0.09528275579214096, 0.07152994722127914, 0.048561904579401016, -0.03508083149790764, -0.08657675236463547, -0.07220696657896042, 0.1815720796585083, -0.15900059044361115, -0.09171221405267715 ]
null
null
transformers
How to use this classifier:

```
from transformers import pipeline

pipe = pipeline("text-classification", model="Peterard/distilbert_bug_classifier")

pipe("The app crashed when I opened it this morning. Can you fix this please?")
# [{'label': 'bug', 'score': 0.9042391180992126}]

pipe("Please add a like button!")
# [{'label': 'no_bug', 'score': 0.9977496266365051}]
```

N.B. The returned label will change depending on which class is likelier.
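Since only the likelier label is returned by default, a small follow-up sketch (an illustration on top of the card's example, not part of it) can surface the scores for both classes at once. It assumes a `transformers` version that still accepts `return_all_scores`; newer releases expose the same behaviour via `top_k=None`.

```
from transformers import pipeline

pipe = pipeline("text-classification", model="Peterard/distilbert_bug_classifier")

# Ask the pipeline for every class score instead of only the top label.
pipe("The app crashed when I opened it this morning.", return_all_scores=True)
# e.g. [[{'label': 'bug', 'score': ...}, {'label': 'no_bug', 'score': ...}]]
```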
{"language": ["en"], "tags": ["text-classification"], "widget": [{"text": "The app crashed when I opened it this morning. Can you fix this please?", "example_title": "Likely bug report"}, {"text": "Please add a like button!", "example_title": "Unlikely bug report"}]}
text-classification
Peterard/distilbert_bug_classifier
[ "transformers", "pytorch", "distilbert", "text-classification", "en", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #distilbert #text-classification #en #autotrain_compatible #endpoints_compatible #region-us
How to use this classifier: N.B. The label will change depending on which is the likelier class
[]
[ "TAGS\n#transformers #pytorch #distilbert #text-classification #en #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 40 ]
[ "passage: TAGS\n#transformers #pytorch #distilbert #text-classification #en #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.030263155698776245, 0.06142469495534897, -0.007828718982636929, 0.031225133687257767, 0.20358900725841522, 0.03562045469880104, 0.06820938736200333, 0.10621555149555206, 0.046547118574380875, -0.03041197545826435, 0.10762520134449005, 0.24417153000831604, -0.04405980184674263, 0.10668811202049255, -0.13684993982315063, -0.2992449104785919, 0.05461152642965317, 0.06972608715295792, 0.019786885008215904, 0.11697592586278915, 0.09657807648181915, -0.08509010821580887, 0.06882943212985992, -0.03293928876519203, -0.1298491358757019, 0.036017950624227524, 0.04350459203124046, -0.12731637060642242, 0.09460487961769104, 0.04593615233898163, 0.15812821686267853, 0.0355747751891613, -0.05005868896842003, -0.14156650006771088, 0.03690672665834427, -0.0006296871579252183, -0.0856127068400383, 0.04123389348387718, 0.07759469002485275, -0.13290037214756012, 0.03528701141476631, 0.011388486251235008, 0.025332359597086906, 0.05358390510082245, -0.14048027992248535, -0.060039494186639786, -0.011409671045839787, 0.02808109112083912, 0.08155128359794617, 0.05861373245716095, -0.0022044978104531765, 0.11903467029333115, -0.14006628096103668, 0.13705608248710632, 0.0901946946978569, -0.2644253969192505, -0.01771933026611805, 0.09116377681493759, 0.014223599806427956, 0.026300830766558647, -0.048665717244148254, 0.043455518782138824, 0.03333137556910515, 0.009980334900319576, -0.008841044269502163, -0.06316559761762619, -0.1063011884689331, 0.01929975487291813, -0.07819320261478424, -0.03992309421300888, 0.20191062986850739, -0.06435723602771759, 0.06485437601804733, -0.04038913547992706, -0.09595746546983719, -0.04962761327624321, -0.022241804748773575, 0.022322459146380424, -0.05282851681113243, 0.06954709440469742, 0.05512898787856102, 0.013156321831047535, -0.11389792710542679, 0.02637670747935772, -0.22496022284030914, 0.22504958510398865, 0.009354439564049244, 0.039859019219875336, -0.16803084313869476, 0.06276097893714905, 0.039958465844392776, -0.11207303404808044, 0.056973814964294434, -0.11018376797437668, 0.03054177574813366, -0.0463448129594326, -0.074897401034832, -0.04163983091711998, 0.06743098050355911, 0.14662505686283112, 0.02229440025985241, 0.059233490377664566, -0.024306248873472214, 0.08634810894727707, 0.04027532413601875, 0.12439558655023575, 0.05027192458510399, -0.03145049512386322, 0.03387458622455597, -0.13308480381965637, 0.005861266050487757, -0.06835510581731796, -0.15658490359783173, -0.03929687663912773, 0.05809059366583824, 0.07422714680433273, 0.011625424958765507, 0.08938200026750565, -0.07416994124650955, -0.0391068160533905, 0.08653721213340759, -0.08362367004156113, 0.01947680115699768, 0.014862960204482079, 0.025524474680423737, 0.11352068930864334, -0.018859121948480606, 0.003606675658375025, -0.08291155844926834, 0.15589872002601624, -0.05057518556714058, 0.015474442392587662, -0.025058483704924583, -0.07152311503887177, 0.02337280660867691, -0.16568222641944885, 0.025256413966417313, -0.16909989714622498, -0.1069999486207962, 0.005717745516449213, 0.014878419227898121, -0.0008330399286933243, -0.036604128777980804, -0.03987476974725723, 0.006646536756306887, 0.053508173674345016, -0.04938657209277153, -0.08383119851350784, -0.07425942271947861, 0.10212457180023193, -0.040582749992609024, 0.07869743555784225, -0.1332322061061859, 0.07407139241695404, -0.09826333820819855, -0.023219501599669456, -0.13337667286396027, 0.054010991007089615, -0.03815111890435219, 0.16977332532405853, 0.015805315226316452, -0.05599640682339668, -0.06094007566571236, 0.05316924676299095, 
-0.07098347693681717, 0.18161314725875854, -0.10509146004915237, -0.1173260286450386, 0.18308968842029572, -0.09320890158414841, -0.11175129562616348, 0.09577787667512894, -0.011982210911810398, 0.010894029401242733, 0.10540283471345901, 0.18580509722232819, 0.10825429111719131, 0.008659704588353634, 0.07095866650342941, 0.13008862733840942, -0.1083240807056427, -0.08912111818790436, -0.014514505863189697, -0.01269503589719534, -0.11888642609119415, 0.06122331693768501, 0.09042013436555862, 0.06765656918287277, -0.045390792191028595, -0.03802425041794777, -0.011823293752968311, 0.0034562728833407164, 0.15137794613838196, 0.05996451526880264, 0.10812651365995407, -0.08040177077054977, 0.006106241140514612, -0.005405459087342024, -0.010139516554772854, 0.027093039825558662, 0.03003935143351555, -0.06289456784725189, 0.11618275195360184, 0.02852674014866352, 0.02462056279182434, -0.23809216916561127, -0.054451342672109604, -0.017107846215367317, 0.1539931297302246, -0.019735248759388924, 0.10584146529436111, 0.044061433523893356, -0.0534224659204483, -0.01847780868411064, -0.03025803156197071, 0.18334323167800903, 0.020164934918284416, -0.06658107787370682, -0.07247137278318405, 0.06914103776216507, -0.06794727593660355, 0.015054309740662575, -0.0702219307422638, 0.019724063575267792, 0.09367301315069199, 0.12772999703884125, 0.005332667380571365, 0.07141829282045364, -0.02805221639573574, 0.06816677004098892, -0.0744733065366745, 0.025232575833797455, 0.11231277137994766, -0.013354175724089146, -0.07708635181188583, 0.14260734617710114, -0.13251598179340363, 0.27290013432502747, 0.20147250592708588, -0.308954119682312, -0.0002741122734732926, -0.046477749943733215, -0.008966922760009766, 0.025395488366484642, 0.028706394135951996, 0.016624918207526207, 0.09583847969770432, -0.001388208824209869, 0.20148183405399323, -0.02271384559571743, -0.040046464651823044, -0.013341426849365234, -0.0486057884991169, -0.04287879168987274, 0.09208986163139343, 0.07037974148988724, -0.20744186639785767, 0.18730397522449493, 0.23234853148460388, 0.021411703899502754, 0.16628305613994598, -0.004486382007598877, 0.041393134742975235, 0.08818282186985016, -0.046370550990104675, -0.02722138725221157, -0.0775606706738472, -0.17754293978214264, -0.03468171879649162, 0.07536408305168152, 0.02784700319170952, 0.06804012507200241, -0.10648555308580399, -0.029398543760180473, -0.00015891376824583858, 0.024012576788663864, -0.019450269639492035, 0.09450309723615646, 0.08008977770805359, 0.11033430695533752, -0.012795131653547287, -0.07719901949167252, 0.11784295737743378, -0.0038315728306770325, -0.07454754412174225, 0.1794985681772232, -0.15731453895568848, -0.3637458086013794, -0.1597631275653839, -0.20231474936008453, -0.031238146126270294, 0.06496544182300568, 0.10954587906599045, -0.12770387530326843, -0.045281004160642624, 0.039824262261390686, -0.004428360145539045, -0.05144230276346207, 0.040548697113990784, -0.057808078825473785, 0.07712304592132568, -0.05236544832587242, -0.06135454401373863, -0.0684986412525177, -0.03339593857526779, 0.0026919327210634947, 0.16469484567642212, -0.13133996725082397, 0.0660826787352562, 0.17590445280075073, -0.0010816800640895963, 0.06545384973287582, -0.0342937670648098, 0.16947664320468903, -0.09666881710290909, -0.0241252351552248, 0.18548855185508728, -0.0750376284122467, 0.07672172784805298, 0.15906785428524017, 0.01976652815937996, -0.06716727465391159, 0.03439202532172203, -0.03450751304626465, -0.08740083873271942, -0.20452843606472015, -0.17190618813037872, 
-0.1161089688539505, 0.050680458545684814, 0.06675353646278381, 0.06575100868940353, 0.13196499645709991, 0.06618184596300125, 0.0021888669580221176, 0.011328908614814281, 0.00765720009803772, 0.07721061259508133, 0.26029154658317566, -0.0064701815135777, 0.1471407115459442, -0.05724647268652916, -0.13205792009830475, 0.08839341998100281, -0.00588693143799901, 0.09658820927143097, 0.09181024879217148, 0.01989448443055153, 0.005359832663089037, 0.05494320020079613, 0.16711965203285217, 0.12444712966680527, 0.04792121797800064, -0.012074846774339676, -0.012951315380632877, 0.0023802220821380615, -0.0812477394938469, 0.009363747201859951, 0.08584925532341003, -0.1352575719356537, -0.07951557636260986, -0.10397672653198242, 0.09759071469306946, 0.08390431851148605, 0.04624316468834877, -0.19875068962574005, 0.0028780000284314156, 0.09073681384325027, -0.0245381910353899, -0.09682867676019669, 0.07193125039339066, -0.049251850694417953, -0.13661248981952667, 0.10380406677722931, -0.029474228620529175, 0.13350795209407806, -0.08837705105543137, 0.08774659037590027, -0.043939415365457535, -0.11300360411405563, 0.02974116988480091, 0.10863682627677917, -0.282967746257782, 0.21104222536087036, 0.012414379976689816, -0.06731738895177841, -0.0788622498512268, -0.023850418627262115, 0.04115491360425949, 0.22505968809127808, 0.07667795568704605, -0.0007896318566054106, -0.05748160183429718, -0.17885781824588776, -0.016879668459296227, -0.00720368605107069, 0.1283119022846222, -0.036770064383745193, -0.017478061839938164, -0.04114226996898651, -0.02355293557047844, -0.032694485038518906, -0.06353391706943512, 0.027469707652926445, -0.180666983127594, 0.058036863803863525, 0.027986537665128708, 0.04891983047127724, 0.017836343497037888, -0.05570787191390991, -0.12792225182056427, 0.19400614500045776, -0.11497315764427185, -0.09164152294397354, -0.11928042769432068, -0.07590245455503464, 0.023730456829071045, -0.08194214105606079, 0.055993128567934036, -0.07894369959831238, 0.021859334781765938, -0.0714351236820221, -0.19925670325756073, 0.1184455156326294, -0.09836841374635696, -0.04297866299748421, -0.05764048919081688, 0.15988494455814362, -0.07424475252628326, 0.015269812196493149, 0.030198032036423683, 0.021205859258770943, -0.08731099963188171, -0.08568815886974335, -0.005570706445723772, 0.02228299155831337, 0.06642195582389832, 0.08024309575557709, -0.09448964893817902, -0.07822077721357346, -0.03557729721069336, 0.018052948638796806, 0.2873334586620331, 0.16235223412513733, -0.06314227730035782, 0.16269038617610931, 0.10906673222780228, -0.07356047630310059, -0.3344048857688904, -0.09303104877471924, -0.10742867738008499, -0.035388536751270294, -0.041579071432352066, -0.15310171246528625, 0.12332756072282791, -0.003724881215021014, -0.011436771601438522, 0.08037535846233368, -0.16064272820949554, -0.08713050931692123, 0.1972457766532898, -0.04186456277966499, 0.37765276432037354, -0.08878311514854431, -0.0969403013586998, -0.07088949531316757, -0.12908510863780975, 0.11937101185321808, 0.010763918980956078, 0.07990390062332153, -0.024705033749341965, 0.0446200855076313, 0.04709387943148613, -0.03400902450084686, 0.10232236981391907, 0.023076383396983147, 0.024283677339553833, -0.11925613135099411, -0.10387106239795685, 0.022791672497987747, -0.012036630883812904, -0.0161875132471323, -0.016324473544955254, 0.014912768267095089, -0.1688709855079651, -0.04393862187862396, -0.06819365173578262, 0.06376270204782486, 0.02937050350010395, -0.03960207477211952, 0.009105977602303028, 
-0.021468253806233406, -0.0005557039403356612, 0.0010501400101929903, 0.25543326139450073, -0.044152382761240005, 0.1598699539899826, 0.08079736679792404, 0.15581634640693665, -0.1504744440317154, 0.021779799833893776, -0.07902948558330536, -0.0544290766119957, 0.06146973744034767, -0.06453204154968262, 0.07392577081918716, 0.13759030401706696, -0.05295651778578758, 0.06989207863807678, 0.11197321116924286, 0.07687966525554657, -0.03207192197442055, 0.15982303023338318, -0.2364213615655899, 0.03902840241789818, -0.05223481357097626, -0.04157787188887596, 0.06940407305955887, 0.06270080804824829, 0.12716440856456757, 0.061003971844911575, -0.04190092533826828, 0.0056634871289134026, 0.004191338550299406, 0.00015420839190483093, 0.06481848657131195, 0.05343453586101532, 0.03788289055228233, -0.14204254746437073, 0.04975409805774689, 0.053507789969444275, -0.16259104013442993, -0.02058786153793335, 0.1300233006477356, -0.1718578189611435, -0.12348631024360657, -0.028840677812695503, 0.11952333897352219, -0.10093004256486893, -0.04885261133313179, -0.07258017361164093, -0.1308302879333496, 0.07141022384166718, 0.19740551710128784, 0.1317022442817688, 0.09363874793052673, -0.060135141015052795, -0.05302693694829941, -0.0024800957180559635, -0.0035575581714510918, 0.011535931378602982, 0.025270795449614525, -0.1193637102842331, 0.040646836161613464, -0.020342646166682243, 0.15505008399486542, -0.09454428404569626, -0.0766717717051506, -0.15661756694316864, 0.04113609716296196, -0.09291493892669678, -0.023629169911146164, -0.08887065201997757, -0.016758760437369347, 0.0017545223236083984, -0.03403700888156891, -0.02871173433959484, -0.06411834061145782, -0.11183390021324158, 0.04118733108043671, -0.02028043381869793, 0.04490689933300018, -0.06806670874357224, -0.04521825537085533, 0.09757805615663528, -0.033758942037820816, 0.10629785060882568, 0.10498406738042831, -0.09222667664289474, 0.09211854636669159, -0.13728487491607666, -0.1252969205379486, 0.1381920427083969, 0.03196767345070839, 0.07015617191791534, 0.07479675859212875, 0.03168288245797157, 0.07343006879091263, 0.00634493213146925, 0.07170924544334412, 0.06975588202476501, -0.11967138946056366, 0.07242695987224579, -0.038700684905052185, -0.17693042755126953, -0.05847214162349701, -0.03839130699634552, 0.09957719594240189, 0.008366605266928673, 0.17321918904781342, -0.06036517396569252, 0.09879463165998459, -0.02578139491379261, 0.012558653950691223, -0.016774464398622513, -0.204813152551651, -0.06251320242881775, -0.08108261227607727, 0.025232501327991486, 0.006283313035964966, 0.2376510053873062, 0.058470383286476135, 0.03221756964921951, 0.0556790828704834, 0.08779588341712952, -0.011330999433994293, 0.020034777000546455, 0.16821785271167755, 0.09506379812955856, -0.05277571827173233, -0.05695256218314171, 0.05582910403609276, 0.030278123915195465, 0.0047790962271392345, 0.13251441717147827, 0.07431180030107498, -0.03340243175625801, 0.07314082235097885, -0.036665819585323334, 0.04202885553240776, -0.12904013693332672, -0.15144476294517517, -0.04380800947546959, 0.06399941444396973, 0.023600280284881592, 0.036251578480005264, 0.07424605637788773, -0.02542693540453911, 0.049799952656030655, -0.04248998314142227, -0.052378177642822266, -0.18300984799861908, -0.0950779989361763, -0.0934150218963623, -0.09030137211084366, 0.007287723943591118, -0.07546073198318481, -0.007786547765135765, 0.06312061846256256, 0.045175835490226746, -0.05834583193063736, 0.06992648541927338, -0.0007173772901296616, -0.05887741968035698, 
0.08711178600788116, -0.04485640302300453, 0.028817761689424515, -0.014015390537679195, -0.02569112740457058, -0.13743257522583008, -0.015988269820809364, -0.042475756257772446, 0.03658660873770714, -0.057898107916116714, -0.0037632854655385017, -0.14483381807804108, -0.12311024963855743, -0.023238301277160645, 0.05911669880151749, -0.05785064399242401, 0.14310266077518463, 0.017736153677105904, 0.015139907598495483, 0.047134146094322205, 0.17469047009944916, -0.0500209778547287, -0.06555446982383728, -0.04282984510064125, 0.23555348813533783, 0.08490831404924393, 0.11101780086755753, 0.001315528410486877, -0.015490680932998657, -0.0767117440700531, 0.2848365902900696, 0.28511664271354675, -0.05533486232161522, 0.05298805609345436, 0.007059083785861731, 0.03426504507660866, 0.15706349909305573, 0.1228085458278656, 0.09367413818836212, 0.23328857123851776, -0.058263979852199554, -0.05195705592632294, -0.03035363368690014, -0.026237159967422485, -0.11869654059410095, 0.05929537117481232, 0.047956325113773346, -0.0459466427564621, -0.06476959586143494, 0.11065966635942459, -0.2118443101644516, 0.1567011922597885, 0.012754237279295921, -0.20724669098854065, -0.06622550636529922, -0.026176853105425835, 0.1372288316488266, -0.0013158525107428432, 0.07490751147270203, -0.004395733587443829, -0.1221928596496582, 0.03371364250779152, 0.015106147155165672, -0.2105216085910797, 0.004209205973893404, 0.06140859052538872, -0.059376075863838196, -0.0044571403414011, -0.025803573429584503, 0.04453223571181297, 0.06678083539009094, 0.07606784999370575, -0.004920254927128553, 0.04413740709424019, -0.007435242645442486, -0.027969831600785255, 0.023203881457448006, 0.024484550580382347, 0.01151964906603098, -0.09357373416423798, 0.07158253341913223, -0.16089771687984467, 0.06013020500540733, -0.05596740543842316, -0.06262783706188202, -0.01719663105905056, 0.054713405668735504, -0.05921980366110802, 0.050090838223695755, 0.10549213737249374, 0.014515814371407032, -0.029376978054642677, -0.0473344586789608, -0.04020359367132187, -0.012046885676681995, -0.1302250474691391, -0.14249619841575623, -0.09525707364082336, -0.08742735534906387, 0.09667260199785233, 0.00016934289305936545, -0.14103811979293823, -0.007254057098180056, -0.10203728079795837, 0.05028759315609932, -0.16541196405887604, 0.09065710753202438, 0.03150828555226326, 0.017963040620088577, -0.009992615319788456, -0.03196872025728226, 0.04678196832537651, 0.0735187977552414, -0.12647606432437897, -0.08617570996284485 ]
null
null
transformers
How to use this classifier:
```
from transformers import pipeline

pipe = pipeline("text-classification", model="Peterard/distilbert_feature_classifier")

pipe("Please add a like button!")
# [{'label': 'feature_request', 'score': 0.8930749893188477}]

pipe("The app crashed when I opened it this morning. Can you fix this please?")
# [{'label': 'no_feature_request', 'score': 0.9971746206283569}]
```
N.B. The label will change depending on which is the likelier class.
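A minimal sketch of batch usage, assuming the same pipeline API as above; the review texts below are made-up examples, and only the label name `feature_request` shown in the card's output is relied on.
```
from transformers import pipeline

# Hypothetical app-store reviews, used purely for illustration.
reviews = [
    "Please add a like button!",
    "The app crashed when I opened it this morning. Can you fix this please?",
    "Would love a dark mode for night-time reading.",
]

pipe = pipeline("text-classification", model="Peterard/distilbert_feature_classifier")

# The pipeline accepts a list and returns one {'label', 'score'} dict per input,
# so likely feature requests can be filtered out in a single pass.
results = pipe(reviews)
feature_requests = [text for text, r in zip(reviews, results) if r["label"] == "feature_request"]
print(feature_requests)
```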
{"language": ["en"], "tags": ["text-classification"], "widget": [{"text": "Please add a like button!", "example_title": "Likely feature request"}, {"text": "The app crashed when I opened it this morning. Can you fix this please?", "example_title": "Unlikely feature request"}]}
text-classification
Peterard/distilbert_feature_classifier
[ "transformers", "pytorch", "distilbert", "text-classification", "en", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #distilbert #text-classification #en #autotrain_compatible #endpoints_compatible #region-us
How to use this classifier: N.B. The label will change depending on which is the likelier class.
[]
[ "TAGS\n#transformers #pytorch #distilbert #text-classification #en #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 40 ]
[ "passage: TAGS\n#transformers #pytorch #distilbert #text-classification #en #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.030263155698776245, 0.06142469495534897, -0.007828718982636929, 0.031225133687257767, 0.20358900725841522, 0.03562045469880104, 0.06820938736200333, 0.10621555149555206, 0.046547118574380875, -0.03041197545826435, 0.10762520134449005, 0.24417153000831604, -0.04405980184674263, 0.10668811202049255, -0.13684993982315063, -0.2992449104785919, 0.05461152642965317, 0.06972608715295792, 0.019786885008215904, 0.11697592586278915, 0.09657807648181915, -0.08509010821580887, 0.06882943212985992, -0.03293928876519203, -0.1298491358757019, 0.036017950624227524, 0.04350459203124046, -0.12731637060642242, 0.09460487961769104, 0.04593615233898163, 0.15812821686267853, 0.0355747751891613, -0.05005868896842003, -0.14156650006771088, 0.03690672665834427, -0.0006296871579252183, -0.0856127068400383, 0.04123389348387718, 0.07759469002485275, -0.13290037214756012, 0.03528701141476631, 0.011388486251235008, 0.025332359597086906, 0.05358390510082245, -0.14048027992248535, -0.060039494186639786, -0.011409671045839787, 0.02808109112083912, 0.08155128359794617, 0.05861373245716095, -0.0022044978104531765, 0.11903467029333115, -0.14006628096103668, 0.13705608248710632, 0.0901946946978569, -0.2644253969192505, -0.01771933026611805, 0.09116377681493759, 0.014223599806427956, 0.026300830766558647, -0.048665717244148254, 0.043455518782138824, 0.03333137556910515, 0.009980334900319576, -0.008841044269502163, -0.06316559761762619, -0.1063011884689331, 0.01929975487291813, -0.07819320261478424, -0.03992309421300888, 0.20191062986850739, -0.06435723602771759, 0.06485437601804733, -0.04038913547992706, -0.09595746546983719, -0.04962761327624321, -0.022241804748773575, 0.022322459146380424, -0.05282851681113243, 0.06954709440469742, 0.05512898787856102, 0.013156321831047535, -0.11389792710542679, 0.02637670747935772, -0.22496022284030914, 0.22504958510398865, 0.009354439564049244, 0.039859019219875336, -0.16803084313869476, 0.06276097893714905, 0.039958465844392776, -0.11207303404808044, 0.056973814964294434, -0.11018376797437668, 0.03054177574813366, -0.0463448129594326, -0.074897401034832, -0.04163983091711998, 0.06743098050355911, 0.14662505686283112, 0.02229440025985241, 0.059233490377664566, -0.024306248873472214, 0.08634810894727707, 0.04027532413601875, 0.12439558655023575, 0.05027192458510399, -0.03145049512386322, 0.03387458622455597, -0.13308480381965637, 0.005861266050487757, -0.06835510581731796, -0.15658490359783173, -0.03929687663912773, 0.05809059366583824, 0.07422714680433273, 0.011625424958765507, 0.08938200026750565, -0.07416994124650955, -0.0391068160533905, 0.08653721213340759, -0.08362367004156113, 0.01947680115699768, 0.014862960204482079, 0.025524474680423737, 0.11352068930864334, -0.018859121948480606, 0.003606675658375025, -0.08291155844926834, 0.15589872002601624, -0.05057518556714058, 0.015474442392587662, -0.025058483704924583, -0.07152311503887177, 0.02337280660867691, -0.16568222641944885, 0.025256413966417313, -0.16909989714622498, -0.1069999486207962, 0.005717745516449213, 0.014878419227898121, -0.0008330399286933243, -0.036604128777980804, -0.03987476974725723, 0.006646536756306887, 0.053508173674345016, -0.04938657209277153, -0.08383119851350784, -0.07425942271947861, 0.10212457180023193, -0.040582749992609024, 0.07869743555784225, -0.1332322061061859, 0.07407139241695404, -0.09826333820819855, -0.023219501599669456, -0.13337667286396027, 0.054010991007089615, -0.03815111890435219, 0.16977332532405853, 0.015805315226316452, -0.05599640682339668, -0.06094007566571236, 0.05316924676299095, 
-0.07098347693681717, 0.18161314725875854, -0.10509146004915237, -0.1173260286450386, 0.18308968842029572, -0.09320890158414841, -0.11175129562616348, 0.09577787667512894, -0.011982210911810398, 0.010894029401242733, 0.10540283471345901, 0.18580509722232819, 0.10825429111719131, 0.008659704588353634, 0.07095866650342941, 0.13008862733840942, -0.1083240807056427, -0.08912111818790436, -0.014514505863189697, -0.01269503589719534, -0.11888642609119415, 0.06122331693768501, 0.09042013436555862, 0.06765656918287277, -0.045390792191028595, -0.03802425041794777, -0.011823293752968311, 0.0034562728833407164, 0.15137794613838196, 0.05996451526880264, 0.10812651365995407, -0.08040177077054977, 0.006106241140514612, -0.005405459087342024, -0.010139516554772854, 0.027093039825558662, 0.03003935143351555, -0.06289456784725189, 0.11618275195360184, 0.02852674014866352, 0.02462056279182434, -0.23809216916561127, -0.054451342672109604, -0.017107846215367317, 0.1539931297302246, -0.019735248759388924, 0.10584146529436111, 0.044061433523893356, -0.0534224659204483, -0.01847780868411064, -0.03025803156197071, 0.18334323167800903, 0.020164934918284416, -0.06658107787370682, -0.07247137278318405, 0.06914103776216507, -0.06794727593660355, 0.015054309740662575, -0.0702219307422638, 0.019724063575267792, 0.09367301315069199, 0.12772999703884125, 0.005332667380571365, 0.07141829282045364, -0.02805221639573574, 0.06816677004098892, -0.0744733065366745, 0.025232575833797455, 0.11231277137994766, -0.013354175724089146, -0.07708635181188583, 0.14260734617710114, -0.13251598179340363, 0.27290013432502747, 0.20147250592708588, -0.308954119682312, -0.0002741122734732926, -0.046477749943733215, -0.008966922760009766, 0.025395488366484642, 0.028706394135951996, 0.016624918207526207, 0.09583847969770432, -0.001388208824209869, 0.20148183405399323, -0.02271384559571743, -0.040046464651823044, -0.013341426849365234, -0.0486057884991169, -0.04287879168987274, 0.09208986163139343, 0.07037974148988724, -0.20744186639785767, 0.18730397522449493, 0.23234853148460388, 0.021411703899502754, 0.16628305613994598, -0.004486382007598877, 0.041393134742975235, 0.08818282186985016, -0.046370550990104675, -0.02722138725221157, -0.0775606706738472, -0.17754293978214264, -0.03468171879649162, 0.07536408305168152, 0.02784700319170952, 0.06804012507200241, -0.10648555308580399, -0.029398543760180473, -0.00015891376824583858, 0.024012576788663864, -0.019450269639492035, 0.09450309723615646, 0.08008977770805359, 0.11033430695533752, -0.012795131653547287, -0.07719901949167252, 0.11784295737743378, -0.0038315728306770325, -0.07454754412174225, 0.1794985681772232, -0.15731453895568848, -0.3637458086013794, -0.1597631275653839, -0.20231474936008453, -0.031238146126270294, 0.06496544182300568, 0.10954587906599045, -0.12770387530326843, -0.045281004160642624, 0.039824262261390686, -0.004428360145539045, -0.05144230276346207, 0.040548697113990784, -0.057808078825473785, 0.07712304592132568, -0.05236544832587242, -0.06135454401373863, -0.0684986412525177, -0.03339593857526779, 0.0026919327210634947, 0.16469484567642212, -0.13133996725082397, 0.0660826787352562, 0.17590445280075073, -0.0010816800640895963, 0.06545384973287582, -0.0342937670648098, 0.16947664320468903, -0.09666881710290909, -0.0241252351552248, 0.18548855185508728, -0.0750376284122467, 0.07672172784805298, 0.15906785428524017, 0.01976652815937996, -0.06716727465391159, 0.03439202532172203, -0.03450751304626465, -0.08740083873271942, -0.20452843606472015, -0.17190618813037872, 
-0.1161089688539505, 0.050680458545684814, 0.06675353646278381, 0.06575100868940353, 0.13196499645709991, 0.06618184596300125, 0.0021888669580221176, 0.011328908614814281, 0.00765720009803772, 0.07721061259508133, 0.26029154658317566, -0.0064701815135777, 0.1471407115459442, -0.05724647268652916, -0.13205792009830475, 0.08839341998100281, -0.00588693143799901, 0.09658820927143097, 0.09181024879217148, 0.01989448443055153, 0.005359832663089037, 0.05494320020079613, 0.16711965203285217, 0.12444712966680527, 0.04792121797800064, -0.012074846774339676, -0.012951315380632877, 0.0023802220821380615, -0.0812477394938469, 0.009363747201859951, 0.08584925532341003, -0.1352575719356537, -0.07951557636260986, -0.10397672653198242, 0.09759071469306946, 0.08390431851148605, 0.04624316468834877, -0.19875068962574005, 0.0028780000284314156, 0.09073681384325027, -0.0245381910353899, -0.09682867676019669, 0.07193125039339066, -0.049251850694417953, -0.13661248981952667, 0.10380406677722931, -0.029474228620529175, 0.13350795209407806, -0.08837705105543137, 0.08774659037590027, -0.043939415365457535, -0.11300360411405563, 0.02974116988480091, 0.10863682627677917, -0.282967746257782, 0.21104222536087036, 0.012414379976689816, -0.06731738895177841, -0.0788622498512268, -0.023850418627262115, 0.04115491360425949, 0.22505968809127808, 0.07667795568704605, -0.0007896318566054106, -0.05748160183429718, -0.17885781824588776, -0.016879668459296227, -0.00720368605107069, 0.1283119022846222, -0.036770064383745193, -0.017478061839938164, -0.04114226996898651, -0.02355293557047844, -0.032694485038518906, -0.06353391706943512, 0.027469707652926445, -0.180666983127594, 0.058036863803863525, 0.027986537665128708, 0.04891983047127724, 0.017836343497037888, -0.05570787191390991, -0.12792225182056427, 0.19400614500045776, -0.11497315764427185, -0.09164152294397354, -0.11928042769432068, -0.07590245455503464, 0.023730456829071045, -0.08194214105606079, 0.055993128567934036, -0.07894369959831238, 0.021859334781765938, -0.0714351236820221, -0.19925670325756073, 0.1184455156326294, -0.09836841374635696, -0.04297866299748421, -0.05764048919081688, 0.15988494455814362, -0.07424475252628326, 0.015269812196493149, 0.030198032036423683, 0.021205859258770943, -0.08731099963188171, -0.08568815886974335, -0.005570706445723772, 0.02228299155831337, 0.06642195582389832, 0.08024309575557709, -0.09448964893817902, -0.07822077721357346, -0.03557729721069336, 0.018052948638796806, 0.2873334586620331, 0.16235223412513733, -0.06314227730035782, 0.16269038617610931, 0.10906673222780228, -0.07356047630310059, -0.3344048857688904, -0.09303104877471924, -0.10742867738008499, -0.035388536751270294, -0.041579071432352066, -0.15310171246528625, 0.12332756072282791, -0.003724881215021014, -0.011436771601438522, 0.08037535846233368, -0.16064272820949554, -0.08713050931692123, 0.1972457766532898, -0.04186456277966499, 0.37765276432037354, -0.08878311514854431, -0.0969403013586998, -0.07088949531316757, -0.12908510863780975, 0.11937101185321808, 0.010763918980956078, 0.07990390062332153, -0.024705033749341965, 0.0446200855076313, 0.04709387943148613, -0.03400902450084686, 0.10232236981391907, 0.023076383396983147, 0.024283677339553833, -0.11925613135099411, -0.10387106239795685, 0.022791672497987747, -0.012036630883812904, -0.0161875132471323, -0.016324473544955254, 0.014912768267095089, -0.1688709855079651, -0.04393862187862396, -0.06819365173578262, 0.06376270204782486, 0.02937050350010395, -0.03960207477211952, 0.009105977602303028, 
-0.021468253806233406, -0.0005557039403356612, 0.0010501400101929903, 0.25543326139450073, -0.044152382761240005, 0.1598699539899826, 0.08079736679792404, 0.15581634640693665, -0.1504744440317154, 0.021779799833893776, -0.07902948558330536, -0.0544290766119957, 0.06146973744034767, -0.06453204154968262, 0.07392577081918716, 0.13759030401706696, -0.05295651778578758, 0.06989207863807678, 0.11197321116924286, 0.07687966525554657, -0.03207192197442055, 0.15982303023338318, -0.2364213615655899, 0.03902840241789818, -0.05223481357097626, -0.04157787188887596, 0.06940407305955887, 0.06270080804824829, 0.12716440856456757, 0.061003971844911575, -0.04190092533826828, 0.0056634871289134026, 0.004191338550299406, 0.00015420839190483093, 0.06481848657131195, 0.05343453586101532, 0.03788289055228233, -0.14204254746437073, 0.04975409805774689, 0.053507789969444275, -0.16259104013442993, -0.02058786153793335, 0.1300233006477356, -0.1718578189611435, -0.12348631024360657, -0.028840677812695503, 0.11952333897352219, -0.10093004256486893, -0.04885261133313179, -0.07258017361164093, -0.1308302879333496, 0.07141022384166718, 0.19740551710128784, 0.1317022442817688, 0.09363874793052673, -0.060135141015052795, -0.05302693694829941, -0.0024800957180559635, -0.0035575581714510918, 0.011535931378602982, 0.025270795449614525, -0.1193637102842331, 0.040646836161613464, -0.020342646166682243, 0.15505008399486542, -0.09454428404569626, -0.0766717717051506, -0.15661756694316864, 0.04113609716296196, -0.09291493892669678, -0.023629169911146164, -0.08887065201997757, -0.016758760437369347, 0.0017545223236083984, -0.03403700888156891, -0.02871173433959484, -0.06411834061145782, -0.11183390021324158, 0.04118733108043671, -0.02028043381869793, 0.04490689933300018, -0.06806670874357224, -0.04521825537085533, 0.09757805615663528, -0.033758942037820816, 0.10629785060882568, 0.10498406738042831, -0.09222667664289474, 0.09211854636669159, -0.13728487491607666, -0.1252969205379486, 0.1381920427083969, 0.03196767345070839, 0.07015617191791534, 0.07479675859212875, 0.03168288245797157, 0.07343006879091263, 0.00634493213146925, 0.07170924544334412, 0.06975588202476501, -0.11967138946056366, 0.07242695987224579, -0.038700684905052185, -0.17693042755126953, -0.05847214162349701, -0.03839130699634552, 0.09957719594240189, 0.008366605266928673, 0.17321918904781342, -0.06036517396569252, 0.09879463165998459, -0.02578139491379261, 0.012558653950691223, -0.016774464398622513, -0.204813152551651, -0.06251320242881775, -0.08108261227607727, 0.025232501327991486, 0.006283313035964966, 0.2376510053873062, 0.058470383286476135, 0.03221756964921951, 0.0556790828704834, 0.08779588341712952, -0.011330999433994293, 0.020034777000546455, 0.16821785271167755, 0.09506379812955856, -0.05277571827173233, -0.05695256218314171, 0.05582910403609276, 0.030278123915195465, 0.0047790962271392345, 0.13251441717147827, 0.07431180030107498, -0.03340243175625801, 0.07314082235097885, -0.036665819585323334, 0.04202885553240776, -0.12904013693332672, -0.15144476294517517, -0.04380800947546959, 0.06399941444396973, 0.023600280284881592, 0.036251578480005264, 0.07424605637788773, -0.02542693540453911, 0.049799952656030655, -0.04248998314142227, -0.052378177642822266, -0.18300984799861908, -0.0950779989361763, -0.0934150218963623, -0.09030137211084366, 0.007287723943591118, -0.07546073198318481, -0.007786547765135765, 0.06312061846256256, 0.045175835490226746, -0.05834583193063736, 0.06992648541927338, -0.0007173772901296616, -0.05887741968035698, 
0.08711178600788116, -0.04485640302300453, 0.028817761689424515, -0.014015390537679195, -0.02569112740457058, -0.13743257522583008, -0.015988269820809364, -0.042475756257772446, 0.03658660873770714, -0.057898107916116714, -0.0037632854655385017, -0.14483381807804108, -0.12311024963855743, -0.023238301277160645, 0.05911669880151749, -0.05785064399242401, 0.14310266077518463, 0.017736153677105904, 0.015139907598495483, 0.047134146094322205, 0.17469047009944916, -0.0500209778547287, -0.06555446982383728, -0.04282984510064125, 0.23555348813533783, 0.08490831404924393, 0.11101780086755753, 0.001315528410486877, -0.015490680932998657, -0.0767117440700531, 0.2848365902900696, 0.28511664271354675, -0.05533486232161522, 0.05298805609345436, 0.007059083785861731, 0.03426504507660866, 0.15706349909305573, 0.1228085458278656, 0.09367413818836212, 0.23328857123851776, -0.058263979852199554, -0.05195705592632294, -0.03035363368690014, -0.026237159967422485, -0.11869654059410095, 0.05929537117481232, 0.047956325113773346, -0.0459466427564621, -0.06476959586143494, 0.11065966635942459, -0.2118443101644516, 0.1567011922597885, 0.012754237279295921, -0.20724669098854065, -0.06622550636529922, -0.026176853105425835, 0.1372288316488266, -0.0013158525107428432, 0.07490751147270203, -0.004395733587443829, -0.1221928596496582, 0.03371364250779152, 0.015106147155165672, -0.2105216085910797, 0.004209205973893404, 0.06140859052538872, -0.059376075863838196, -0.0044571403414011, -0.025803573429584503, 0.04453223571181297, 0.06678083539009094, 0.07606784999370575, -0.004920254927128553, 0.04413740709424019, -0.007435242645442486, -0.027969831600785255, 0.023203881457448006, 0.024484550580382347, 0.01151964906603098, -0.09357373416423798, 0.07158253341913223, -0.16089771687984467, 0.06013020500540733, -0.05596740543842316, -0.06262783706188202, -0.01719663105905056, 0.054713405668735504, -0.05921980366110802, 0.050090838223695755, 0.10549213737249374, 0.014515814371407032, -0.029376978054642677, -0.0473344586789608, -0.04020359367132187, -0.012046885676681995, -0.1302250474691391, -0.14249619841575623, -0.09525707364082336, -0.08742735534906387, 0.09667260199785233, 0.00016934289305936545, -0.14103811979293823, -0.007254057098180056, -0.10203728079795837, 0.05028759315609932, -0.16541196405887604, 0.09065710753202438, 0.03150828555226326, 0.017963040620088577, -0.009992615319788456, -0.03196872025728226, 0.04678196832537651, 0.0735187977552414, -0.12647606432437897, -0.08617570996284485 ]
null
null
transformers
Attempt at guided text generation to replace GPT-3 for [This SCP Does Not Exist](https://www.thisscpdoesnotexist.ml). Work in progress.

Finetuned on a dataset of 1700 automatically generated samples from the [official SCP wiki](https://scp-wiki.wikidot.com/).

Example input:
```Prompt: SCP-9741 is a pair of jeans that looks really cool ### Generation: Item #: SCP-9741\nObject Class: Safe\nSpecial Containment Procedures:```

# Acknowledgment
This work was made possible thanks to the TPU Research Cloud program by Google
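A minimal sketch of how the example prompt above could be sent to the model through the standard transformers text-generation pipeline; the sampling settings are illustrative assumptions rather than values from the card, and loading a 6B-parameter GPT-J checkpoint needs a large GPU or tens of GB of RAM.
```
from transformers import pipeline

# Prompt format copied from the example above: the article body is expected
# to be generated after the "### Generation:" marker.
prompt = (
    "Prompt: SCP-9741 is a pair of jeans that looks really cool "
    "### Generation: Item #: SCP-9741\nObject Class: Safe\nSpecial Containment Procedures:"
)

generator = pipeline("text-generation", model="PhilSad/GPT-J6B-Guided-SCP")

# Sampling parameters below are assumptions chosen for illustration.
out = generator(prompt, max_new_tokens=200, do_sample=True, temperature=0.9)
print(out[0]["generated_text"])
```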
{}
text-generation
PhilSad/GPT-J6B-Guided-SCP
[ "transformers", "pytorch", "gptj", "text-generation", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gptj #text-generation #autotrain_compatible #endpoints_compatible #region-us
Attempt at guided text generation to replace GPT-3 for This SCP Does Not Exist. Work in progress. Finetuned on a dataset of 1700 automatically generated samples from the official SCP wiki. Example input: # Acknowledgment This work was made possible thanks to the TPU Research Cloud program by Google
[ "# Acknowledgment\nThis work was made possible thanks to the TPU Research Cloud program by Google" ]
[ "TAGS\n#transformers #pytorch #gptj #text-generation #autotrain_compatible #endpoints_compatible #region-us \n", "# Acknowledgment\nThis work was made possible thanks to the TPU Research Cloud program by Google" ]
[ 38, 21 ]
[ "passage: TAGS\n#transformers #pytorch #gptj #text-generation #autotrain_compatible #endpoints_compatible #region-us \n# Acknowledgment\nThis work was made possible thanks to the TPU Research Cloud program by Google" ]
[ -0.024371201172471046, 0.05903816223144531, -0.0028124635573476553, 0.018461579456925392, 0.16297662258148193, 0.02733231894671917, 0.09063077718019485, 0.09853512793779373, -0.00023744598729535937, -0.04026171192526817, 0.1293771117925644, 0.17198412120342255, -0.024018289521336555, 0.16860219836235046, 0.0019264533184468746, -0.2796424329280853, 0.014408999122679234, 0.06525120139122009, -0.04426693543791771, 0.1546781063079834, 0.07507900148630142, -0.03270578011870384, 0.0985766127705574, -0.055867213755846024, -0.14342761039733887, 0.023368556052446365, -0.03298696130514145, -0.12935508787631989, 0.1221444383263588, 0.09227225184440613, -0.013317910023033619, 0.047771766781806946, -0.06233665719628334, -0.0013327825581654906, 0.04630555957555771, 0.015457804314792156, -0.1085234060883522, 0.06521216779947281, 0.05379747226834297, -0.046448856592178345, 0.07707510888576508, 0.08468256890773773, -0.02809249795973301, 0.015822572633624077, -0.08518794178962708, -0.05197906866669655, -0.03889414668083191, 0.07008858770132065, -0.0339813306927681, 0.07926998287439346, 0.008689058013260365, 0.2552625238895416, -0.10759928822517395, 0.1213732659816742, 0.11094172298908234, -0.3149651288986206, -0.07172098755836487, 0.09400510787963867, 0.02129456028342247, 0.00007614540663780645, 0.07860946655273438, 0.0641942098736763, 0.009401951916515827, 0.036617498844861984, -0.04506543278694153, -0.036777473986148834, -0.20106424391269684, 0.02091635763645172, -0.054101500660181046, -0.07309684157371521, 0.3396308124065399, -0.02082221955060959, 0.03987697884440422, -0.011477315798401833, -0.121939517557621, 0.0442245788872242, 0.01958089880645275, -0.04294233024120331, 0.024728374555706978, 0.049553122371435165, -0.06274263560771942, -0.08647715300321579, -0.09333770722150803, -0.0386396124958992, -0.2042955607175827, 0.1827349066734314, 0.027645979076623917, 0.030656931921839714, -0.14052626490592957, 0.140210822224617, 0.09348098188638687, -0.051739491522312164, 0.018038606271147728, -0.06813730299472809, 0.0630078986287117, -0.023075489327311516, -0.045373596251010895, -0.10375157743692398, 0.11042828112840652, 0.19566591084003448, 0.13679006695747375, -0.035574767738580704, 0.007419528439640999, 0.08326689898967743, 0.034333210438489914, 0.1731671839952469, -0.15659749507904053, -0.07842420786619186, 0.1438014954328537, -0.1944277286529541, 0.0219289418309927, -0.017481619492173195, -0.09797665476799011, -0.08681502193212509, 0.05544193089008331, 0.08652471005916595, 0.09129726886749268, 0.1527673900127411, -0.01574183627963066, -0.04340650513768196, 0.08485942333936691, -0.05518287792801857, -0.058289580047130585, -0.06329753994941711, -0.00794022623449564, 0.022364763543009758, 0.07983123511075974, -0.0013489777920767665, -0.15438395738601685, 0.027490129694342613, -0.04181988164782524, -0.028483860194683075, 0.06012819707393646, -0.019464099779725075, 0.054176799952983856, -0.10028229653835297, 0.040026210248470306, -0.24527645111083984, -0.12552960216999054, -0.028754398226737976, 0.025338374078273773, -0.03317669406533241, -0.0890774354338646, 0.010640092194080353, 0.02687167376279831, -0.02568693831562996, -0.06545542180538177, 0.07767786085605621, -0.06202174350619316, 0.08821766823530197, -0.12061537802219391, 0.071317158639431, -0.13301923871040344, 0.06260339170694351, -0.12163538485765457, -0.06903690099716187, -0.17207802832126617, 0.06803708523511887, 0.0070165228098630905, 0.16918574273586273, -0.08324950933456421, 0.008257247507572174, -0.1141475960612297, 0.010598991066217422, 
0.001350768026895821, 0.142528235912323, -0.06871803849935532, -0.06496082246303558, 0.15318050980567932, -0.0612323172390461, -0.2038857638835907, 0.141135111451149, 0.053006354719400406, 0.14877137541770935, 0.06453049182891846, 0.14218027889728546, 0.02461063861846924, 0.022079013288021088, 0.029576128348708153, 0.09238644689321518, -0.1728203296661377, -0.10236305743455887, -0.03786533698439598, -0.004860784392803907, -0.17360705137252808, 0.03416364639997482, 0.08068916946649551, 0.13349829614162445, -0.052656423300504684, 0.008115043863654137, -0.08038096129894257, -0.04138235002756119, -0.008268528617918491, 0.010129444301128387, 0.12953194975852966, 0.004919479135423899, -0.017040712758898735, -0.07181579619646072, -0.03457850217819214, -0.0188837181776762, 0.06600426137447357, -0.04593851417303085, 0.17272324860095978, -0.03837769106030464, 0.09556546062231064, -0.16716118156909943, -0.07858249545097351, 0.03670696169137955, 0.05248064920306206, 0.014248695224523544, -0.0021166461519896984, 0.03524821251630783, -0.07983569800853729, -0.010928872972726822, -0.004727649502456188, 0.12291072309017181, 0.005625995807349682, -0.07556550949811935, -0.02464998885989189, 0.09388000518083572, 0.0157276913523674, -0.08601206541061401, 0.06961565464735031, -0.025496920570731163, 0.07212824374437332, 0.1433817744255066, 0.0012566561345010996, 0.06069871038198471, 0.03629313409328461, 0.015798455104231834, -0.07005127519369125, -0.043325673788785934, 0.06297089904546738, 0.035313550382852554, -0.08589546382427216, 0.2889280617237091, 0.004529744852334261, 0.31375253200531006, 0.1896577775478363, -0.2573337256908417, -0.024492282420396805, 0.08306215703487396, -0.07099301367998123, -0.036239299923181534, 0.025180425494909286, 0.025331880897283554, 0.08282046020030975, -0.011949404142796993, 0.15216876566410065, -0.06870798021554947, -0.037640511989593506, 0.020357485860586166, -0.01995348185300827, 0.023439403623342514, 0.06730629503726959, 0.09516075998544693, -0.1643179953098297, 0.10570064932107925, 0.1425165981054306, 0.0748366042971611, 0.14217157661914825, -0.004407436121255159, -0.0015655969036743045, 0.07509929686784744, 0.01202362310141325, -0.05899824574589729, -0.03976377472281456, -0.24058383703231812, -0.00494825653731823, 0.07418125122785568, 0.0064412858337163925, 0.10527661442756653, -0.07718084752559662, -0.01932908035814762, -0.05536724627017975, 0.013319108635187149, -0.024515556171536446, 0.15118315815925598, -0.011201421730220318, 0.11510051041841507, -0.00541179534047842, -0.0030943695455789566, 0.10223602503538132, 0.04262804985046387, -0.06924940645694733, 0.14646084606647491, -0.021766457706689835, -0.3771916329860687, -0.07943850755691528, 0.002153865760192275, 0.0037797794211655855, 0.015003364533185959, 0.10312891751527786, -0.04687877371907234, -0.01247576903551817, 0.024018241092562675, 0.031187837943434715, 0.07469133287668228, 0.04013362154364586, -0.09163732081651688, 0.048920463770627975, -0.03724586218595505, -0.07053858041763306, -0.021424507722258568, -0.04815690964460373, -0.08201505988836288, 0.1752379983663559, -0.14819909632205963, 0.13036902248859406, 0.04851791635155678, -0.03755806386470795, 0.037863172590732574, -0.02892463468015194, 0.200224831700325, -0.10053690522909164, 0.04027491435408592, 0.19646283984184265, 0.025385480374097824, 0.007391199003905058, 0.04080430045723915, -0.01439693197607994, -0.09827548265457153, 0.05731682479381561, -0.0801839828491211, -0.09825558215379715, -0.2704831063747406, -0.140217587351799, -0.07784096896648407, 
0.10054877400398254, 0.04188981652259827, 0.09768955409526825, 0.028490563854575157, 0.1047758236527443, -0.01596500352025032, 0.1727268248796463, -0.03453733026981354, 0.06223973631858826, 0.17175815999507904, -0.035107504576444626, 0.07099632918834686, -0.0833534300327301, -0.09215694665908813, 0.1560179740190506, 0.06400280445814133, 0.09037991613149643, 0.0074267094023525715, -0.014697782695293427, -0.0017502573318779469, 0.1421845406293869, 0.09941201657056808, 0.1390262246131897, 0.07778328657150269, 0.010732414200901985, -0.027987699955701828, -0.004467882681638002, -0.07057444751262665, 0.009265282191336155, 0.0776081308722496, -0.1643393337726593, 0.03378854691982269, -0.10468042641878128, 0.0970892384648323, 0.21651875972747803, 0.012132233940064907, -0.17755557596683502, -0.03399720415472984, -0.02767438068985939, -0.06213413551449776, -0.11367172747850418, 0.0822305753827095, -0.002374502597376704, -0.14664849638938904, 0.03270489349961281, -0.0336495116353035, 0.0970073938369751, -0.10221930593252182, 0.01680372841656208, 0.06828252226114273, -0.15746508538722992, 0.014727706089615822, 0.11641380190849304, -0.28907179832458496, 0.1772783398628235, -0.026361769065260887, -0.07887831330299377, -0.11196370422840118, -0.03403807431459427, 0.024202199652791023, 0.17659416794776917, 0.13310515880584717, 0.01967671886086464, 0.19402916729450226, 0.007116100285202265, -0.16487711668014526, 0.09669636934995651, 0.06545831263065338, -0.0798691138625145, -0.05501004680991173, -0.004776970949023962, -0.028687365353107452, -0.05116773024201393, -0.0023422318045049906, -0.06734851002693176, -0.07617932558059692, 0.04557148739695549, 0.0022012183908373117, 0.15466032922267914, -0.000466050609247759, -0.06590039283037186, -0.10977032035589218, 0.1841905117034912, 0.07108995318412781, -0.08500559628009796, -0.10172822326421738, -0.04376012459397316, 0.056743375957012177, -0.06336012482643127, 0.06050772964954376, -0.03543241694569588, -0.10614433139562607, 0.029608504846692085, -0.20423153042793274, 0.15588216483592987, -0.11998052895069122, -0.08769097924232483, -0.03941795974969864, 0.07760030776262283, -0.09148901700973511, -0.0151649359613657, -0.027252864092588425, -0.014354336075484753, -0.13342557847499847, -0.07803382724523544, 0.0038939581718295813, 0.005434300284832716, 0.07576601952314377, 0.05513060837984085, -0.020816879346966743, 0.03322712704539299, 0.03819723799824715, -0.04024427384138107, 0.21134600043296814, -0.006940107326954603, -0.04143553599715233, 0.13525864481925964, 0.10795314610004425, -0.0009819997940212488, -0.37396666407585144, -0.10885731875896454, -0.0942593365907669, -0.06608187407255173, -0.07264385372400284, -0.16234207153320312, 0.09136362373828888, -0.001876214169897139, -0.03835832327604294, 0.06971564143896103, -0.1637963503599167, -0.11681708693504333, 0.1404029279947281, 0.015031574293971062, 0.33318623900413513, -0.07273908704519272, -0.04602430760860443, -0.04483554884791374, -0.1487194150686264, 0.08742917329072952, -0.02311203069984913, 0.10290182381868362, -0.057626210153102875, 0.026485031470656395, 0.010299752466380596, -0.04590124636888504, -0.013738114386796951, -0.028431853279471397, -0.028309546411037445, -0.12449493259191513, 0.028241269290447235, 0.04071783274412155, 0.021243803203105927, 0.07971294224262238, 0.08043944835662842, 0.09899422526359558, -0.01589236408472061, -0.06048944219946861, -0.07873322069644928, 0.07672658562660217, 0.05519368499517441, -0.06814529001712799, 0.0051350281573832035, -0.039909727871418, 
-0.00787616427987814, 0.020886629819869995, 0.11220258474349976, 0.01998806558549404, 0.02795044519007206, -0.00529484311118722, 0.11075346171855927, -0.15750350058078766, 0.010182972997426987, -0.024092989042401314, -0.08429992198944092, 0.08015679568052292, -0.10205870121717453, 0.04003873094916344, 0.07467251271009445, -0.03889421373605728, -0.02076813019812107, 0.0846269428730011, -0.0008209355873987079, -0.026957329362630844, 0.09489886462688446, -0.2746712565422058, -0.049480780959129333, -0.12640707194805145, -0.21142978966236115, 0.15372216701507568, 0.14216190576553345, 0.13317136466503143, -0.04233933985233307, -0.08056238293647766, 0.018354976549744606, -0.016164304688572884, -0.0019425032660365105, 0.07947104424238205, 0.022043902426958084, -0.053265154361724854, -0.1462561935186386, 0.12999530136585236, 0.021125907078385353, -0.12871120870113373, -0.020783530548214912, 0.03837203234434128, -0.18365421891212463, -0.13954712450504303, -0.04673442244529724, 0.00520308455452323, -0.10357686132192612, -0.06009266525506973, -0.059623464941978455, -0.029257938265800476, 0.11815045773983002, 0.1258719265460968, 0.07615703344345093, 0.1289501041173935, -0.03850118815898895, -0.044269632548093796, 0.018125958740711212, -0.009050409309566021, 0.014588851481676102, 0.05770781636238098, -0.10906096547842026, -0.021916979923844337, -0.0026567501481622458, 0.1953972578048706, -0.11093021929264069, -0.05163634940981865, -0.12448990345001221, 0.015488997101783752, -0.08276887983083725, -0.05266059935092926, 0.0011038156226277351, 0.005152032244950533, -0.026591619476675987, -0.05812401697039604, -0.04947176203131676, 0.0007591086905449629, -0.09160716086626053, -0.004669036250561476, 0.00015699485084041953, 0.057557977735996246, -0.030378567054867744, 0.019464384764432907, 0.07310622930526733, -0.022081520408391953, 0.2218608558177948, 0.10253463685512543, -0.09413586556911469, 0.04525277018547058, -0.14568230509757996, -0.09291842579841614, 0.08551612496376038, 0.01078109536319971, 0.03862406313419342, 0.05387735366821289, 0.050085872411727905, 0.06509354710578918, -0.011079531162977219, 0.038085564970970154, 0.07201862335205078, -0.13613632321357727, 0.01231598574668169, -0.04898760840296745, -0.08548270165920258, -0.027684476226568222, -0.053624313324689865, -0.04312868416309357, 0.08982008695602417, 0.03963112086057663, -0.03583543375134468, 0.006555132567882538, -0.08812438696622849, -0.012055540457367897, 0.008763449266552925, -0.1471368819475174, -0.13546006381511688, -0.04392387345433235, 0.033005230128765106, 0.011332339607179165, 0.28640443086624146, 0.1185433492064476, -0.015381881967186928, -0.010858838446438313, 0.16242939233779907, -0.05713411048054695, -0.031146621331572533, 0.1476735919713974, 0.020652176812291145, -0.00018855574307963252, -0.11120708286762238, 0.0738053247332573, -0.009828781709074974, -0.02742053009569645, 0.01826411671936512, -0.022162457928061485, -0.06285585463047028, 0.07442419975996017, -0.03635237738490105, 0.02756352722644806, -0.09509553015232086, -0.13818798959255219, -0.054839327931404114, 0.15403544902801514, -0.04842167720198631, -0.00030992995016276836, 0.03553200140595436, 0.04278063029050827, -0.02813749387860298, 0.0127112977206707, 0.004723445512354374, -0.14399363100528717, -0.14804576337337494, -0.10669958591461182, -0.15282706916332245, 0.038221072405576706, -0.0557498037815094, -0.007318528834730387, 0.015776989981532097, 0.05258086696267128, -0.08931880444288254, 0.07137978076934814, 0.02341284789144993, -0.09839364886283875, 
0.11709816753864288, -0.026667235419154167, -0.02166474051773548, -0.05854152888059616, -0.04829225316643715, -0.13733746111392975, 0.03802111744880676, -0.03495420143008232, 0.05417131260037422, -0.008179404772818089, 0.022364752367138863, -0.1050250306725502, -0.041361004114151, -0.0759655088186264, 0.06763270497322083, -0.09293095767498016, -0.016386721283197403, 0.01628749631345272, -0.0038547010626643896, 0.07661963254213333, 0.26765185594558716, -0.029462702572345734, -0.053766973316669464, -0.0710987076163292, 0.14806224405765533, 0.032926108688116074, 0.0590149350464344, 0.024420635774731636, -0.0097359549254179, -0.018485764041543007, 0.29065558314323425, 0.27045854926109314, -0.02188192494213581, 0.03707775101065636, 0.09626719355583191, 0.013997110538184643, 0.10654124617576599, 0.1988918036222458, 0.14678508043289185, 0.16707207262516022, -0.018769625574350357, -0.07051502913236618, -0.032293546944856644, -0.01862049102783203, -0.030978435650467873, 0.13521884381771088, 0.015458769164979458, -0.08360765874385834, -0.05327145755290985, 0.09997937828302383, -0.20464560389518738, 0.018478592857718468, 0.008791100233793259, -0.14634637534618378, -0.0710763931274414, -0.020650796592235565, 0.060716211795806885, 0.0022657986264675856, 0.07510358840227127, -0.007994676008820534, -0.046088919043540955, 0.08523430675268173, 0.01143558882176876, -0.24064958095550537, 0.039132338017225266, 0.043689075857400894, -0.08093655109405518, 0.09184882789850235, -0.028052344918251038, 0.05581655353307724, 0.039972465485334396, 0.07532355189323425, -0.08357775956392288, 0.06588403135538101, -0.044620390981435776, -0.03550497815012932, 0.06954734772443771, -0.013211834244430065, 0.029602088034152985, -0.13476045429706573, 0.03260549530386925, -0.023609736934304237, 0.05970325320959091, -0.03206878900527954, 0.04870638623833656, -0.11610785126686096, 0.05158614739775658, -0.07279878854751587, 0.08793652802705765, 0.05670952796936035, -0.032734524458646774, 0.02634839527308941, -0.12907956540584564, -0.012066320516169071, -0.01927660033106804, -0.13579021394252777, -0.03928663581609726, -0.0657440647482872, -0.0895342156291008, -0.036951933056116104, -0.0029576863162219524, -0.04203831031918526, 0.031158247962594032, -0.11854046583175659, 0.009354302659630775, -0.12183170765638351, 0.1039230227470398, 0.06940555572509766, 0.02401784434914589, 0.019275855273008347, 0.08550485223531723, 0.03977425768971443, 0.0401025265455246, -0.10381575673818588, -0.10015562176704407 ]
null
null
transformers
GPT-J 6B finetuned on SCP articles. Very experimental.
{}
text-generation
PhilSad/GPTJ2B-SCP
[ "transformers", "pytorch", "gptj", "text-generation", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gptj #text-generation #autotrain_compatible #endpoints_compatible #region-us
GPT-J 6B finetuned on SCP articles. Very experimental.
[]
[ "TAGS\n#transformers #pytorch #gptj #text-generation #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ 38 ]
[ "passage: TAGS\n#transformers #pytorch #gptj #text-generation #autotrain_compatible #endpoints_compatible #region-us \n" ]
[ -0.03151220455765724, 0.03707336261868477, -0.007071870379149914, 0.0017476322827860713, 0.17550218105316162, 0.041412875056266785, 0.06822680681943893, 0.15192539989948273, -0.0028824196197092533, -0.03256594389677048, 0.15555061399936676, 0.23781636357307434, -0.024892881512641907, 0.1473873406648636, -0.049011047929525375, -0.2775580585002899, 0.06584495306015015, 0.09494838118553162, -0.013777283951640129, 0.11993943899869919, 0.07632021605968475, -0.047684215009212494, 0.08393528312444687, -0.026972385123372078, -0.19710978865623474, 0.015898795798420906, 0.017967719584703445, -0.12383844703435898, 0.09878624230623245, 0.06138170510530472, 0.08247498422861099, 0.02209017239511013, -0.08755389600992203, -0.11167534440755844, 0.031160524114966393, 0.017747657373547554, -0.06584755331277847, 0.04502895846962929, 0.08679348230361938, -0.09504932165145874, 0.07202108204364777, 0.043690115213394165, -0.027658484876155853, 0.04201428219676018, -0.14984412491321564, -0.12746208906173706, -0.01851898431777954, 0.01862230896949768, 0.05458783358335495, 0.08329953253269196, -0.011250312440097332, 0.1298505961894989, -0.10167486220598221, 0.11950743943452835, 0.15904280543327332, -0.29614710807800293, -0.001627553254365921, 0.10248466581106186, 0.04741263389587402, 0.023154553025960922, -0.0020266633946448565, 0.04751226305961609, 0.01063463930040598, 0.01710410788655281, 0.00028000841848552227, -0.08625026047229767, -0.12598443031311035, 0.045534174889326096, -0.08142338693141937, -0.06148264557123184, 0.2619574964046478, -0.08855904638767242, 0.07061000913381577, -0.02506810799241066, -0.0929417610168457, -0.014947259798645973, -0.026307551190257072, 0.01530848816037178, -0.0707181841135025, 0.07557594776153564, -0.000564271118491888, -0.0453343503177166, -0.12541106343269348, -0.011339031159877777, -0.1798100769519806, 0.1976306140422821, 0.037942033261060715, 0.051710497587919235, -0.19175775349140167, 0.11477487534284592, -0.031865473836660385, -0.09600309282541275, 0.02040717750787735, -0.10116102546453476, 0.059744108468294144, -0.023178568109869957, -0.05489618331193924, -0.07644879817962646, 0.09123460203409195, 0.1268267184495926, 0.05813661217689514, 0.020550431683659554, 0.011739281006157398, 0.08003080636262894, 0.03749508783221245, 0.1008874922990799, 0.000465858553070575, -0.023850098252296448, 0.06882618367671967, -0.15177954733371735, -0.020143602043390274, -0.05883725360035896, -0.13321663439273834, -0.05151035636663437, 0.03850160166621208, 0.09309699386358261, 0.01724337786436081, 0.10648662596940994, -0.03549748286604881, -0.018040094524621964, 0.11053623259067535, -0.05922706425189972, -0.006570661440491676, -0.030643735080957413, 0.03125409036874771, 0.12799637019634247, -0.006093597039580345, -0.0010609433520585299, -0.11617495119571686, 0.10182081907987595, -0.060071662068367004, -0.0010186312720179558, -0.03463878110051155, -0.0376923531293869, 0.02057117037475109, -0.0767161026597023, 0.03845762088894844, -0.16579559445381165, -0.16603033244609833, 0.01161336712539196, 0.001375691732391715, -0.012137191370129585, -0.05322078987956047, -0.02226819097995758, -0.018987197428941727, 0.03858328238129616, -0.06768464297056198, -0.015443463809788227, -0.0640794038772583, 0.11859317868947983, -0.020377477630972862, 0.07902072370052338, -0.11424156278371811, 0.08009770512580872, -0.11536715179681778, -0.03842497989535332, -0.09913934767246246, 0.051218610256910324, -0.035355303436517715, 0.15356910228729248, 0.0005610828520730138, -0.028039881959557533, -0.049313418567180634, 
0.044323161244392395, -0.04550134390592575, 0.17294104397296906, -0.05977828800678253, -0.13170675933361053, 0.2849401831626892, -0.06919427961111069, -0.1379150003194809, 0.09073033183813095, 0.02237512916326523, 0.05119442939758301, 0.10456212610006332, 0.16592779755592346, 0.06751891225576401, -0.0007788332877680659, 0.0877680629491806, 0.10701018571853638, -0.1288803368806839, -0.10709404945373535, -0.0020441405940800905, -0.01743694394826889, -0.13654553890228271, 0.07253856211900711, 0.06880289316177368, 0.1082848608493805, -0.04964519664645195, -0.013115920126438141, -0.04249207302927971, -0.0026739821769297123, 0.07932359725236893, 0.03825939819216728, 0.12198621779680252, -0.06681143492460251, -0.02492417022585869, -0.03233812376856804, 0.005398034118115902, -0.004145813174545765, 0.022859416902065277, -0.018361076712608337, 0.10494852066040039, -0.05143984034657478, 0.07253605127334595, -0.15968003869056702, -0.07916484773159027, 0.006278707645833492, 0.08878109604120255, -0.009359870105981827, 0.11318203061819077, 0.06251663714647293, -0.034212902188301086, -0.001812968635931611, -0.012723004445433617, 0.17079047858715057, -0.009916776791214943, -0.055602505803108215, -0.040433406829833984, 0.08011013269424438, -0.055242374539375305, -0.02686125971376896, -0.005838421173393726, 0.01527310535311699, 0.060781825333833694, 0.10973378270864487, -0.0008373140008188784, 0.040041111409664154, 0.0034737070091068745, 0.0330461785197258, -0.059878867119550705, -0.0027495657559484243, 0.08832237869501114, 0.008492691442370415, -0.027223972603678703, 0.20217745006084442, -0.14776653051376343, 0.26809659600257874, 0.19876869022846222, -0.2695048749446869, 0.004929766990244389, -0.014728586189448833, -0.02094225026667118, 0.012075221166014671, 0.03225378692150116, 0.010688871145248413, 0.06388229131698608, 0.00015890740905888379, 0.19102443754673004, -0.038347046822309494, -0.05962871387600899, 0.003270199988037348, -0.07255034893751144, 0.014554842375218868, 0.05893850326538086, 0.10247236490249634, -0.14644168317317963, 0.19831664860248566, 0.17376255989074707, 0.028010735288262367, 0.195783793926239, 0.0214996337890625, -0.0011843818938359618, 0.07653800398111343, -0.0032105506397783756, -0.030258944258093834, -0.07029304653406143, -0.21548305451869965, -0.03496744856238365, 0.0923917219042778, 0.029650313779711723, 0.09876517206430435, -0.1213965192437172, -0.047838255763053894, -0.00962784606963396, -0.007615337148308754, 0.0033828250598162413, 0.12740439176559448, 0.0403982512652874, 0.08968637138605118, -0.014331508427858353, 0.025052199140191078, 0.11064980924129486, 0.037151701748371124, -0.06392659991979599, 0.1835697442293167, -0.1384696364402771, -0.3695679306983948, -0.16000811755657196, -0.16982266306877136, -0.03623782470822334, 0.06372473388910294, 0.13562093675136566, -0.13306501507759094, -0.04562617093324661, 0.06078742817044258, 0.06974658370018005, -0.045037753880023956, 0.033245839178562164, -0.08232969045639038, 0.041240040212869644, -0.08612990379333496, -0.07487888634204865, -0.05983339995145798, -0.01060960441827774, -0.0646623969078064, 0.167014017701149, -0.11750119179487228, 0.07633187621831894, 0.14863160252571106, 0.02449009194970131, 0.0753343254327774, -0.026104863733053207, 0.2013784795999527, -0.09973248839378357, -0.012085337191820145, 0.20709019899368286, -0.018807461485266685, 0.0738348513841629, 0.0982021614909172, 0.0017653228715062141, -0.07931460440158844, 0.026727963238954544, -0.05352002754807472, -0.09800251573324203, -0.19859269261360168, 
-0.12201059609651566, -0.1321074515581131, 0.0694907084107399, 0.05472753569483757, 0.07535327970981598, 0.15709616243839264, 0.10762650519609451, -0.023958612233400345, 0.06659785658121109, -0.00659692008048296, 0.09210693836212158, 0.20091088116168976, -0.017185727134346962, 0.13523106276988983, -0.07075963914394379, -0.13715381920337677, 0.10246718674898148, 0.057375311851501465, 0.12915703654289246, 0.04318271204829216, 0.039763204753398895, 0.023587986826896667, 0.1051025390625, 0.15832018852233887, 0.11873368173837662, 0.016528356820344925, -0.026332775130867958, -0.018474319949746132, -0.013993718661367893, -0.06207828223705292, 0.030055269598960876, 0.028660649433732033, -0.16685327887535095, -0.03271558880805969, -0.1414652168750763, 0.08460081368684769, 0.07361278682947159, 0.041430406272411346, -0.20731081068515778, 0.006837774068117142, 0.07433027029037476, -0.012659668922424316, -0.12468031793832779, 0.058678001165390015, -0.03402986377477646, -0.15183068811893463, 0.06273781508207321, -0.044653963297605515, 0.1136665940284729, -0.07415879517793655, 0.07613469660282135, -0.007263060659170151, -0.07172515988349915, 0.025071771815419197, 0.1364073008298874, -0.3211840093135834, 0.1877652108669281, 0.0003780387341976166, -0.0384247824549675, -0.11004525423049927, 0.009285520762205124, 0.016016216948628426, 0.15834660828113556, 0.07735977321863174, 0.009140821173787117, -0.00551434513181448, -0.14697468280792236, -0.011442545801401138, 0.035267382860183716, 0.10653096437454224, -0.028471242636442184, -0.023476427420973778, -0.03270047530531883, -0.02529638633131981, -0.05248558521270752, -0.044984158128499985, 0.011750434525310993, -0.17288997769355774, 0.10549255460500717, 0.03812883421778679, 0.106228306889534, 0.0077005596831440926, -0.00907888449728489, -0.1003204733133316, 0.2347191572189331, -0.03026527538895607, -0.11105775833129883, -0.10677628219127655, -0.08168819546699524, 0.05499221757054329, -0.0974348708987236, 0.06041272357106209, -0.09334225952625275, -0.002755561377853155, -0.05052782967686653, -0.20742133259773254, 0.11286700516939163, -0.11172983795404434, -0.032591063529253006, -0.046926349401474, 0.14728420972824097, -0.08641737699508667, -0.010398102924227715, 0.01146936696022749, 0.012436644174158573, -0.14456813037395477, -0.1016213670372963, -0.0019087573746219277, 0.003203726839274168, 0.03194073587656021, 0.02161046490073204, -0.054708901792764664, 0.002008966635912657, -0.01267519872635603, -0.024786293506622314, 0.27424585819244385, 0.16491465270519257, -0.045934975147247314, 0.16771647334098816, 0.1260625720024109, -0.06032343581318855, -0.309770792722702, -0.11425772309303284, -0.11045655608177185, -0.04398525506258011, -0.09500124305486679, -0.18565696477890015, 0.10231947898864746, 0.025354726240038872, -0.01759127900004387, 0.131089448928833, -0.21440641582012177, -0.08033140748739243, 0.15508152544498444, -0.0059756203554570675, 0.37745657563209534, -0.12287385016679764, -0.08928390592336655, -0.04647797718644142, -0.20473800599575043, 0.12560726702213287, -0.014341633766889572, 0.10569804161787033, -0.04033660888671875, 0.10667499154806137, 0.04124046117067337, -0.06189761683344841, 0.0827556774020195, 0.018028048798441887, -0.0027729833964258432, -0.11908969283103943, -0.005427781958132982, 0.037330836057662964, 0.0058838604018092155, 0.038162510842084885, -0.0211408119648695, 0.025556031614542007, -0.13596314191818237, -0.04485263675451279, -0.11071023344993591, 0.04395507648587227, 0.055599145591259, -0.06605301052331924, 
0.008166026324033737, -0.05368953198194504, -0.02623230591416359, 0.001955678453668952, 0.18859443068504333, -0.032264675945043564, 0.16248038411140442, 0.017915822565555573, 0.061699241399765015, -0.2131144255399704, 0.0001550401939311996, -0.06295984983444214, -0.06551593542098999, 0.07972439378499985, -0.10897208750247955, 0.06162695586681366, 0.09493593871593475, -0.0550190731883049, 0.05816870555281639, 0.10558537393808365, 0.015814559534192085, -0.014361199922859669, 0.1391412913799286, -0.27428776025772095, 0.07481314241886139, -0.07797183096408844, -0.013885238207876682, 0.11706466972827911, 0.0914216935634613, 0.1399575173854828, 0.043782178312540054, -0.06813227385282516, -0.006982195191085339, -0.0031752721406519413, -0.030768726021051407, 0.07999695837497711, 0.02331979013979435, 0.020107470452785492, -0.163397878408432, 0.02372591756284237, 0.014981063082814217, -0.1348113864660263, -0.013427863828837872, 0.15431173145771027, -0.16395573318004608, -0.12808829545974731, 0.008915727958083153, 0.09718075394630432, -0.13786260783672333, -0.02174106240272522, -0.05153400078415871, -0.0995328277349472, 0.08156455308198929, 0.11245695501565933, 0.08718431740999222, 0.09184054285287857, -0.03695494309067726, -0.017955588176846504, -0.041809603571891785, -0.02114316262304783, 0.02141645736992359, 0.05969555675983429, -0.07898231595754623, 0.05391881987452507, -0.03138475865125656, 0.15629547834396362, -0.09316448122262955, -0.06033721938729286, -0.1587001532316208, 0.032785601913928986, -0.08087195456027985, -0.08479191362857819, -0.11347627639770508, -0.04557791352272034, 0.0002567548071965575, -0.037613075226545334, -0.01753467321395874, -0.04149655997753143, -0.12441752105951309, 0.017146142199635506, -0.03829406946897507, 0.010931672528386116, -0.05811680108308792, -0.008663719519972801, 0.09447751194238663, -0.027902944013476372, 0.11650321632623672, 0.11688197404146194, -0.08923262357711792, 0.10900924354791641, -0.11480636894702911, -0.11036177724599838, 0.11488474160432816, 0.01133766584098339, 0.05226748436689377, 0.09664200246334076, 0.0452880859375, 0.06880165636539459, 0.030284151434898376, 0.06122501567006111, 0.020336538553237915, -0.12300695478916168, 0.048408228904008865, -0.03444572910666466, -0.1429934799671173, -0.05953800678253174, -0.02918248437345028, 0.05130946636199951, 0.024396706372499466, 0.11290157586336136, -0.04501543939113617, 0.09416015446186066, -0.028185313567519188, -0.001477750949561596, -0.004872435703873634, -0.21893969178199768, -0.0504576601088047, -0.0729994997382164, 0.018002768978476524, 0.024589112028479576, 0.2580176889896393, 0.035799842327833176, 0.016602937132120132, 0.01481608022004366, 0.0902845710515976, 0.04691626876592636, -0.014098089188337326, 0.19049955904483795, 0.1093553826212883, -0.043861083686351776, -0.10956389456987381, 0.10990467667579651, 0.01068804319947958, -0.06239248067140579, 0.10300081968307495, -0.020474513992667198, 0.0003835942188743502, 0.07576970010995865, -0.04989642649888992, 0.022763792425394058, -0.11206382513046265, -0.1767910271883011, -0.034227415919303894, 0.07417725026607513, -0.0006204710225574672, 0.05491982772946358, 0.12787184119224548, -0.008543367497622967, 0.034753695130348206, -0.002907110145315528, -0.044527314603328705, -0.17147091031074524, -0.14621102809906006, -0.08630190789699554, -0.14793746173381805, 0.0190786924213171, -0.0839877501130104, 0.03924823924899101, 0.05102156847715378, 0.051837824285030365, -0.05538314953446388, 0.09522181749343872, 0.10324229300022125, -0.0988502949476242, 
0.049616921693086624, -0.0501939058303833, 0.030683284625411034, -0.0029872003942728043, -0.0034636224154382944, -0.13122345507144928, -0.005105981603264809, -0.011908148415386677, 0.04666009545326233, -0.07080256193876266, 0.01316530629992485, -0.15000644326210022, -0.10332248359918594, -0.05356411263346672, 0.08138561993837357, -0.053934432566165924, 0.0994570404291153, -0.0014823476085439324, -0.010538178496062756, 0.04940152168273926, 0.19277051091194153, -0.044229596853256226, -0.08742257207632065, -0.036329664289951324, 0.1808408498764038, 0.07235650718212128, 0.10242252051830292, -0.013352932408452034, 0.02063535712659359, -0.06923840939998627, 0.3531532883644104, 0.26858821511268616, -0.018447961658239365, 0.03309110552072525, 0.038829293102025986, 0.033893972635269165, 0.12439677864313126, 0.1530785858631134, 0.10072127729654312, 0.2654051184654236, -0.07902508229017258, -0.03626459091901779, -0.011937260627746582, -0.019930606707930565, -0.12488619238138199, 0.04357904568314552, 0.04491405934095383, -0.07011903077363968, -0.033207207918167114, 0.0878094732761383, -0.2121572494506836, 0.21441073715686798, -0.06745455414056778, -0.15629106760025024, -0.05864774435758591, -0.005939108319580555, 0.1302892118692398, -0.004313112702220678, 0.07851271331310272, 0.00010306494368705899, -0.08810977637767792, 0.09500984102487564, 0.01990063488483429, -0.23626677691936493, -0.02694002166390419, 0.06947129219770432, -0.06700648367404938, -0.018467217683792114, -0.026268189772963524, 0.029670950025320053, 0.05944996699690819, 0.036614418029785156, -0.04607617110013962, 0.04073259234428406, -0.023219695314764977, -0.05424519255757332, 0.016090352088212967, 0.04292331263422966, 0.016028806567192078, -0.1393764466047287, 0.049013204872608185, -0.14424192905426025, 0.04227766394615173, -0.055631183087825775, 0.00035895337350666523, -0.0105360122397542, 0.018617257475852966, -0.046715766191482544, 0.05485781654715538, 0.07827278226613998, 0.008474894799292088, -0.009075884707272053, -0.07942800223827362, 0.0000073377573244215455, 0.007906041108071804, -0.07692286372184753, -0.12948128581047058, -0.09903234243392944, -0.11407486349344254, 0.08519060164690018, 0.00946003757417202, -0.12884648144245148, 0.008852420374751091, -0.11711031198501587, 0.04678783193230629, -0.16973750293254852, 0.1007298156619072, 0.05157482624053955, 0.02837679535150528, 0.00024361057148780674, -0.05811697989702225, 0.040675777941942215, 0.06669099628925323, -0.1293530911207199, -0.10037686675786972 ]
null
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # output_gptneo125-2 This model is a fine-tuned version of [EleutherAI/gpt-neo-125M](https://huggingface.co/EleutherAI/gpt-neo-125M) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - distributed_type: tpu - num_devices: 8 - total_train_batch_size: 64 - total_eval_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 ### Training results ### Framework versions - Transformers 4.17.0.dev0 - Pytorch 1.10.0+cu102 - Datasets 1.18.3 - Tokenizers 0.11.0
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "model-index": [{"name": "output_gptneo125-2", "results": []}]}
text-generation
PhilSad/gpt-scp-neo-125M
[ "transformers", "pytorch", "tensorboard", "gpt_neo", "text-generation", "generated_from_trainer", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #tensorboard #gpt_neo #text-generation #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
# output_gptneo125-2 This model is a fine-tuned version of EleutherAI/gpt-neo-125M on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - distributed_type: tpu - num_devices: 8 - total_train_batch_size: 64 - total_eval_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 ### Training results ### Framework versions - Transformers 4.17.0.dev0 - Pytorch 1.10.0+cu102 - Datasets 1.18.3 - Tokenizers 0.11.0
[ "# output_gptneo125-2\n\nThis model is a fine-tuned version of EleutherAI/gpt-neo-125M on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: tpu\n- num_devices: 8\n- total_train_batch_size: 64\n- total_eval_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.0+cu102\n- Datasets 1.18.3\n- Tokenizers 0.11.0" ]
[ "TAGS\n#transformers #pytorch #tensorboard #gpt_neo #text-generation #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "# output_gptneo125-2\n\nThis model is a fine-tuned version of EleutherAI/gpt-neo-125M on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: tpu\n- num_devices: 8\n- total_train_batch_size: 64\n- total_eval_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.0+cu102\n- Datasets 1.18.3\n- Tokenizers 0.11.0" ]
[ 58, 38, 6, 12, 8, 3, 129, 4, 38 ]
[ "passage: TAGS\n#transformers #pytorch #tensorboard #gpt_neo #text-generation #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n# output_gptneo125-2\n\nThis model is a fine-tuned version of EleutherAI/gpt-neo-125M on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: tpu\n- num_devices: 8\n- total_train_batch_size: 64\n- total_eval_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.17.0.dev0\n- Pytorch 1.10.0+cu102\n- Datasets 1.18.3\n- Tokenizers 0.11.0" ]
[ -0.05968455225229263, 0.09229840338230133, -0.0021747429855167866, 0.09624706208705902, 0.1626020222902298, 0.04337846115231514, 0.11744306981563568, 0.129108726978302, -0.12456613779067993, 0.05522603914141655, 0.07559826970100403, 0.09445815533399582, 0.05041096359491348, 0.10583119839429855, -0.025486765429377556, -0.23180346190929413, 0.0036400267854332924, 0.0181814543902874, -0.04305397719144821, 0.09076906740665436, 0.09003065526485443, -0.11320361495018005, 0.08549918234348297, 0.033406443893909454, -0.16005675494670868, 0.03164790943264961, -0.02512914128601551, -0.06043484807014465, 0.11642803251743317, 0.034026604145765305, 0.09000007808208466, -0.028434468433260918, 0.12358096987009048, -0.18368396162986755, -0.011631155386567116, 0.08033343404531479, 0.017589572817087173, 0.07192271947860718, 0.04841247573494911, 0.04735510051250458, 0.11187377572059631, -0.13088145852088928, 0.1007264256477356, 0.004970336798578501, -0.09737524390220642, -0.16493171453475952, -0.07750596851110458, -0.00016215175855904818, 0.07218543440103531, 0.11655077338218689, 0.01048241090029478, 0.18909452855587006, -0.038014888763427734, 0.07607715576887131, 0.1723232865333557, -0.276222825050354, -0.05869010463356972, 0.04555891826748848, 0.06837362051010132, 0.07338178157806396, -0.09399272501468658, -0.014733916148543358, 0.03228704631328583, 0.06139807030558586, 0.12453843653202057, -0.019461579620838165, -0.06021573022007942, -0.007738153915852308, -0.1169353723526001, -0.03880857676267624, 0.09920172393321991, 0.026966474950313568, -0.04447029158473015, -0.09514104574918747, -0.07827302068471909, -0.1160733699798584, -0.012992077507078648, -0.04801756888628006, 0.04208028316497803, -0.02949458733201027, -0.05555693432688713, -0.05399087443947792, -0.06800059229135513, -0.051400087773799896, -0.033484552055597305, 0.13537651300430298, 0.03569699451327324, 0.02194063551723957, -0.012754363939166069, 0.09083957970142365, -0.05995195358991623, -0.12553845345973969, 0.0013260326813906431, 0.009102143347263336, -0.06088479235768318, -0.04907400533556938, -0.045221466571092606, -0.07200459390878677, -0.0181581974029541, 0.14435118436813354, -0.04265761747956276, 0.064784936606884, 0.020519739016890526, -0.01566881127655506, -0.036399271339178085, 0.17511701583862305, -0.03478435054421425, -0.06983712315559387, 0.011315224692225456, 0.08981466293334961, 0.03864394128322601, -0.02123473957180977, -0.09356555342674255, -0.017688682302832603, 0.07830384373664856, 0.013865317218005657, -0.04780671000480652, 0.040161170065402985, -0.046127986162900925, -0.029424121603369713, 0.08891315758228302, -0.09921618551015854, 0.04582533240318298, -0.042060717940330505, -0.07706018537282944, -0.04522738605737686, 0.017533905804157257, 0.026923727244138718, -0.03243628144264221, 0.11570723354816437, -0.08829586207866669, -0.0018995034042745829, -0.0962533950805664, -0.09899455308914185, -0.008361611515283585, -0.0473967045545578, 0.014597795903682709, -0.08717283606529236, -0.19750048220157623, -0.026251669973134995, 0.04922971874475479, -0.062443800270557404, -0.03952192887663841, -0.008100835606455803, -0.052467912435531616, 0.028448818251490593, -0.010431640781462193, 0.12303309887647629, -0.056955285370349884, 0.0611785352230072, 0.046704791486263275, 0.02563300170004368, 0.008365416899323463, 0.04097066819667816, -0.08516647666692734, 0.0054662893526256084, -0.13144709169864655, 0.0507386140525341, -0.04578835517168045, 0.025889474898576736, -0.10720224678516388, -0.11160197108983994, 0.014086169190704823, 
-0.017350969836115837, 0.08661288022994995, 0.07622900605201721, -0.13614903390407562, -0.02159058302640915, 0.1053519919514656, -0.07554621994495392, -0.09219815582036972, 0.10597432404756546, -0.03639683872461319, -0.010539117269217968, 0.06216386333107948, 0.12141366302967072, 0.06285995990037918, -0.1188579574227333, -0.03402113541960716, -0.022256212309002876, 0.05843885615468025, -0.004321236163377762, 0.07611635327339172, 0.009627302177250385, 0.07706856727600098, -0.011935708113014698, -0.029438801109790802, 0.001513612107373774, -0.07636432349681854, -0.0736246109008789, -0.05460691079497337, -0.07419531792402267, 0.013103467412292957, 0.024861352518200874, 0.03873048722743988, -0.0709831565618515, -0.12493578344583511, 0.10260104387998581, 0.11413921415805817, -0.060333434492349625, 0.02264649234712124, -0.09298566728830338, 0.01574656181037426, -0.0900484025478363, -0.026745570823550224, -0.1970876306295395, -0.08713661134243011, 0.04809894040226936, -0.06035245209932327, 0.035902395844459534, 0.02388588711619377, 0.06282059103250504, 0.09122511744499207, -0.05030473694205284, -0.026121418923139572, -0.09129208326339722, -0.004741006530821323, -0.11371257901191711, -0.15185660123825073, -0.0404842235147953, -0.026126334443688393, 0.0677512139081955, -0.2023037075996399, -0.00035836719325743616, -0.01980728469789028, 0.11946463584899902, 0.024566825479269028, -0.03600899502635002, -0.037142910063266754, 0.05601765215396881, -0.026586396619677544, -0.11175017803907394, 0.03137356787919998, 0.009886471554636955, -0.0953759253025055, -0.06630784273147583, -0.1652715802192688, 0.10932942479848862, 0.09818273037672043, 0.020179951563477516, -0.06314278393983841, 0.024353524670004845, -0.048571497201919556, -0.058807823807001114, -0.051320068538188934, -0.025207852944731712, 0.2353082001209259, 0.002499418333172798, 0.1364668607711792, -0.05889805033802986, -0.06463050842285156, 0.013054716400802135, 0.023232843726873398, -0.003586312523111701, 0.04213780164718628, 0.06121329218149185, -0.1405726969242096, 0.093851737678051, 0.08394773304462433, -0.06755553930997849, 0.12286268919706345, -0.021856682375073433, -0.07011722028255463, -0.03184971213340759, -0.028119651600718498, 0.012190088629722595, 0.090059794485569, -0.09086673706769943, 0.01313540618866682, 0.034779421985149384, 0.03537170961499214, 0.024380424991250038, -0.1714070588350296, 0.003842464415356517, 0.04642925038933754, -0.0336625874042511, 0.014273428358137608, -0.018341775983572006, -0.003113158978521824, 0.07710853219032288, 0.018612248823046684, -0.015893783420324326, 0.028001654893159866, -0.006336139515042305, -0.055662643164396286, 0.17397183179855347, -0.11703167855739594, -0.13553734123706818, -0.11855906993150711, -0.016605867072939873, -0.06053796410560608, 0.004434088710695505, 0.038732219487428665, -0.06921455264091492, -0.08561841398477554, -0.0803079605102539, -0.0004184895660728216, -0.03385142609477043, 0.010700184851884842, 0.07334724068641663, -0.036338966339826584, 0.08059182018041611, -0.1457083374261856, -0.008498686365783215, -0.0011335405288264155, -0.07518856972455978, 0.014132124371826649, 0.08475381880998611, 0.11259288340806961, 0.09042610973119736, -0.0389987975358963, -0.009763640351593494, -0.015404281206429005, 0.27860012650489807, -0.08684945106506348, -0.015234795399010181, 0.1621340960264206, 0.006254558917135, 0.07255363464355469, 0.09782661497592926, 0.044353511184453964, -0.09747933596372604, 0.034415725618600845, 0.058586254715919495, -0.017111660912632942, -0.2303440421819687, 
-0.026331426575779915, -0.032402172684669495, -0.10233820974826813, 0.10586480796337128, 0.04499330744147301, 0.018567468971014023, 0.04954106733202934, -0.0020945887081325054, 0.11609445512294769, -0.04293332248926163, 0.09980858862400055, 0.14389972388744354, 0.07746825367212296, 0.11182332783937454, -0.022017033770680428, -0.05543512850999832, 0.06648798286914825, 0.007493762299418449, 0.22937217354774475, -0.014823242090642452, 0.15510088205337524, 0.015084940008819103, 0.08653953671455383, -0.025024933740496635, 0.05323787033557892, -0.020112240687012672, -0.009234032593667507, -0.02252565696835518, -0.04920920357108116, -0.02635025791823864, 0.021743450313806534, -0.044425562024116516, 0.0442078560590744, -0.052426233887672424, 0.040434304624795914, 0.03253457322716713, 0.2086133509874344, 0.002682487014681101, -0.3400888442993164, -0.06960733234882355, 0.006424192804843187, -0.029627302661538124, -0.07404901087284088, -0.01904628984630108, 0.10669346898794174, -0.10654930025339127, 0.06371695548295975, -0.0823892205953598, 0.07972881942987442, -0.05253104120492935, 0.022732937708497047, 0.11180879920721054, 0.13735495507717133, -0.003598314244300127, 0.069636270403862, -0.22805200517177582, 0.17088480293750763, 0.02878209948539734, 0.1291705071926117, -0.07818228751420975, 0.05896591767668724, 0.004207725170999765, 0.05517977848649025, 0.05847138166427612, -0.01783367246389389, -0.049970850348472595, -0.16351400315761566, -0.07484418153762817, 0.03980126976966858, 0.09273011237382889, -0.02497665211558342, 0.09431082010269165, -0.05401459336280823, 0.01589706912636757, 0.060375068336725235, -0.04173598811030388, -0.13223475217819214, -0.13465021550655365, 0.034063663333654404, 0.014800033532083035, -0.03255428001284599, -0.05488746240735054, -0.09859916567802429, -0.04796411469578743, 0.18186557292938232, 0.055945608764886856, -0.054717034101486206, -0.13613390922546387, 0.08637367933988571, 0.12376219034194946, -0.0647907480597496, 0.04713524878025055, 0.0256696455180645, 0.10093943774700165, 0.03139687329530716, -0.06784047186374664, 0.0724964365363121, -0.06905651837587357, -0.1699609011411667, -0.056167375296354294, 0.06536358594894409, 0.018278464674949646, 0.04892560839653015, -0.005451555829495192, 0.033883530646562576, -0.018898798152804375, -0.09871241450309753, -0.00048829335719347, 0.032037943601608276, 0.13229940831661224, 0.013063263148069382, -0.022015739232301712, 0.047624535858631134, -0.026279697194695473, -0.032315418124198914, 0.10627997666597366, 0.23345962166786194, -0.056295305490493774, 0.01337555143982172, 0.05488113686442375, -0.06953801959753036, -0.13005073368549347, 0.02455942891538143, 0.09279129654169083, 0.024401213973760605, 0.07305910438299179, -0.15894238650798798, 0.10289784520864487, 0.10794618725776672, -0.01394421886652708, 0.06740857660770416, -0.3259437680244446, -0.13353031873703003, 0.030935123562812805, 0.14015519618988037, 0.05696212872862816, -0.14793314039707184, -0.028184618800878525, -0.044236067682504654, -0.14100046455860138, 0.11167746782302856, -0.048707690089941025, 0.11839185655117035, -0.01723092794418335, 0.08022253960371017, 0.016029130667448044, -0.04605381563305855, 0.1500163972377777, -0.001157446182332933, 0.09121488779783249, -0.06655082106590271, 0.029348550364375114, 0.0981135293841362, -0.05413026735186577, 0.0316079743206501, -0.06610284745693207, 0.05093863606452942, -0.1243586391210556, -0.03333068639039993, -0.04091203585267067, 0.03029816597700119, -0.03899943083524704, -0.08116499334573746, -0.03347082436084747, 
0.03365364298224449, 0.06405387818813324, -0.05034417659044266, 0.0948425754904747, 0.05473790317773819, 0.07468659430742264, 0.0800442323088646, 0.07213693112134933, -0.009870908223092556, -0.11496367305517197, 0.0065531074069440365, 0.0035433475859463215, 0.06558380275964737, -0.11207190155982971, 0.03360278159379959, 0.13885779678821564, 0.004886937793344259, 0.13477185368537903, 0.04573607072234154, -0.050356026738882065, 0.01703582890331745, 0.0425778366625309, -0.1121363565325737, -0.10462460666894913, -0.002677924232557416, -0.02540947124361992, -0.10021308064460754, 0.008268081583082676, 0.11969605088233948, -0.055531371384859085, -0.016039365902543068, 0.0006586093222722411, 0.0043280646204948425, -0.01738370768725872, 0.20024709403514862, 0.001730857533402741, 0.045691557228565216, -0.08067482709884644, 0.09441500157117844, 0.08218595385551453, -0.07919170707464218, 0.032128654420375824, 0.050790004432201385, -0.0810403898358345, -0.009314904920756817, 0.07656693458557129, 0.15933215618133545, -0.05813649296760559, -0.03454142436385155, -0.0970960482954979, -0.061307989060878754, 0.046614423394203186, 0.04103758558630943, 0.054428938776254654, -0.015334094874560833, -0.060459353029727936, 0.03811904042959213, -0.13136471807956696, 0.08969330042600632, 0.02596794255077839, 0.08810849487781525, -0.16351813077926636, 0.08442497253417969, 0.014475813135504723, 0.01854531839489937, -0.010095793753862381, 0.054349709302186966, -0.07530644536018372, -0.033459726721048355, -0.09402551501989365, 0.0066059911623597145, -0.009799889288842678, 0.009497356601059437, -0.013832725584506989, -0.05127662420272827, -0.05499231442809105, 0.04528981074690819, -0.05405629426240921, -0.07417698204517365, 0.026035232469439507, 0.03306220844388008, -0.1162182092666626, -0.02488529123365879, 0.004705753643065691, -0.08340305835008621, 0.0745321661233902, 0.029600288718938828, 0.031797364354133606, 0.013992339372634888, -0.04245083034038544, 0.018098359927535057, 0.06000285595655441, 0.03601856902241707, 0.07154196500778198, -0.07787293940782547, -0.01092428620904684, -0.016537247225642204, 0.037959493696689606, -0.0005856383941136301, 0.06335769593715668, -0.13743016123771667, -0.01563998870551586, -0.05380529165267944, -0.05842971056699753, -0.050207797437906265, 0.02663569524884224, 0.11365778744220734, 0.027799168601632118, 0.19391457736492157, -0.062027592211961746, 0.0292703527957201, -0.20070983469486237, -0.021925417706370354, 0.014882437884807587, -0.05711774155497551, -0.07000373303890228, -0.04000416770577431, 0.07187224924564362, -0.06124494597315788, 0.12995508313179016, 0.010080534033477306, 0.12164217978715897, 0.03529996797442436, -0.018347997218370438, -0.0343923345208168, -0.013372435234487057, 0.18117208778858185, 0.04567928984761238, -0.017909212037920952, 0.0930333063006401, -0.0045191566459834576, 0.08940669894218445, 0.06289183348417282, 0.14452028274536133, 0.12343113124370575, 0.022330664098262787, 0.07245109975337982, 0.04075593501329422, -0.10085267573595047, -0.23664017021656036, 0.051064733415842056, -0.03503050655126572, 0.12277344614267349, -0.034817639738321304, 0.1436321884393692, 0.09057231992483139, -0.15726076066493988, 0.038731735199689865, -0.0542168915271759, -0.09613649547100067, -0.08945520222187042, -0.050852805376052856, -0.08424120396375656, -0.11380024254322052, 0.017898229882121086, -0.12607209384441376, 0.0628579705953598, 0.10435361415147781, 0.00391158415004611, -0.006969513837248087, 0.1641092449426651, -0.013319804333150387, 0.0013770884834229946, 
0.006655157543718815, 0.01593215949833393, -0.0029208508785814047, -0.027683768421411514, -0.06259866058826447, 0.007796735968440771, 0.02649206481873989, 0.08723665773868561, -0.044975437223911285, -0.008743193931877613, 0.041913777589797974, 0.0008301420602947474, -0.06692565977573395, 0.014363288879394531, 0.02279815822839737, 0.031149547547101974, 0.044350042939186096, 0.029401160776615143, 0.02322188951075077, -0.03618805855512619, 0.2505400478839874, -0.06727880239486694, -0.09380659461021423, -0.10820101201534271, 0.1691266894340515, 0.053556304425001144, -0.004893941339105368, 0.059377722442150116, -0.10570389032363892, -0.018017711117863655, 0.1461639255285263, 0.12506511807441711, -0.08312305808067322, -0.03364522382616997, 0.007426225580275059, -0.022040341049432755, -0.05002133175730705, 0.11053057760000229, 0.10782225430011749, 0.007616459857672453, -0.057000722736120224, -0.006684552878141403, -0.038454875349998474, -0.00043979991460219026, -0.05069500580430031, 0.05080873519182205, 0.02559046261012554, 0.029777012765407562, -0.012696146965026855, 0.035042230039834976, 0.03636949509382248, -0.16963234543800354, 0.03200625255703926, -0.17772085964679718, -0.17423996329307556, 0.013379212468862534, 0.09497184306383133, -0.035270173102617264, 0.07160621881484985, -0.006431459449231625, -0.015292834490537643, 0.06914247572422028, -0.02176114171743393, -0.056104134768247604, -0.07552997022867203, 0.08172063529491425, -0.07603103667497635, 0.22682584822177887, -0.006575095932930708, 0.08717852830886841, 0.11335734277963638, 0.03602764382958412, -0.12664470076560974, 0.05935436859726906, 0.050900042057037354, -0.07078168541193008, 0.039532098919153214, 0.131996288895607, -0.0410899817943573, 0.06478975713253021, 0.028021156787872314, -0.1264847218990326, -0.01976451277732849, -0.01337375957518816, 0.0038974261842668056, -0.043425701558589935, -0.04364398494362831, -0.07818245142698288, 0.1739763468503952, 0.19228670001029968, -0.013475351966917515, 0.010202043689787388, -0.05447796359658241, 0.0354740172624588, 0.054686009883880615, 0.1262238770723343, -0.056772150099277496, -0.2320891171693802, 0.024375129491090775, 0.05857644975185394, 0.020513804629445076, -0.19218581914901733, -0.11118283867835999, 0.04424723610281944, -0.0632314682006836, -0.0837177112698555, 0.09249942749738693, 0.046110205352306366, 0.01773133873939514, -0.04717765375971794, -0.1402752846479416, -0.0781223401427269, 0.1363665908575058, -0.16181232035160065, -0.07083988189697266 ]
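The record above documents a text-generation fine-tune of EleutherAI/gpt-neo-125M (repo id PhilSad/gpt-scp-neo-125M) but gives no usage snippet. Below is a minimal sketch, not part of the record, assuming the checkpoint loads with the standard transformers auto classes; the prompt and sampling settings are arbitrary placeholders.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PhilSad/gpt-scp-neo-125M"  # repo id taken from the record above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Once upon a time"  # arbitrary prompt; the card does not document a prompt format
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2-style tokenizers ship without a pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```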
null
null
transformers
# Traveller DiabloGPT Model
{"tags": ["conversational"]}
text-generation
PhilipTheGreat/DiabloGPT-small-Traveller
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
# Traveller DiabloGPT Model
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n" ]
[ 55 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n" ]
[ -0.0020731890108436346, 0.034266941249370575, -0.005774117540568113, 0.004248267505317926, 0.14393343031406403, 0.004326540976762772, 0.08896105736494064, 0.14543265104293823, -0.02302609197795391, 0.005462841596454382, 0.15414410829544067, 0.16123731434345245, -0.01616818644106388, 0.06624367088079453, -0.06798949837684631, -0.27222639322280884, 0.06252840161323547, 0.04033168777823448, -0.0042655146680772305, 0.1149275004863739, 0.0867869034409523, -0.07195974141359329, 0.08031850308179855, -0.010707243345677853, -0.14797940850257874, 0.027399519458413124, 0.027168817818164825, -0.11926408857107162, 0.1205340251326561, 0.04608513414859772, 0.08750468492507935, 0.018780281767249107, -0.08323448896408081, -0.13803936541080475, 0.03054818883538246, 0.03042440116405487, -0.06272798776626587, 0.06205490604043007, 0.04915359988808632, -0.08692146092653275, 0.11027362942695618, 0.05846209451556206, -0.021504774689674377, 0.04821402579545975, -0.173249289393425, -0.0497460775077343, -0.018249358981847763, -0.004099908750504255, 0.02475488930940628, 0.10208485275506973, -0.04358411207795143, 0.11388547718524933, -0.11227327585220337, 0.09600768983364105, 0.13744568824768066, -0.32189610600471497, -0.005977143067866564, 0.12905555963516235, 0.08356072753667831, 0.07893235981464386, -0.05513899773359299, 0.06879978626966476, 0.026153596118092537, 0.016823088750243187, 0.01882046088576317, -0.06101960316300392, -0.1321486234664917, 0.06851786375045776, -0.1029922217130661, -0.06102414429187775, 0.23576463758945465, -0.05459221452474594, 0.0850215032696724, -0.06047354266047478, -0.1171741634607315, -0.06511004269123077, -0.012023513205349445, 0.013670753687620163, -0.06820649653673172, 0.08349429816007614, 0.00736558623611927, -0.0910419449210167, -0.1503107100725174, -0.009757339023053646, -0.18359589576721191, 0.1544029712677002, 0.010258159600198269, 0.04417330399155617, -0.18897868692874908, 0.09543006867170334, -0.0019848651718348265, -0.09227047860622406, 0.0379180908203125, -0.09267305582761765, 0.025528281927108765, 0.02594062313437462, -0.07617612928152084, -0.04994349926710129, 0.07540341466665268, 0.10172723978757858, -0.03571772947907448, 0.007284308318048716, -0.01317865401506424, 0.08920414000749588, 0.050415460020303726, 0.0920315757393837, -0.024628832936286926, -0.029775982722640038, 0.0030826895963400602, -0.10573206096887589, 0.00843293871730566, -0.08909433335065842, -0.17530705034732819, -0.033279649913311005, 0.038942400366067886, 0.06327997148036957, 0.03506982699036598, 0.11513926833868027, -0.020643576979637146, -0.0004622370470315218, 0.044502582401037216, -0.04962502419948578, 0.01620616763830185, 0.015400436706840992, 0.024300534278154373, 0.09366749972105026, 0.00024095026310533285, 0.02871517464518547, -0.1108408123254776, 0.016502954065799713, -0.08566248416900635, -0.004433690570294857, -0.04066275432705879, -0.08080097287893295, 0.014394298195838928, -0.08052104711532593, -0.0014747381210327148, -0.16010569036006927, -0.0941496267914772, 0.006874208338558674, -0.008340749889612198, -0.04847903922200203, -0.06897178292274475, -0.028291959315538406, -0.039629727602005005, 0.07005634158849716, -0.07429031282663345, 0.012378060258924961, -0.0653066337108612, 0.09632273763418198, -0.04399619624018669, 0.09719543159008026, -0.1355123370885849, 0.08746109157800674, -0.10122440755367279, -0.013804182410240173, -0.05389239266514778, 0.07130532711744308, -0.004290519282221794, 0.07858801633119583, -0.003894622204825282, -0.026532486081123352, -0.10342510789632797, 
0.07591511309146881, -0.029038161039352417, 0.1919892430305481, -0.09368238598108292, -0.09697038680315018, 0.24372151494026184, -0.04206692799925804, -0.11992959678173065, 0.11316732317209244, 0.006276176776736975, 0.04031015560030937, 0.06636199355125427, 0.23529867827892303, -0.002608133712783456, -0.015463003888726234, 0.05858722701668739, 0.1285160332918167, -0.09163093566894531, -0.022528203204274178, 0.02665521204471588, -0.035680610686540604, -0.06949988007545471, 0.03786681219935417, 0.09224521368741989, 0.06898946315050125, -0.04217932000756264, -0.038546912372112274, -0.0338650681078434, 0.006019831635057926, 0.10229165107011795, -0.004237710032612085, 0.11900407075881958, -0.06430190056562424, -0.06083345785737038, -0.0039004783611744642, -0.007743727415800095, -0.025927990674972534, 0.03889498487114906, -0.020975790917873383, 0.13977456092834473, -0.020214391872286797, 0.050080306828022, -0.1659611612558365, -0.07867800444364548, -0.0276208333671093, 0.15121620893478394, 0.023065028712153435, 0.16074787080287933, 0.05593082308769226, -0.04575453698635101, 0.0020971912890672684, 0.015426015481352806, 0.14330679178237915, 0.002297409577295184, -0.08078709989786148, -0.045803289860486984, 0.06078236922621727, -0.07078943401575089, 0.02853701077401638, -0.06128615140914917, 0.02451801672577858, 0.0706944391131401, 0.11360251158475876, -0.012172793038189411, 0.04050501063466072, -0.007806553039699793, 0.005355477333068848, -0.08260910212993622, 0.006272532045841217, 0.09126780182123184, -0.007062161341309547, -0.05587177723646164, 0.23702523112297058, -0.21233628690242767, 0.19319729506969452, 0.20568932592868805, -0.2847190499305725, 0.017599696293473244, -0.08135107904672623, -0.04634881764650345, 0.0296412892639637, 0.034921593964099884, -0.07091964036226273, 0.11610180884599686, -0.03123846836388111, 0.1735609918832779, -0.04871121793985367, -0.033698420971632004, -0.02400193363428116, -0.05103839933872223, -0.023231912404298782, 0.07538758218288422, 0.07964178919792175, -0.10268472135066986, 0.2096041589975357, 0.21891969442367554, 0.029522806406021118, 0.20605546236038208, 0.020199809223413467, -0.01082384493201971, 0.06585343927145004, -0.016593173146247864, -0.07647227495908737, -0.05193748697638512, -0.25636377930641174, -0.04254206269979477, 0.08694203943014145, 0.05027908831834793, 0.11748719960451126, -0.11812889575958252, -0.03973527252674103, 0.0017026892164722085, -0.0044732606038451195, 0.010201654396951199, 0.10733214765787125, 0.0688849464058876, 0.13163818418979645, 0.0035053531173616648, -0.015395179390907288, 0.0933142676949501, 0.021278366446495056, -0.07218139618635178, 0.16777180135250092, -0.14471881091594696, -0.3617521822452545, -0.12198661267757416, -0.1411694437265396, -0.04819782078266144, 0.05570744723081589, 0.12146272510290146, -0.11843276768922806, -0.013039637356996536, -0.024426942691206932, 0.10403992235660553, -0.10458782315254211, 0.015547837130725384, -0.09316565841436386, 0.023360151797533035, -0.09507838636636734, -0.10290220379829407, -0.04919681325554848, -0.02349056676030159, -0.07685697078704834, 0.13987404108047485, -0.06632959097623825, 0.05164608359336853, 0.21594153344631195, 0.03643343225121498, 0.04860004410147667, -0.03824276477098465, 0.2174932211637497, -0.10651476681232452, 0.004577175248414278, 0.1883121132850647, -0.010507632978260517, 0.07504907250404358, 0.14718979597091675, -0.006049822084605694, -0.06113997846841812, 0.021373562514781952, -0.008945688605308533, -0.09400025755167007, -0.19167540967464447, 
-0.1440424621105194, -0.14082567393779755, 0.07228381931781769, 0.027764417231082916, 0.06478160619735718, 0.15267089009284973, 0.05694347992539406, -0.026157980784773827, -0.006062425673007965, -0.001836199895478785, 0.0684063658118248, 0.23882564902305603, -0.07312719523906708, 0.15863476693630219, -0.03803245723247528, -0.148991659283638, 0.08423012495040894, 0.08482091128826141, 0.10031978040933609, 0.04883745685219765, 0.06448925286531448, 0.024240154772996902, 0.09675179421901703, 0.13343869149684906, 0.061844173818826675, 0.022185444831848145, -0.026203498244285583, -0.04098205268383026, -0.028983715921640396, -0.021278686821460724, 0.060860130935907364, 0.10284382104873657, -0.18223826587200165, -0.03360753133893013, -0.12674574553966522, 0.1040237694978714, 0.05630733072757721, 0.10208160430192947, -0.18108898401260376, 0.003050506114959717, 0.09536994993686676, -0.032557163387537, -0.1301579475402832, 0.0850055143237114, 0.06722985208034515, -0.11821058392524719, 0.015397914685308933, -0.02095544897019863, 0.11126109957695007, -0.031123479828238487, 0.10620614886283875, -0.09822628647089005, -0.09220608323812485, 0.013774980790913105, 0.11459406465291977, -0.2817303538322449, 0.21255555748939514, -0.013365482911467552, -0.09690005332231522, -0.11579300463199615, -0.004715035669505596, 0.01031493116170168, 0.0819380059838295, 0.08188709616661072, 0.004763775505125523, -0.04350537434220314, -0.06545519083738327, 0.004248070530593395, 0.019060388207435608, 0.12491510063409805, -0.03152519837021828, -0.01991986855864525, -0.04806303605437279, 0.011300591751933098, -0.0015421127900481224, 0.01802166923880577, 0.01983943209052086, -0.20397375524044037, 0.09817983210086823, 0.07011082023382187, 0.06267958134412766, 0.022111909464001656, -0.009729073382914066, -0.14869281649589539, 0.22171661257743835, -0.02809896692633629, -0.0771496519446373, -0.10737742483615875, -0.05013793334364891, 0.06141689419746399, -0.04688161984086037, 0.041116632521152496, -0.08103673905134201, 0.03962651267647743, -0.07487211376428604, -0.17772899568080902, 0.12835821509361267, -0.08348777145147324, -0.042941171675920486, -0.025282789021730423, 0.1864084005355835, -0.0846327543258667, 0.024463225156068802, 0.025082062929868698, 0.04613133519887924, -0.1508028209209442, -0.10233563929796219, 0.034852705895900726, -0.03161567449569702, 0.061152249574661255, 0.04001912102103233, -0.04340772330760956, -0.021880418062210083, -0.002970570931211114, -0.011320682242512703, 0.33755841851234436, 0.15072861313819885, -0.08390255272388458, 0.17842750251293182, 0.0893208310008049, -0.059796981513500214, -0.3357309103012085, -0.09483138471841812, -0.12697452306747437, -0.025288153439760208, -0.021405275911092758, -0.16736409068107605, 0.04016929119825363, -0.003834964707493782, -0.03241279721260071, 0.10695530474185944, -0.25454771518707275, -0.0813552588224411, 0.1530611217021942, -0.051993608474731445, 0.3512592017650604, -0.1353636085987091, -0.08938448131084442, -0.027285385876893997, -0.11751596629619598, 0.17193706333637238, -0.07499987632036209, 0.11728648841381073, -0.005291353445500135, 0.14058533310890198, 0.0594446137547493, -0.032975900918245316, 0.11937713623046875, -0.008806196972727776, -0.023774700239300728, -0.11386077105998993, -0.08634026348590851, 0.04117414727807045, -0.0033683227375149727, 0.016631845384836197, -0.07227682322263718, 0.02188601717352867, -0.1352013796567917, -0.020081689581274986, -0.10218049585819244, 0.05960511416196823, 0.02450513280928135, -0.07391975075006485, 
-0.044624634087085724, -0.05516489967703819, -0.005635458510369062, 0.015539255924522877, 0.21460223197937012, -0.08054444938898087, 0.18822970986366272, 0.14798088371753693, 0.0750354528427124, -0.1478407084941864, -0.005493863020092249, -0.03167466074228287, -0.051466576755046844, 0.08480866998434067, -0.12806963920593262, 0.05007537081837654, 0.09602449834346771, -0.04866054281592369, 0.08426439762115479, 0.10378798842430115, 0.00529234204441309, 0.003562262048944831, 0.1266377568244934, -0.24999721348285675, -0.05188959464430809, -0.06728829443454742, -0.0003314604109618813, 0.07470729202032089, 0.05989599972963333, 0.18490374088287354, 0.024284880608320236, -0.02933247573673725, -0.001147680333815515, 0.02046186476945877, -0.044130854308605194, 0.034657470881938934, 0.020751554518938065, 0.037146806716918945, -0.13429945707321167, 0.049100302159786224, 0.03831800818443298, -0.14490972459316254, 0.02793888933956623, 0.15596450865268707, -0.09589001536369324, -0.1481330692768097, -0.08178485929965973, 0.046700187027454376, -0.09863997250795364, -0.003060641000047326, -0.016482530161738396, -0.13233160972595215, 0.0721844807267189, 0.10570328682661057, 0.0632612556219101, 0.0926874577999115, -0.05276959761977196, -0.02356615848839283, -0.005535936914384365, -0.013608732260763645, -0.017031483352184296, 0.00047467724652960896, -0.06871786713600159, 0.08851433545351028, -0.02970702014863491, 0.14091646671295166, -0.09792686998844147, -0.08993403613567352, -0.16552817821502686, 0.012215564027428627, -0.11382917314767838, -0.10604946315288544, -0.08692184090614319, -0.06279279291629791, 0.007451801095157862, -0.03690037131309509, -0.05121577903628349, -0.05880220606923103, -0.13037677109241486, 0.025712331756949425, -0.05368543416261673, 0.04174434766173363, -0.08266101777553558, 0.009202999994158745, 0.08040843904018402, -0.025321047753095627, 0.1427605152130127, 0.10867537558078766, -0.09636122733354568, 0.08708717674016953, -0.10933805257081985, -0.10561708360910416, 0.0996691957116127, 0.015306972898542881, 0.049583129584789276, 0.0882255807518959, 0.0016521752113476396, 0.047729525715112686, 0.05468335375189781, 0.05270431563258171, 0.010714812204241753, -0.11086291074752808, 0.06701637804508209, -0.046916425228118896, -0.14871534705162048, -0.0383060947060585, -0.04576587677001953, 0.023959442973136902, 0.014766783453524113, 0.09013618528842926, -0.03914914280176163, 0.09170293807983398, -0.060116108506917953, 0.03544970601797104, -0.005962125025689602, -0.17961421608924866, -0.025273950770497322, -0.07678066939115524, 0.036338284611701965, 0.009770811535418034, 0.27444127202033997, 0.07005681097507477, -0.02579980343580246, 0.03809122368693352, 0.09221042692661285, 0.04013144597411156, 0.01235515158623457, 0.17545528709888458, 0.1118883416056633, -0.06960049271583557, -0.11137815564870834, 0.06916595250368118, 0.02774476259946823, 0.030671268701553345, 0.11741524934768677, 0.028308389708399773, 0.02884010225534439, 0.09906800836324692, -0.010664747096598148, -0.029787031933665276, -0.11760692298412323, -0.12204014509916306, -0.025804514065384865, 0.0716206356883049, -0.08077048510313034, 0.07161043584346771, 0.15348944067955017, -0.024815790355205536, 0.04494022950530052, -0.04187069460749626, -0.05125942826271057, -0.16044385731220245, -0.13453450798988342, -0.06714465469121933, -0.1462772637605667, -0.01798534393310547, -0.10550841689109802, 0.07317068427801132, 0.0905640497803688, 0.054012611508369446, -0.04726504907011986, 0.1088385358452797, 0.04488391429185867, 
-0.10305679589509964, 0.04867062717676163, -0.028838563710451126, 0.10472944378852844, -0.0389210507273674, -0.017331164330244064, -0.08582847565412521, 0.018834469839930534, -0.00014729268150404096, 0.05825033783912659, -0.046236004680395126, 0.0008270144462585449, -0.15260833501815796, -0.09336699545383453, -0.067965067923069, 0.06514385342597961, -0.03987060859799385, 0.16180802881717682, 0.006504714023321867, -0.019208962097764015, 0.027142219245433807, 0.2522609531879425, -0.09437862038612366, -0.03748486936092377, -0.04197218641638756, 0.18040455877780914, 0.026413097977638245, 0.10892705619335175, -0.04485548287630081, -0.022241802886128426, -0.11629731208086014, 0.34659260511398315, 0.3378466069698334, -0.09852437674999237, 0.030723217874765396, 0.025790590792894363, 0.035837359726428986, 0.11793298274278641, 0.10910405218601227, 0.1060420423746109, 0.2840439975261688, -0.07986118644475937, -0.04112262278795242, -0.015220209956169128, -0.020856084302067757, -0.09601940214633942, 0.10299155116081238, 0.04667018726468086, -0.07291784137487411, -0.0344402976334095, 0.06851313263177872, -0.2328837513923645, 0.11787183582782745, -0.1064424142241478, -0.21074175834655762, -0.06903064996004105, 0.035422950983047485, 0.14039748907089233, -0.0034193163737654686, 0.10519111156463623, -0.008336585015058517, -0.09666894376277924, 0.04311183840036392, 0.018499037250876427, -0.19733557105064392, 0.004534261301159859, 0.0827038586139679, -0.06582990288734436, -0.009068409912288189, -0.034264057874679565, 0.03624148294329643, 0.08922430127859116, 0.05437808111310005, -0.01910148747265339, 0.05042428895831108, 0.014998827129602432, -0.0677507072687149, -0.0006768365274183452, 0.03367190062999725, 0.02013298124074936, -0.1020888015627861, 0.08180907368659973, -0.17628344893455505, 0.053729098290205, -0.013982406817376614, -0.0291798934340477, -0.0029725844506174326, -0.01890522800385952, -0.055641546845436096, 0.06260602921247482, 0.07296348363161087, -0.011743206530809402, -0.02574125863611698, -0.04272527992725372, -0.04611913114786148, -0.03326372802257538, -0.08508988469839096, -0.11325842887163162, -0.13927727937698364, -0.09787482768297195, 0.09634150564670563, -0.008824790827929974, -0.18197888135910034, 0.0028776151593774557, -0.06104812026023865, 0.08618874102830887, -0.14188407361507416, 0.08006776124238968, 0.08249889314174652, 0.00013928746921010315, -0.0090348981320858, -0.021353812888264656, 0.04725958779454231, 0.07949267327785492, -0.11032622307538986, -0.06586744636297226 ]
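The record above describes a conversational GPT-2 model (PhilipTheGreat/DiabloGPT-small-Traveller) with no documented prompt format. The sketch below assumes it follows the usual DialoGPT-style convention of separating turns with the end-of-sequence token; that convention, the three-turn loop, and the generation settings are assumptions for illustration, not details from the card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PhilipTheGreat/DiabloGPT-small-Traveller"  # repo id taken from the record above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

chat_history_ids = None
for _ in range(3):  # a short three-turn chat, purely illustrative
    user_input = input(">> You: ")
    # DialoGPT-style convention (assumed): each turn ends with the EOS token
    new_ids = tokenizer.encode(user_input + tokenizer.eos_token, return_tensors="pt")
    # append the new user turn to the running conversation history
    bot_input_ids = new_ids if chat_history_ids is None else torch.cat([chat_history_ids, new_ids], dim=-1)
    chat_history_ids = model.generate(
        bot_input_ids,
        max_length=500,
        pad_token_id=tokenizer.eos_token_id,
    )
    # decode only the newly generated tokens (the bot's reply)
    reply = tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True)
    print("Bot:", reply)
```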
null
null
transformers
### **GPT-Macbeth** A custom finetune of GPT-2 trained on a custom dataset of victorian literature ## Information The goal of this finetune is to output high-quality victorian literature, while being customizable with Author's Note and being light to run (aka not being a GPT-Neo or GPT-Jax finetune, for now at least). ## Authors Note Author's Note was added manually, so please appreciate it. :) The format of it is [ Author: George Eliot; Genre: Horror, fantasy, novel; Tags: scary, magical, victorian ] Some words will work well, some won't. Please make sure to have spaces before each ][. Most popular victorian authors should work, but keep in mind that some authors (e.g. Mark Twain) will result in somewhat weird behavior due to a quirk in the dataset that will be addressed in the next version of the finetune. When it comes to the genres, "novel", "fiction", "horror" and "romance" work best, but from playing around with it, I've noticed that most other not too specific genres work pretty well too. The tags are a bit complicated. Adding "normal" will result in a story without anything special (like no magic or fantasy element) and tends to be pretty low-paced. Using "real-life" will push the AI towards a historical/biographical path. Almost all tags should work. Using "man" or "woman" is supposed to semi-determine what gender the main character is, but it heavily depends on the chosen author. ## History Version 0 - This was the first test version of the finetune, trained on GPT-2-small and with a really small dataset. The name was GPT-Kelini before it was renamed to GPT-Macbeth in V1. Version 1 - The current version of the finetune. Trained on GPT-2-medium with a much, much bigger dataset compared to V0. Supports Author's Note ### Notes Please use a very low temperature/randomness when using it, if you want to get anything out of it. Pumping the repetition penalty up helps a lot too. The model was specifically converted to PyTorch so that most front-end GUIs should run it. It has only been tested on KoboldAI, but should theoretically work on others too. For some odd reason, my finetune is capable of writing victorian NSFW content, if used the right way. No NSFW was in the dataset and considering the size of the model, it's really odd to see it do so. Perhaps the countless romantic novels in the dataset had something naughty in them, but I highly doubt it. You may sometimes get roman numerals on random occasions, this shouldn't happen often, but if it does, it's again something that will be (manually, unfortunately) addressed in the next version of the finetune. If you are wondering why I renamed my finetune to Macbeth, there are a few reasons: First, it sounds much better and smoother than Kelini, second, it's a play by Shakespeare that closely matches the writing style of some of the authors in my dataset, and third, the most important reason, it was mentioned in Hamilton, so yes, my love for Hamilton is bleeding everywhere and yes, the next version of the dataset will try to have a Hamilton easter egg featuring the Author's Note. ### Credits I want to thank HuggingFace for their tokenizer and everything they've done to make everything easier. Then there is OpenAI for making GPT-2. I also want to thank the most active people on the AIM Discord server in the community-projects channel. Thanks to Bran for finding a way to convert checkpoints to a PyTorch model, thanks to Mr. Seeker and Aedial for helping me in cleaning the dataset and to *finetune* from the NovelAI team for perhaps making my finetune output much better quality by telling me about the magic of the <\|endoftext\|> token. P.S. If you happen to use it in something commercial or in an online demo or in any other way that is not for personal use, a credit will be greatly appreciated (and if you do something exciting with it, make sure to let me know, I'd be more than happy to see it being used by someone!).
{}
null
Philipuss/GPT-Macbeth
[ "transformers", "pytorch", "tensorboard", "gpt2", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #tensorboard #gpt2 #endpoints_compatible #text-generation-inference #region-us
### GPT-Macbeth A custom finetune of GPT-2 trained on a custom dataset of victorian literature ## Information The goal of this finetune is to output high-quality victorian literature, while being customizable with Author's Note and being light to run (aka not being a GPT-Neo or GPT-Jax finetune, for now at least). ## Authors Note Author's Note was added manually, so please appreciate it. :) The format of it is [ Author: George Eliot; Genre: Horror, fantasy, novel; Tags: scary, magical, victorian ] Some words will work well, some won't. Please make sure to have spaces before each ][. Most popular victorian authors should work, but keep in mind that some authors (e.g. Mark Twain) will result in somewhat weird behavior due to a quirk in the dataset that will be addressed in the next version of the finetune. When it comes to the genres, "novel", "fiction", "horror" and "romance" work best, but from playing around with it, I've noticed that most other not too specific genres work pretty well too. The tags are a bit complicated. Adding "normal" will result in a story without anything special (like no magic or fantasy element) and tends to be pretty low-paced. Using "real-life" will push the AI towards a historical/biographical path. Almost all tags should work. Using "man" or "woman" is supposed to semi-determine what gender the main character is, but it heavily depends on the chosen author. ## History Version 0 - This was the first test version of the finetune, trained on GPT-2-small and with a really small dataset. The name was GPT-Kelini before it was renamed to GPT-Macbeth in V1. Version 1 - The current version of the finetune. Trained on GPT-2-medium with a much, much bigger dataset compared to V0. Supports Author's Note ### Notes Please use a very low temperature/randomness when using it, if you want to get anything out of it. Pumping the repetition penalty up helps a lot too. The model was specifically converted to PyTorch so that most front-end GUIs should run it. It has only been tested on KoboldAI, but should theoretically work on others too. For some odd reason, my finetune is capable of writing victorian NSFW content, if used the right way. No NSFW was in the dataset and considering the size of the model, it's really odd to see it do so. Perhaps the countless romantic novels in the dataset had something naughty in them, but I highly doubt it. You may sometimes get roman numerals on random occasions, this shouldn't happen often, but if it does, it's again something that will be (manually, unfortunately) addressed in the next version of the finetune. If you are wondering why I renamed my finetune to Macbeth, there are a few reasons: First, it sounds much better and smoother than Kelini, second, it's a play by Shakespeare that closely matches the writing style of some of the authors in my dataset, and third, the most important reason, it was mentioned in Hamilton, so yes, my love for Hamilton is bleeding everywhere and yes, the next version of the dataset will try to have a Hamilton easter egg featuring the Author's Note. ### Credits I want to thank HuggingFace for their tokenizer and everything they've done to make everything easier. Then there is OpenAI for making GPT-2. I also want to thank the most active people on the AIM Discord server in the community-projects channel. Thanks to Bran for finding a way to convert checkpoints to a PyTorch model, thanks to Mr. Seeker and Aedial for helping me in cleaning the dataset and to *finetune* from the NovelAI team for perhaps making my finetune output much better quality by telling me about the magic of the <\|endoftext\|> token. P.S. If you happen to use it in something commercial or in an online demo or in any other way that is not for personal use, a credit will be greatly appreciated (and if you do something exciting with it, make sure to let me know, I'd be more than happy to see it being used by someone!).
[ "### GPT-Macbeth\nA custom finetune of GPT-2 trained on a custom dataset of victorian literature", "## Information\nThe goal of this finetune is to output high-quality victorian literature, while being customizable with Author's Note and being light to run (aka not being a GPT-Neo or GPT-Jax finetune, for now at least).", "## Authors Note\nAuthor's Note was added manually, so please appreciate it. :)\n\nThe format of it is [ Author: George Eliot; Genre: Horror, fantasy, novel; Tags: scary, magical, victorian ]\nSome words will work well, some won't. Please make sure to have spaces before each ][.\n\nMost popular victorian authors should work, but keep in mind that some authors (e.g. Mark Twain) will result in a somewhat weird behavior due to a quirk in the dataset that will be addressed in the next version of the finetune.\n\nWhen it comes to the genres, \"novel\", \"fiction\", \"horror\" and \"romance\" work best, but from playing around with it, I've noticed that most other not too specific genres work pretty well too.\n\nThe tags are a bit complicated. Adding \"normal\" will result in a story without anything special (like no magic or fantasy element) and tends to be pretty low-pace. Using \"real-life\" will push the AI towards a historical/biographical path. Almost all tags should work. Using \"man\" or \"woman\" is supposed to semi-determine what gender the main character is, but it heavily depends on the chosen author.", "## History\nVersion 0 - This was the first test version of the finetune, trained on GPT-2-small and with a really small dataset. The name was GPT-Kelini before it was renamed to GPT-Macbeth in V1.\n\nVersion 1 - The current version of the finetune. Trained on GPT-2-medium with a much, much bigger dataset compared to V0. Supports Author's Note", "### Notes\nPlease use a very low temperature/randomness when using it, if you want to get anything out of it. Pumping the repetition penalty up helps a lot too.\n\nThe model was specifically converted to PyTorch so that most front-end GUI's should run it. It has been only tested on KoboldAI, but should theoretically work on others too.\n\nFor some odd reason, my finetune is capable of writing victorian NSFW content, if used the right way. No NSFW was in the dataset and considering the size of the model, it's really odd to see it do so. Perhaps the countless romantic novels in the dataset had something naughty in them, but I highly doubt it.\n\nYou may sometimes get roman numerals on random occasions, this shouldn't happen often, but if it does, it's again something that will be (manually, unfortunately) addressed in the next version of the finetune.\n\nIf you are wondering why I renamed my finetune to Macbeth, there are a few reasons: First, it sounds much better and smoother than Kelini, second, it's a play by Shakespeare that closely matches the writing style of some of the authors in my dataset, and third, the most important reason, it's was mentioned in Hamilton, so yes, my love with Hamilton is bleeding everywhere and yes, the next version of the dataset will try to have a Hamilton easter egg featuring the Author's Note.", "### Credits\nI want to thank HuggingFace for their tokenizer and everything they've done to make everything easier. Then is OpenAI for making GPT-2. I also want to thank most active people on the AIM Discord server in the community-projects channel. Thanks to Bran for finding a way to convert checkpoints to a PyTorch model, thanks to Mr. 
Seeker and Aedial for helping me in cleaning the dataset and to *finetune* from the NovelAI team for perhaps making my finetune output much better quality by telling me about the magic of the <\\|endoftext\\|> token.\n\n\n\n\nP.S. If you happen to use it in something commercial or in an online demo or in any other way that is not for personal use, a credit will be greatly appreciated (and if you do something exciting with it, make sure to let me know, I'd be more than happy to see it being used by someone!)." ]
[ "TAGS\n#transformers #pytorch #tensorboard #gpt2 #endpoints_compatible #text-generation-inference #region-us \n", "### GPT-Macbeth\nA custom finetune of GPT-2 trained on a custom dataset of victorian literature", "## Information\nThe goal of this finetune is to output high-quality victorian literature, while being customizable with Author's Note and being light to run (aka not being a GPT-Neo or GPT-Jax finetune, for now at least).", "## Authors Note\nAuthor's Note was added manually, so please appreciate it. :)\n\nThe format of it is [ Author: George Eliot; Genre: Horror, fantasy, novel; Tags: scary, magical, victorian ]\nSome words will work well, some won't. Please make sure to have spaces before each ][.\n\nMost popular victorian authors should work, but keep in mind that some authors (e.g. Mark Twain) will result in a somewhat weird behavior due to a quirk in the dataset that will be addressed in the next version of the finetune.\n\nWhen it comes to the genres, \"novel\", \"fiction\", \"horror\" and \"romance\" work best, but from playing around with it, I've noticed that most other not too specific genres work pretty well too.\n\nThe tags are a bit complicated. Adding \"normal\" will result in a story without anything special (like no magic or fantasy element) and tends to be pretty low-pace. Using \"real-life\" will push the AI towards a historical/biographical path. Almost all tags should work. Using \"man\" or \"woman\" is supposed to semi-determine what gender the main character is, but it heavily depends on the chosen author.", "## History\nVersion 0 - This was the first test version of the finetune, trained on GPT-2-small and with a really small dataset. The name was GPT-Kelini before it was renamed to GPT-Macbeth in V1.\n\nVersion 1 - The current version of the finetune. Trained on GPT-2-medium with a much, much bigger dataset compared to V0. Supports Author's Note", "### Notes\nPlease use a very low temperature/randomness when using it, if you want to get anything out of it. Pumping the repetition penalty up helps a lot too.\n\nThe model was specifically converted to PyTorch so that most front-end GUI's should run it. It has been only tested on KoboldAI, but should theoretically work on others too.\n\nFor some odd reason, my finetune is capable of writing victorian NSFW content, if used the right way. No NSFW was in the dataset and considering the size of the model, it's really odd to see it do so. Perhaps the countless romantic novels in the dataset had something naughty in them, but I highly doubt it.\n\nYou may sometimes get roman numerals on random occasions, this shouldn't happen often, but if it does, it's again something that will be (manually, unfortunately) addressed in the next version of the finetune.\n\nIf you are wondering why I renamed my finetune to Macbeth, there are a few reasons: First, it sounds much better and smoother than Kelini, second, it's a play by Shakespeare that closely matches the writing style of some of the authors in my dataset, and third, the most important reason, it's was mentioned in Hamilton, so yes, my love with Hamilton is bleeding everywhere and yes, the next version of the dataset will try to have a Hamilton easter egg featuring the Author's Note.", "### Credits\nI want to thank HuggingFace for their tokenizer and everything they've done to make everything easier. Then is OpenAI for making GPT-2. I also want to thank most active people on the AIM Discord server in the community-projects channel. 
Thanks to Bran for finding a way to convert checkpoints to a PyTorch model, thanks to Mr. Seeker and Aedial for helping me in cleaning the dataset and to *finetune* from the NovelAI team for perhaps making my finetune output much better quality by telling me about the magic of the <\\|endoftext\\|> token.\n\n\n\n\nP.S. If you happen to use it in something commercial or in an online demo or in any other way that is not for personal use, a credit will be greatly appreciated (and if you do something exciting with it, make sure to let me know, I'd be more than happy to see it being used by someone!)." ]
[ 38, 26, 56, 288, 95, 329, 209 ]
[ "passage: TAGS\n#transformers #pytorch #tensorboard #gpt2 #endpoints_compatible #text-generation-inference #region-us \n### GPT-Macbeth\nA custom finetune of GPT-2 trained on a custom dataset of victorian literature## Information\nThe goal of this finetune is to output high-quality victorian literature, while being customizable with Author's Note and being light to run (aka not being a GPT-Neo or GPT-Jax finetune, for now at least).## Authors Note\nAuthor's Note was added manually, so please appreciate it. :)\n\nThe format of it is [ Author: George Eliot; Genre: Horror, fantasy, novel; Tags: scary, magical, victorian ]\nSome words will work well, some won't. Please make sure to have spaces before each ][.\n\nMost popular victorian authors should work, but keep in mind that some authors (e.g. Mark Twain) will result in a somewhat weird behavior due to a quirk in the dataset that will be addressed in the next version of the finetune.\n\nWhen it comes to the genres, \"novel\", \"fiction\", \"horror\" and \"romance\" work best, but from playing around with it, I've noticed that most other not too specific genres work pretty well too.\n\nThe tags are a bit complicated. Adding \"normal\" will result in a story without anything special (like no magic or fantasy element) and tends to be pretty low-pace. Using \"real-life\" will push the AI towards a historical/biographical path. Almost all tags should work. Using \"man\" or \"woman\" is supposed to semi-determine what gender the main character is, but it heavily depends on the chosen author.## History\nVersion 0 - This was the first test version of the finetune, trained on GPT-2-small and with a really small dataset. The name was GPT-Kelini before it was renamed to GPT-Macbeth in V1.\n\nVersion 1 - The current version of the finetune. Trained on GPT-2-medium with a much, much bigger dataset compared to V0. Supports Author's Note" ]
[ -0.004667116794735193, -0.026720361784100533, -0.006730477325618267, 0.06994504481554031, 0.036347806453704834, 0.014951755292713642, -0.08585047721862793, 0.08391354233026505, -0.0746341124176979, 0.059647489339113235, 0.026308134198188782, 0.04567893594503403, 0.02532191015779972, -0.011346262879669666, 0.15468275547027588, -0.15175171196460724, 0.03432217985391617, -0.07842862606048584, 0.0590866394340992, 0.08954936265945435, 0.10966618359088898, -0.0486312210559845, 0.0381869301199913, -0.00986308790743351, 0.012049001641571522, 0.008730540052056313, 0.003424891270697117, -0.00021503696916624904, 0.1335204839706421, 0.10920725762844086, 0.03560234606266022, -0.0327160581946373, -0.0405428484082222, -0.15015031397342682, 0.02490869350731373, 0.10050664842128754, 0.01886899210512638, 0.0019582051318138838, 0.12055882811546326, -0.009720009751617908, 0.1825992614030838, -0.09455183148384094, 0.022683370858430862, 0.04427805170416832, -0.09191315621137619, -0.23078621923923492, -0.059633996337652206, 0.1301032304763794, -0.012801451608538628, 0.012539144605398178, -0.039086032658815384, 0.07258480042219162, -0.0006818920955993235, 0.03526080399751663, 0.26918986439704895, -0.1957130879163742, -0.030243713408708572, 0.05518944561481476, 0.10577671974897385, 0.11254914104938507, -0.10174985975027084, 0.036477070301771164, -0.050932783633470535, 0.06557460874319077, 0.022676410153508186, -0.006643880624324083, 0.10428883880376816, -0.005824710708111525, -0.13163155317306519, -0.03585804998874664, 0.08657627552747726, -0.03754543885588646, -0.085422083735466, -0.10456746071577072, 0.0052378117106854916, 0.07792655378580093, -0.01650400273501873, -0.09000357985496521, 0.0003723034751601517, 0.03533848747611046, 0.11017723381519318, -0.09593803435564041, -0.1235462874174118, 0.03443928062915802, -0.07333678007125854, 0.08530442416667938, -0.01719171553850174, 0.02070043981075287, 0.027006570249795914, 0.06448162347078323, -0.1843363493680954, 0.03092021867632866, -0.09707249701023102, -0.08805812895298004, -0.11766333132982254, -0.08940091729164124, -0.03705344721674919, -0.04844066500663757, -0.0038925386033952236, 0.10732489079236984, -0.0595405139029026, -0.006929606199264526, 0.008767823688685894, 0.015984153375029564, 0.0560893788933754, 0.11920079588890076, -0.09564684331417084, 0.0006295949569903314, 0.049929577857255936, -0.05221258103847504, 0.063740074634552, 0.021651271730661392, 0.04030153900384903, -0.07380028814077377, 0.07635995000600815, 0.044270358979701996, 0.08591313660144806, 0.03252357989549637, -0.0480726920068264, 0.01794131100177765, 0.12875013053417206, -0.06679186969995499, 0.011126824654638767, 0.008760055527091026, 0.013655033893883228, 0.03334486111998558, 0.016356516629457474, -0.011843613348901272, -0.09574348479509354, 0.13522006571292877, -0.05417714640498161, -0.04611959308385849, 0.051358986645936966, -0.10416471213102341, 0.04610762372612953, 0.009587442502379417, -0.09466671198606491, -0.08392446488142014, -0.08450097590684891, -0.08187185227870941, -0.011054420843720436, 0.018478060141205788, -0.028836671262979507, 0.042434077709913254, -0.07454940676689148, -0.014040638692677021, 0.05722707882523537, 0.10545378923416138, -0.034041255712509155, 0.0524703748524189, -0.1207282692193985, 0.033400990068912506, -0.00009169204713543877, 0.022917795926332474, -0.12496844679117203, -0.02058148942887783, -0.21540746092796326, 0.13562504947185516, -0.046231940388679504, -0.0012427670881152153, -0.09791465848684311, -0.02946302480995655, -0.014457806013524532, 
0.033627819269895554, -0.05496411770582199, 0.04394624009728432, -0.13182958960533142, -0.05608504265546799, 0.1168714091181755, -0.10757116228342056, -0.009166453033685684, 0.1791105717420578, 0.025341426953673363, 0.05396900698542595, 0.18185092508792877, 0.053988587111234665, 0.10712387412786484, -0.027964703738689423, -0.0643308013677597, -0.0006958938320167363, -0.1566912829875946, 0.20320935547351837, 0.04879244044423103, -0.06972049176692963, 0.044635504484176636, 0.0343441367149353, -0.10954365134239197, -0.07324086874723434, 0.03300737962126732, -0.012871921993792057, -0.042743511497974396, -0.006708813831210136, 0.0447213351726532, 0.013880561105906963, -0.07782639563083649, -0.0772530660033226, -0.16525113582611084, -0.17040599882602692, 0.04015751928091049, 0.005585123784840107, 0.11477843672037125, -0.042600005865097046, 0.08658046275377274, 0.11435512453317642, 0.06531482189893723, -0.13836658000946045, -0.047458216547966, 0.021126672625541687, -0.037117090076208115, 0.06627533584833145, 0.08874072134494781, 0.014181597158312798, 0.05931316316127777, -0.04606205224990845, 0.026240630075335503, -0.019571172073483467, -0.035552978515625, -0.0905846506357193, -0.1791507601737976, -0.011448358185589314, -0.04508567601442337, 0.06848923116922379, -0.05722955986857414, -0.02172376587986946, 0.09489820897579193, 0.037939585745334625, -0.00037837683339603245, -0.0494096577167511, 0.04742513224482536, 0.03781507909297943, -0.023429062217473984, -0.037314996123313904, 0.06362554430961609, -0.03958303481340408, -0.054039549082517624, 0.07579807192087173, -0.14229385554790497, -0.1668664664030075, 0.09062736481428146, -0.061289459466934204, -0.062437158077955246, 0.06387995183467865, -0.009410927072167397, 0.0042571816593408585, -0.08781716972589493, -0.08206211775541306, 0.16333860158920288, 0.05904994159936905, -0.00034507986856624484, -0.07073870301246643, -0.07940671592950821, -0.04869794473052025, -0.003281380981206894, -0.018012693151831627, 0.13511231541633606, -0.081241674721241, -0.13552415370941162, -0.030424030497670174, 0.03616206347942352, -0.01373008731752634, 0.10570398718118668, 0.06404447555541992, -0.0751437321305275, 0.0033870553597807884, -0.013806619681417942, 0.05757179856300354, -0.09708229452371597, -0.11269258707761765, -0.015067541971802711, 0.04033025726675987, 0.0029777565505355597, -0.014876359142363071, -0.0672568827867508, -0.005985673516988754, 0.02269042655825615, -0.04307028651237488, -0.022738484665751457, 0.013994968496263027, 0.005121034570038319, 0.0950716957449913, 0.035123370587825775, 0.12030092626810074, 0.009786361828446388, -0.014788249507546425, -0.11780506372451782, 0.09968999773263931, -0.11433296650648117, -0.26302701234817505, -0.15607793629169464, 0.027609366923570633, -0.017914041876792908, 0.051377374678850174, 0.07345303148031235, -0.08844485878944397, -0.07022415101528168, -0.07147058844566345, 0.17190410196781158, -0.08901247382164001, -0.06739774346351624, -0.13807553052902222, 0.017685867846012115, 0.009563985280692577, -0.13100215792655945, 0.02334205061197281, 0.026158366352319717, -0.12547381222248077, 0.054233573377132416, 0.007092849351465702, 0.06887851655483246, 0.08438307791948318, -0.014038054272532463, -0.05915671959519386, -0.034421760588884354, 0.22029131650924683, -0.06483078747987747, 0.08423946797847748, 0.06480103731155396, -0.02598968707025051, 0.09556058794260025, 0.047057170420885086, 0.03506801649928093, -0.007339829113334417, -0.00010074634337797761, 0.04231317713856697, -0.053148336708545685, 
-0.19419999420642853, -0.010886483825743198, -0.02221183478832245, 0.010593402199447155, 0.016304580494761467, 0.0879683867096901, 0.0036342875100672245, -0.037035152316093445, -0.11488805711269379, -0.01814497448503971, 0.0921071395277977, 0.10053258389234543, 0.14125393331050873, 0.008373197168111801, 0.028362946584820747, -0.04060351103544235, 0.06773816794157028, 0.11956251412630081, -0.05841965228319168, 0.016989031806588173, -0.0869036391377449, 0.16546107828617096, 0.06931925565004349, 0.07124439626932144, 0.048539966344833374, -0.00046281347749754786, 0.013331970199942589, 0.015085671097040176, -0.03674625605344772, -0.06330420076847076, -0.03803142532706261, 0.0703568384051323, 0.09633223712444305, -0.07250452786684036, 0.01716528832912445, -0.0038225320167839527, 0.06356891989707947, 0.133441761136055, -0.03223167732357979, 0.00385661912150681, -0.12442399561405182, 0.00014381362416315824, -0.06526053696870804, -0.03369440510869026, -0.021533116698265076, 0.026550589129328728, -0.12128352373838425, 0.06831443309783936, 0.039803214371204376, 0.0652623251080513, -0.04234500229358673, -0.0024342602118849754, -0.0375424325466156, 0.0723089724779129, -0.01793719083070755, 0.09222753345966339, -0.11174661666154861, 0.010804368183016777, 0.010399870574474335, 0.10131671279668808, -0.01966952346265316, -0.025738444179296494, 0.05107021704316139, 0.14000087976455688, 0.1259436011314392, 0.0378132238984108, 0.10020623356103897, -0.047223545610904694, 0.05090625584125519, 0.014130459167063236, 0.026475144550204277, -0.09324945509433746, 0.07398441433906555, -0.05680738762021065, 0.00044578209053725004, -0.04656222462654114, 0.1172831580042839, -0.07004858553409576, -0.12865693867206573, 0.07324539124965668, -0.08506470918655396, 0.07516670972108841, -0.06572413444519043, -0.02886383980512619, -0.12145301699638367, 0.15111902356147766, -0.050733163952827454, -0.07850519567728043, -0.0660683810710907, 0.02842613309621811, 0.10635757446289062, -0.05113549157977104, -0.03396809101104736, -0.055338304489851, 0.1488683670759201, -0.07683955878019333, -0.00478345574811101, -0.043098900467157364, -0.006942465901374817, -0.22133111953735352, -0.015175770036876202, 0.1577596664428711, 0.0903526321053505, 0.07299891114234924, 0.014444881118834019, 0.052161671221256256, -0.02256556786596775, -0.06703021377325058, 0.11490874737501144, 0.10800614953041077, -0.0883573517203331, 0.011481259018182755, 0.05278156325221062, -0.01176388468593359, -0.15299420058727264, -0.1013120710849762, 0.12444004416465759, 0.14260774850845337, -0.06228213012218475, 0.15921103954315186, 0.19330763816833496, -0.13374067842960358, -0.2125755101442337, -0.0845535397529602, 0.026892883703112602, -0.04359087347984314, -0.030899368226528168, -0.13172930479049683, 0.157185360789299, 0.07323842495679855, 0.037901364266872406, -0.010634447447955608, -0.24582785367965698, -0.1188737154006958, -0.0410265289247036, 0.023367024958133698, -0.03655055910348892, -0.149488165974617, -0.04878709092736244, -0.017835546284914017, -0.029314910992980003, 0.050320643931627274, -0.07889293134212494, 0.0777738094329834, -0.043573275208473206, 0.020844167098402977, 0.027171747758984566, -0.0063057513907551765, 0.13769282400608063, -0.04789254069328308, -0.008696932345628738, -0.0663888081908226, 0.07598523050546646, 0.1735701560974121, -0.04322037100791931, 0.047813985496759415, 0.03939791023731232, -0.026577679440379143, 0.023627381771802902, -0.055670954287052155, -0.14068950712680817, 0.08416468650102615, -0.0830039381980896, 
-0.010899091139435768, -0.08874692767858505, 0.04629196971654892, 0.05416061356663704, -0.03137921169400215, -0.10449564456939697, -0.10627272725105286, 0.052650511264801025, -0.06612826138734818, 0.13334079086780548, -0.06234762817621231, -0.21081258356571198, 0.025373071432113647, -0.023786362260580063, 0.028926797211170197, -0.025870565325021744, 0.054866474121809006, 0.11472366750240326, -0.0037307001184672117, 0.05991319566965103, 0.0062899296171963215, -0.14202183485031128, -0.007324819918721914, 0.0837957039475441, -0.10194728523492813, -0.21055163443088531, -0.006333647295832634, -0.0505792535841465, -0.05770500749349594, -0.10790105909109116, 0.12298480421304703, -0.0002274725993629545, -0.05970025062561035, 0.010090597905218601, 0.02650574967265129, 0.007487261667847633, 0.06779074668884277, -0.02605493925511837, 0.027689527720212936, -0.06031859666109085, 0.058616753667593, 0.08924084901809692, -0.08655621111392975, -0.023785758763551712, 0.20208241045475006, -0.09244155883789062, -0.05786442756652832, -0.027994690462946892, -0.005479868967086077, -0.018243037164211273, -0.0014357210602611303, 0.08529386669397354, -0.05389653518795967, 0.070990651845932, 0.12943653762340546, -0.02645454742014408, 0.051704950630664825, -0.053627513349056244, -0.0034261622931808233, -0.027952102944254875, 0.05827883630990982, 0.04034234955906868, 0.014328081160783768, -0.06733451038599014, 0.25484445691108704, 0.002394638955593109, -0.012274768203496933, 0.010681893676519394, -0.04748555272817612, 0.02991202473640442, -0.01585695520043373, -0.15423960983753204, 0.05335536226630211, -0.0780109316110611, 0.007753488142043352, -0.017123311758041382, 0.013939395546913147, 0.0341346301138401, 0.0005320712807588279, -0.0337035097181797, -0.08214890956878662, -0.048476677387952805, 0.05008849501609802, -0.15910308063030243, -0.02711009606719017, 0.06746078282594681, -0.026025408878922462, 0.08606261759996414, -0.025154827162623405, -0.08867695927619934, 0.049523089081048965, -0.18832778930664062, -0.009337262250483036, -0.01166266668587923, 0.0029439779464155436, -0.04383174702525139, -0.12922585010528564, -0.02426963858306408, -0.0336555540561676, -0.046313319355249405, 0.05138561874628067, 0.0865173265337944, -0.06137581914663315, 0.01758226566016674, 0.09671075642108917, -0.05120144411921501, -0.06009715050458908, -0.011687779799103737, 0.11390689760446548, 0.017287271097302437, 0.1531599760055542, -0.060870349407196045, 0.02417835406959057, -0.08980764448642731, 0.02638416737318039, 0.003049151971936226, -0.0012221195502206683, -0.033855609595775604, -0.04266154766082764, 0.015032805502414703, -0.029457926750183105, 0.01743018627166748, 0.006364210043102503, -0.03544678911566734, 0.06971019506454468, -0.01033679861575365, -0.11554713547229767, 0.001002771663479507, 0.04723471403121948, -0.007669172249734402, -0.030124079436063766, 0.0002921479463111609, -0.042583998292684555, 0.012753956019878387, 0.05380401760339737, 0.1477583646774292, 0.039659254252910614, 0.26269033551216125, 0.02954821288585663, -0.0326278991997242, -0.011843318119645119, -0.06475294381380081, 0.019816577434539795, -0.00559525191783905, 0.0443880520761013, -0.062799833714962, 0.06456147879362106, 0.1355140209197998, -0.07371706515550613, 0.08201320469379425, -0.0689530298113823, -0.06749729067087173, -0.029648181051015854, -0.15211902558803558, -0.022767115384340286, 0.054462503641843796, -0.026392947882413864, -0.09945037961006165, 0.08871150761842728, -0.03405212610960007, 0.021369723603129387, -0.025694720447063446, 
0.01885995827615261, -0.09970559179782867, -0.1512356698513031, 0.10627304017543793, -0.037810321897268295, 0.021341778337955475, 0.11052113026380539, 0.07652802020311356, 0.03998729586601257, -0.019654173403978348, 0.020167676731944084, 0.1224648728966713, 0.01670048199594021, 0.000013566133929998614, -0.1523023247718811, -0.08433067053556442, 0.029383737593889236, 0.04811810329556465, -0.009339452721178532, 0.16770268976688385, 0.045966316014528275, -0.08699571341276169, 0.021733904257416725, 0.18249472975730896, 0.040299564599990845, -0.0967802181839943, -0.07660428434610367, 0.1529066264629364, -0.014416278339922428, -0.04044685512781143, 0.029629187658429146, -0.08758893609046936, 0.09623879939317703, 0.16418954730033875, 0.11504633724689484, 0.06439675390720367, -0.0007325813639909029, -0.00042479083640500903, 0.005789272487163544, 0.05544503405690193, 0.07604743540287018, 0.02593592181801796, 0.15488936007022858, -0.09501567482948303, 0.19151249527931213, -0.08114468306303024, 0.030569901689887047, -0.05478604510426521, 0.08087693899869919, -0.024823812767863274, -0.00018874616944231093, -0.08080121129751205, 0.05744277685880661, -0.061076369136571884, -0.2884295880794525, 0.012192094698548317, 0.015745293349027634, -0.0175313837826252, -0.010895591229200363, -0.07740888744592667, 0.01224664505571127, 0.09576426446437836, 0.07746534049510956, 0.008901461027562618, 0.10418318212032318, -0.0389350950717926, -0.1114259585738182, -0.015598958358168602, 0.13654600083827972, -0.06791745871305466, 0.14839226007461548, 0.0073931580409407616, 0.06834227591753006, 0.0849340409040451, -0.011617817915976048, -0.11754753440618515, 0.009377049282193184, -0.02824636735022068, 0.0228112880140543, 0.012544025667011738, 0.2030734419822693, 0.0678877905011177, 0.03514544293284416, 0.07596389204263687, 0.03623761981725693, 0.08119768649339676, 0.09778283536434174, 0.01765485666692257, -0.043064866214990616, 0.11628206819295883, -0.1204245463013649, 0.08895035088062286, 0.16375420987606049, 0.029013080522418022, -0.01061529666185379, -0.10467257350683212, -0.06549033522605896, 0.03292139619588852, 0.06633652001619339, 0.004570419434458017, -0.13747631013393402, 0.038088563829660416, 0.03961002826690674, 0.05615193396806717, -0.16334274411201477, -0.09013287723064423, 0.05979000777006149, 0.008836646564304829, 0.0008997046388685703, 0.12806475162506104, 0.0034363975282758474, -0.007035191636532545, 0.0025127646513283253, 0.022287892177700996, -0.005556153133511543, 0.09424581378698349, -0.05863428860902786, -0.018723640590906143 ]
null
null
null
This is Brain Piano

---
inference:
  parameters:
    temperature: 0.7
---
{}
null
Pikachu/BrainPiano
[ "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #region-us
This is Brain Piano --- inference: parameters: temperature: 0.7 ---
[]
[ "TAGS\n#region-us \n" ]
[ 6 ]
[ "passage: TAGS\n#region-us \n" ]
[ 0.024608636274933815, -0.026205500587821007, -0.009666500613093376, -0.10395516455173492, 0.08638657629489899, 0.059816278517246246, 0.01882290467619896, 0.020661840215325356, 0.23975107073783875, -0.005599027033895254, 0.1219947561621666, 0.0015615287702530622, -0.037353623658418655, 0.03733762726187706, -0.0035912662278860807, -0.17583473026752472, 0.03876631706953049, -0.018274923786520958, 0.01843859627842903, 0.026470553129911423, -0.07776834815740585, -0.07564429938793182, 0.015296397730708122, -0.10247814655303955, -0.083692267537117, 0.11002834886312485, 0.031466204673051834, -0.019670886918902397, 0.10779199749231339, -0.04243955761194229, 0.18699054419994354, -0.011512263678014278, -0.11213519424200058, -0.2536850869655609, 0.021806683391332626, -0.01765260472893715, -0.08747660368680954, 0.01506110467016697, 0.0665089413523674, -0.09014441072940826, -0.0588928684592247, 0.0795099288225174, -0.01132340170443058, 0.04246443510055542, -0.27593839168548584, -0.12684126198291779, -0.05297930911183357, -0.1421966552734375, 0.08651168644428253, 0.04035491496324539, 0.008764253929257393, 0.15506891906261444, -0.20897391438484192, 0.004104613792151213, 0.08255259692668915, -0.2538507878780365, 0.05591634660959244, 0.17671173810958862, 0.03623908758163452, 0.18037272989749908, 0.0060391901060938835, 0.11029672622680664, 0.0716743916273117, -0.024263937026262283, -0.17590197920799255, -0.08127854019403458, -0.04696211963891983, 0.16642488539218903, -0.06727185100317001, -0.14248386025428772, 0.34701237082481384, 0.00015008423360995948, 0.009657775051891804, 0.16921205818653107, -0.059524230659008026, -0.09972117841243744, 0.07259953022003174, 0.016484731808304787, 0.018492350354790688, 0.1471305936574936, 0.16307872533798218, -0.0458691343665123, -0.13837823271751404, -0.018630273640155792, -0.22798998653888702, 0.17510560154914856, -0.03248048573732376, 0.13137903809547424, -0.27447956800460815, 0.01684025302529335, -0.2570667266845703, 0.0032130838371813297, 0.04178816080093384, -0.06004921346902847, -0.0226522795855999, -0.013265985064208508, -0.08018817007541656, 0.004899587947875261, 0.06192673370242119, 0.1266920566558838, -0.06128726154565811, 0.06128238886594772, -0.09319206327199936, 0.141696035861969, 0.07166698575019836, 0.07868369668722153, 0.13037432730197906, 0.041205424815416336, -0.07187089323997498, -0.21872246265411377, -0.0026476888451725245, -0.06275863200426102, -0.09502086788415909, -0.0020165652967989445, -0.11606067419052124, 0.17244569957256317, -0.030802514404058456, -0.09825427830219269, -0.11208184063434601, 0.09148659557104111, -0.032992321997880936, -0.03437839448451996, -0.03552987426519394, -0.020977836102247238, 0.019381176680326462, 0.04704452306032181, -0.1548958420753479, -0.005131472367793322, 0.07039852440357208, 0.11502562463283539, -0.1346137970685959, -0.003783059772104025, -0.07908964157104492, 0.03039063885807991, 0.07654735445976257, -0.16510222852230072, 0.03158547356724739, -0.1124754324555397, -0.07531405985355377, 0.002912673633545637, -0.015710093080997467, -0.016202643513679504, 0.166526660323143, -0.0020451415330171585, 0.0714716836810112, -0.026345307007431984, -0.05890209600329399, -0.11243434250354767, -0.08489254862070084, 0.05390460044145584, 0.03670717030763626, 0.03266148269176483, -0.2193479984998703, 0.014805203303694725, -0.12762966752052307, 0.1360815018415451, -0.10566820204257965, -0.04705966264009476, -0.022842247039079666, 0.20562705397605896, 0.037286072969436646, 0.08762791007757187, -0.22171171009540558, 
0.039756543934345245, -0.05404696613550186, 0.18480908870697021, -0.1502426266670227, -0.0799463614821434, 0.20813211798667908, -0.07964949309825897, -0.10115210711956024, 0.021235812455415726, 0.020391687750816345, 0.026287272572517395, 0.0766737088561058, 0.4564172327518463, -0.09766800701618195, -0.09146861732006073, 0.10178250074386597, 0.17055274546146393, -0.12427149713039398, -0.1827561855316162, 0.06446871906518936, -0.16666454076766968, -0.1973118633031845, 0.0018917324487119913, 0.09222044050693512, 0.038269978016614914, -0.07875611633062363, -0.020746968686580658, 0.06325206160545349, -0.0007678253459744155, 0.09095914661884308, 0.03755716234445572, 0.09034032374620438, -0.08716782182455063, 0.11115926504135132, -0.05017651244997978, 0.004037132486701012, 0.1343354731798172, 0.027325427159667015, -0.03223329409956932, 0.08694463223218918, -0.0485352948307991, 0.05295134335756302, -0.1662379503250122, -0.15068690478801727, 0.03398871049284935, 0.06283251196146011, 0.03186952322721481, 0.1280253529548645, 0.08141885697841644, -0.10732853412628174, 0.022690722718834877, -0.004228927195072174, 0.058398615568876266, 0.03891623765230179, 0.006107209715992212, 0.008764320984482765, 0.0961301177740097, -0.10607069730758667, -0.13589619100093842, -0.07336436957120895, -0.014715781435370445, 0.14371353387832642, -0.0302802175283432, 0.07690227776765823, -0.004240254405885935, 0.00013200697139836848, 0.06930823624134064, 0.08137880265712738, 0.016412746161222458, 0.08971183747053146, -0.05237193778157234, -0.05160155147314072, 0.10863113403320312, -0.13533565402030945, 0.17837053537368774, 0.14053137600421906, -0.20532016456127167, 0.029453208670020103, -0.06838275492191315, 0.03670361638069153, -0.008162540383636951, 0.0975119024515152, -0.08272241055965424, -0.02106042578816414, 0.013134466484189034, 0.0052274600602686405, -0.013007243163883686, 0.017682146281003952, -0.07295988500118256, -0.07787393033504486, -0.10233919322490692, 0.08436838537454605, 0.11562882363796234, -0.10282530635595322, 0.14214380085468292, 0.4384984076023102, 0.11495281755924225, 0.21582984924316406, -0.09581480920314789, -0.0412987545132637, 0.007486371789127588, 0.0001535322517156601, -0.04476691037416458, 0.08031861484050751, -0.15973517298698425, -0.038901735097169876, 0.027348900213837624, 0.07128690183162689, 0.11475157737731934, -0.14959022402763367, -0.09639324247837067, -0.00793045200407505, 0.0022841424215584993, -0.1249532699584961, 0.023905446752905846, -0.03974650055170059, 0.04015624523162842, 0.07232289016246796, -0.021535737439990044, 0.13939237594604492, -0.04166141897439957, -0.0639561116695404, 0.07585346698760986, -0.2017085999250412, -0.23179671168327332, -0.12309670448303223, -0.14680525660514832, 0.04366797208786011, 0.05154111236333847, 0.01726446859538555, -0.17635835707187653, -0.015074856579303741, 0.07706750929355621, 0.07820965349674225, -0.20886357128620148, -0.022814949974417686, -0.004290030337870121, 0.0895976573228836, -0.10227091610431671, -0.0017130117630586028, -0.04419664293527603, -0.10150232166051865, 0.0017003051470965147, 0.07279510796070099, -0.137485533952713, 0.13807645440101624, 0.21589438617229462, 0.07225540280342102, 0.07359948754310608, -0.019093448296189308, 0.09936179965734482, -0.10856141895055771, -0.16549113392829895, 0.08348225057125092, -0.06234746053814888, 0.047262318432331085, 0.17534415423870087, 0.03307317942380905, -0.13904969394207, -0.015682822093367577, -0.0402069091796875, -0.15603256225585938, -0.238995760679245, -0.09178274869918823, 
-0.1182505264878273, 0.16442428529262543, 0.0009358620154671371, 0.06651917099952698, 0.08258313685655594, -0.022042419761419296, 0.16447891294956207, -0.07379321753978729, -0.07578866183757782, -0.006978808436542749, 0.12375060468912125, -0.056660156697034836, -0.03080669604241848, -0.10566964000463486, -0.008295975625514984, 0.1151021271944046, 0.15304014086723328, 0.12214863300323486, 0.2957419455051422, 0.08268889784812927, 0.026645636186003685, 0.08958091586828232, 0.17622539401054382, 0.09495089203119278, 0.07838419824838638, -0.045413073152303696, -0.014814783819019794, 0.014317171648144722, -0.04022889584302902, 0.010141594335436821, 0.14683100581169128, -0.2679629921913147, -0.006678564939647913, -0.2710230350494385, 0.0965198427438736, -0.10913380235433578, 0.11837165057659149, -0.01015760749578476, 0.10194015502929688, 0.11082887649536133, 0.03233652561903, -0.03858073800802231, 0.16613617539405823, 0.08450309932231903, -0.11277695000171661, 0.001758623169735074, 0.03737903758883476, 0.09715615212917328, -0.02818971499800682, 0.12721189856529236, -0.11048974841833115, -0.1464834064245224, 0.013753619976341724, 0.07152791321277618, -0.15373679995536804, 0.3138748109340668, 0.012069208547472954, -0.13481520116329193, -0.01481647603213787, -0.09957809001207352, -0.006440147757530212, 0.1254177987575531, 0.09333524852991104, 0.07935678958892822, -0.2185502052307129, -0.13339371979236603, 0.05872276425361633, -0.00575496768578887, 0.22408108413219452, -0.034034017473459244, -0.11356475204229355, -0.027013886719942093, 0.04241163283586502, -0.06043251231312752, 0.08524788916110992, 0.023536119610071182, -0.08113526552915573, -0.032957352697849274, 0.05323701351881027, 0.012368366122245789, 0.00524376705288887, 0.09360801428556442, 0.020107939839363098, -0.0009265501867048442, 0.01785753294825554, 0.047885000705718994, -0.0675911232829094, -0.1984109878540039, 0.09357594698667526, -0.05215044692158699, 0.0015536568826064467, -0.08013670891523361, -0.15122665464878082, -0.08837161958217621, -0.16009655594825745, 0.12540200352668762, -0.034406669437885284, 0.12700119614601135, -0.06619787961244583, 0.17341409623622894, -0.07871770113706589, 0.04481020197272301, -0.047349292784929276, 0.050332702696323395, -0.007268077693879604, -0.07756082713603973, 0.16585899889469147, -0.15564003586769104, 0.01809087023139, 0.19572502374649048, -0.018915493041276932, 0.07177707552909851, 0.021322092041373253, -0.0636206790804863, 0.23147478699684143, 0.3014698624610901, 0.008138049393892288, 0.1665448248386383, 0.3018903136253357, -0.07466315478086472, -0.2642788887023926, -0.05505012720823288, -0.2841376066207886, -0.05371501296758652, 0.10716094076633453, -0.22523896396160126, 0.06986407935619354, 0.14383509755134583, -0.06471995264291763, 0.30228954553604126, -0.21825523674488068, 0.012589273042976856, 0.15434536337852478, -0.08868814259767532, 0.5515313148498535, -0.1133413165807724, -0.17677772045135498, -0.008122089318931103, -0.08741296827793121, 0.10602109134197235, -0.0340677872300148, 0.06877441704273224, 0.013465235009789467, 0.04797380417585373, 0.048932258039712906, -0.03111894056200981, 0.22701001167297363, 0.008710170164704323, 0.09015397727489471, -0.07378865778446198, -0.18624304234981537, 0.11639340221881866, -0.04359482601284981, -0.08891059458255768, 0.0849778801202774, -0.05942516401410103, -0.11078983545303345, 0.04663389176130295, -0.07950539886951447, -0.024862350896000862, 0.08423490077257156, -0.04678233340382576, -0.042606171220541, -0.008054176345467567, -0.1618063747882843, 
-0.0002289071271661669, 0.31360217928886414, -0.07096036523580551, 0.16695955395698547, 0.03677211329340935, 0.00038613268407061696, -0.11027684062719345, 0.030288029462099075, -0.05203165486454964, -0.021576624363660812, 0.09578979015350342, -0.11096979677677155, 0.03204701095819473, 0.14160704612731934, -0.04864364117383957, 0.05846960097551346, 0.09256096184253693, -0.0849417969584465, 0.007583672646433115, 0.17753590643405914, -0.17537221312522888, -0.1273445188999176, -0.006135711446404457, -0.09862716495990753, 0.14055661857128143, 0.04394126310944557, 0.05191568285226822, 0.16669964790344238, 0.03967129811644554, -0.029474308714270592, -0.02817419543862343, -0.1153380498290062, -0.0201893113553524, 0.040153320878744125, 0.00045633706031367183, -0.08791285753250122, 0.2262638509273529, 0.06409153342247009, -0.1328488290309906, -0.051157206296920776, 0.2161225974559784, -0.06805316358804703, -0.04911920800805092, -0.223562553524971, 0.10752306133508682, -0.07112517952919006, -0.0965060144662857, 0.05453834682703018, -0.02270081453025341, 0.005106312222778797, 0.181985542178154, 0.03941008821129799, 0.11070270836353302, 0.03738937899470329, -0.02448922023177147, 0.15798696875572205, -0.142850860953331, -0.14191335439682007, -0.025354057550430298, -0.08757315576076508, -0.13844476640224457, -0.026804137974977493, 0.1617041826248169, -0.09177309274673462, -0.14772607386112213, -0.2621181011199951, 0.10968475043773651, -0.16432365775108337, -0.10192688554525375, -0.03469514101743698, -0.08968492597341537, 0.0696166530251503, 0.030301768332719803, -0.03093348816037178, -0.06706760823726654, -0.18593791127204895, 0.0816768929362297, 0.06349513679742813, 0.045533183962106705, -0.017847947776317596, 0.0067379772663116455, 0.1720137596130371, 0.025955144315958023, 0.10040043294429779, 0.16762186586856842, 0.011397695168852806, 0.2246655523777008, -0.1671202927827835, -0.11496317386627197, 0.1336962729692459, -0.026543032377958298, 0.06762003898620605, 0.16792191565036774, -0.0772583931684494, 0.015526676550507545, -0.028136352077126503, 0.07066910713911057, -0.11003983020782471, -0.105624258518219, 0.007937257178127766, 0.02567129209637642, -0.2755882740020752, -0.005599735304713249, -0.19717298448085785, 0.14788752794265747, 0.02579621411859989, 0.03297143429517746, 0.10257530212402344, 0.10404334217309952, 0.08312062919139862, -0.0017710148822516203, 0.03226327523589134, -0.1176818460226059, 0.02753005363047123, -0.059239376336336136, -0.020663779228925705, 0.017624232918024063, 0.36952024698257446, -0.03603357449173927, -0.046802736818790436, 0.003710439894348383, 0.1307835876941681, -0.02139742486178875, 0.017395347356796265, 0.13209912180900574, 0.12607666850090027, -0.08595693111419678, -0.1504845917224884, 0.04888554662466049, -0.04565655067563057, -0.02836887165904045, 0.1464131623506546, 0.05905961990356445, 0.1050296202301979, 0.0908031314611435, -0.014463032595813274, -0.00318976235575974, 0.012856799177825451, -0.15486004948616028, 0.06223496049642563, -0.010558074340224266, 0.012565906159579754, 0.017934376373887062, 0.15238402783870697, -0.005540105979889631, 0.07739730179309845, -0.09889880567789078, 0.004208535887300968, -0.13498884439468384, -0.07913459837436676, 0.03617347031831741, -0.13393273949623108, 0.04141177982091904, -0.01871878281235695, 0.029611799865961075, 0.30386561155319214, 0.02558239921927452, -0.020639164373278618, 0.12512871623039246, -0.1214587539434433, -0.12050267308950424, -0.001594188273884356, -0.029960084706544876, 0.0791488066315651, 
-0.02633434161543846, -0.0997740775346756, -0.1001306027173996, -0.15166029334068298, -0.09759195148944855, 0.05182836204767227, -0.04993441700935364, -0.059362251311540604, -0.17634081840515137, -0.05707859992980957, -0.05147340148687363, 0.14025864005088806, -0.12263951450586319, 0.15159130096435547, -0.014490418136119843, 0.004084470681846142, 0.04405883327126503, 0.1950942426919937, -0.03644494712352753, 0.08714226633310318, 0.0154351145029068, 0.1522706001996994, -0.05119588226079941, 0.14720745384693146, -0.10931728035211563, -0.04014137014746666, -0.06710435450077057, 0.21513493359088898, 0.25630924105644226, -0.06136954948306084, -0.008937356993556023, -0.012760217301547527, 0.058654606342315674, 0.1073930487036705, 0.16049085557460785, 0.002326392102986574, 0.2802925705909729, -0.03133585304021835, 0.04815128445625305, 0.02901598811149597, 0.013607407920062542, -0.06336209923028946, 0.03397751972079277, 0.07539387792348862, -0.035039983689785004, -0.1412304788827896, 0.15837742388248444, -0.21980468928813934, 0.18157227337360382, 0.11640069633722305, -0.19996967911720276, -0.013728445395827293, -0.04882071167230606, 0.1689416468143463, -0.0856364443898201, 0.1637246012687683, -0.0903693437576294, -0.2108195722103119, -0.2056000679731369, 0.03867346793413162, -0.34623071551322937, -0.254462867975235, 0.10422009229660034, 0.1488201916217804, 0.04015883058309555, -0.018507536500692368, -0.019967829808592796, -0.018367022275924683, 0.04877542704343796, -0.0067357709631323814, 0.06014643982052803, 0.031397558748722076, -0.02988368645310402, -0.24127542972564697, -0.029804671183228493, 0.023964406922459602, -0.07093082368373871, 0.07464958727359772, -0.06874357163906097, -0.022495782002806664, 0.08059766888618469, -0.03066304884850979, 0.03298592567443848, -0.035373736172914505, -0.16326889395713806, 0.027529051527380943, 0.03900543600320816, 0.036012712866067886, 0.00634160777553916, 0.0008072225609794259, -0.03455270454287529, 0.0644603744149208, -0.16716794669628143, -0.16015739738941193, 0.14140215516090393, -0.06745140254497528, 0.2779497504234314, -0.05812826007604599, -0.0809100940823555, 0.04766704887151718, -0.03426874056458473, 0.1807648241519928, -0.07756473124027252, 0.047254521399736404, 0.12766779959201813, 0.011127962730824947, 0.03121316432952881, -0.3092964291572571, 0.11082969605922699, -0.000795336440205574, -0.006093299947679043, -0.07581598311662674 ]
null
null
transformers
@ Shrek DialoGPT Model
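The card itself gives no usage notes, so the following is only a hedged sketch of how a DialoGPT-style checkpoint is commonly queried with transformers for a single turn; the EOS-delimited chat format is assumed from the `conversational` and `gpt2` tags rather than documented by the author:

```python
# Hedged single-turn sketch for a DialoGPT-style checkpoint; the chat format is assumed, not documented.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PinoCorgi/DialoGPT-small-Shrek1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# DialoGPT convention: each turn ends with the EOS token.
user_input = "What are you doing in my swamp?"
input_ids = tokenizer.encode(user_input + tokenizer.eos_token, return_tensors="pt")

reply_ids = model.generate(
    input_ids,
    max_length=200,
    pad_token_id=tokenizer.eos_token_id,
)
# Decode only the newly generated tokens, i.e. everything after the user turn.
reply = tokenizer.decode(reply_ids[:, input_ids.shape[-1]:][0], skip_special_tokens=True)
print(reply)
```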
{"tags": ["conversational"]}
text-generation
PinoCorgi/DialoGPT-small-Shrek1
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
@ Shrek DialoGPT Model
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 51 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.009697278961539268, 0.03208012506365776, -0.007204889785498381, 0.004809224978089333, 0.16726240515708923, 0.014898733235895634, 0.09765533357858658, 0.13672804832458496, -0.007841327227652073, -0.031050153076648712, 0.14490588009357452, 0.20411323010921478, -0.006439372431486845, 0.0661218985915184, -0.07572533935308456, -0.2683109939098358, 0.05759621039032936, 0.046649303287267685, 0.016515716910362244, 0.1200079694390297, 0.08573378622531891, -0.05473608896136284, 0.08714032918214798, -0.014583407901227474, -0.150366872549057, 0.017733458429574966, 0.043394338339567184, -0.12260226160287857, 0.11910516023635864, 0.05462685227394104, 0.07063519209623337, 0.014929565601050854, -0.07541623711585999, -0.1631229966878891, 0.03031250834465027, 0.01425902172923088, -0.0594632662832737, 0.04757995903491974, 0.059961482882499695, -0.10165371745824814, 0.10819483548402786, 0.09530027210712433, -0.013078106567263603, 0.06798283755779266, -0.16849711537361145, -0.020869607105851173, -0.01446688175201416, 0.009899779222905636, 0.05550243332982063, 0.09964893013238907, -0.03413357585668564, 0.10497362166643143, -0.09214533120393753, 0.11017382889986038, 0.10932035744190216, -0.32057443261146545, -0.005767723545432091, 0.09167823940515518, 0.039358653128147125, 0.07352814823389053, -0.04467793554067612, 0.06258884817361832, 0.018015462905168533, 0.017986174672842026, -0.014015024527907372, -0.07283061742782593, -0.11612214148044586, 0.04717336222529411, -0.08668071031570435, -0.059868961572647095, 0.2244078367948532, -0.05464440956711769, 0.06881742179393768, -0.05281897634267807, -0.10522868484258652, -0.04308144748210907, -0.029833965003490448, 0.00475557055324316, -0.07660607248544693, 0.08692064881324768, 0.00869679357856512, -0.09547875821590424, -0.1376667022705078, -0.02496783249080181, -0.1776352822780609, 0.16140350699424744, 0.02465328387916088, 0.05232657864689827, -0.2027255892753601, 0.09623090922832489, 0.017906051129102707, -0.08045592904090881, 0.022091427817940712, -0.10046248883008957, 0.029131146147847176, 0.013760408386588097, -0.04754498973488808, -0.061387211084365845, 0.0843690037727356, 0.11199145019054413, -0.01731434464454651, 0.025486016646027565, -0.039331406354904175, 0.08100687712430954, 0.03553595021367073, 0.09077847748994827, 0.007288969587534666, -0.028338588774204254, 0.025842782109975815, -0.13719046115875244, -0.003647835226729512, -0.07116208970546722, -0.16572439670562744, -0.021088803187012672, 0.02994808368384838, 0.08289173990488052, 0.015449047088623047, 0.11682453751564026, -0.03272046521306038, -0.025152435526251793, 0.03602350503206253, -0.047656361013650894, -0.012649794109165668, 0.016648368909955025, 0.013163427822291851, 0.12399329990148544, -0.0022096503525972366, 0.03235051408410072, -0.13653022050857544, 0.031423524022102356, -0.06793295592069626, -0.003740974934771657, -0.03486552834510803, -0.040637075901031494, 0.009043924510478973, -0.06862333416938782, 0.003486064961180091, -0.15030112862586975, -0.15063877403736115, 0.007587034720927477, -0.007836631499230862, -0.04107699543237686, -0.06370922178030014, -0.06952770054340363, -0.013550350442528725, 0.04251532256603241, -0.07093454152345657, -0.011352915316820145, -0.06403283774852753, 0.11004766076803207, -0.03197755664587021, 0.07921615242958069, -0.11953279376029968, 0.08390819281339645, -0.11260783672332764, -0.02386913076043129, -0.060801517218351364, 0.09317506104707718, -0.0006014376995153725, 0.09549830108880997, -0.006563255097717047, -0.017931854352355003, -0.07981178909540176, 
0.06445012241601944, -0.042872510850429535, 0.21701598167419434, -0.0615808479487896, -0.11181682348251343, 0.28781595826148987, -0.052628401666879654, -0.1370542049407959, 0.11647392809391022, 0.008682746440172195, 0.05777018144726753, 0.10703510791063309, 0.19733482599258423, -0.015276194550096989, 0.004040541127324104, 0.09471915662288666, 0.11263324320316315, -0.11276852339506149, -0.033160366117954254, 0.013019153848290443, -0.04081077128648758, -0.10867965966463089, 0.04689536616206169, 0.09810488671064377, 0.07090286910533905, -0.04786505550146103, -0.03377414867281914, -0.01366397924721241, 0.0052589005790650845, 0.08885077387094498, -0.007157256826758385, 0.10962837189435959, -0.05819983780384064, -0.03796621412038803, -0.029282379895448685, -0.012126247398555279, -0.03951939567923546, 0.03137664496898651, -0.043376367539167404, 0.10821941494941711, -0.011204327456653118, 0.06364280730485916, -0.16185984015464783, -0.07691477984189987, -0.017002692446112633, 0.1581239402294159, 0.024538565427064896, 0.09859629720449448, 0.0552486926317215, -0.040398042649030685, -0.0012767292791977525, 0.012792680412530899, 0.15581141412258148, -0.022091681137681007, -0.065607450902462, -0.052166227251291275, 0.08642971515655518, -0.05641226842999458, 0.04504093527793884, -0.05937713757157326, 0.012367865070700645, 0.05064384639263153, 0.10342344641685486, -0.00018274025933351368, 0.03323284164071083, -0.008164864964783192, 0.002145637758076191, -0.058205123990774155, 0.007405933458358049, 0.10799351334571838, 0.00036868182360194623, -0.07365862280130386, 0.22074243426322937, -0.17796069383621216, 0.1765957772731781, 0.1893044263124466, -0.299345999956131, 0.017949223518371582, -0.10759581625461578, -0.04561871662735939, 0.014407722279429436, 0.05567655712366104, -0.0454222597181797, 0.1703362911939621, -0.009871348738670349, 0.18874616920948029, -0.04946064203977585, -0.04464937001466751, -0.0200483538210392, -0.05118836089968681, -0.0024189651012420654, 0.07781197130680084, 0.10685696452856064, -0.13992026448249817, 0.1964332014322281, 0.1621224284172058, 0.048237916082143784, 0.19945049285888672, 0.015346456319093704, -0.011589210480451584, 0.0909530371427536, 0.005220826715230942, -0.058739423751831055, -0.07409929484128952, -0.2594851851463318, -0.030033592134714127, 0.07992640137672424, 0.0422382652759552, 0.1212305948138237, -0.11349532753229141, -0.038956157863140106, -0.01763172075152397, -0.023146281018853188, 0.021672505885362625, 0.0914369598031044, 0.06075398623943329, 0.13201528787612915, -0.001710098935291171, -0.007300339173525572, 0.10524573177099228, 0.01783694699406624, -0.09354141354560852, 0.18308524787425995, -0.13652534782886505, -0.37097251415252686, -0.13911493122577667, -0.18057456612586975, -0.05449081212282181, 0.05712554603815079, 0.11679314076900482, -0.12011238187551498, -0.018752124160528183, 0.01578843593597412, 0.10931742936372757, -0.08449502289295197, 0.0021454424131661654, -0.06880278885364532, 0.0321490578353405, -0.10310184955596924, -0.09194442629814148, -0.055416494607925415, -0.031392451375722885, -0.08001253753900528, 0.1423761546611786, -0.10777941346168518, 0.04476889222860336, 0.20262959599494934, 0.04653622955083847, 0.05625178664922714, -0.044105201959609985, 0.19377262890338898, -0.11264272034168243, -0.01661740615963936, 0.19215328991413116, -0.048360925167798996, 0.07476246356964111, 0.1232115849852562, -0.006348740309476852, -0.08765771239995956, 0.03011748194694519, -0.02085109055042267, -0.07988511025905609, -0.23219464719295502, 
-0.13938382267951965, -0.12429051846265793, 0.09477275609970093, 0.028005298227071762, 0.056365787982940674, 0.17219258844852448, 0.06577219814062119, -0.038416244089603424, 0.006410336587578058, 0.02959546446800232, 0.08237514644861221, 0.23417828977108002, -0.06035616248846054, 0.1364797055721283, -0.03420931473374367, -0.14982740581035614, 0.08169995993375778, 0.0713929831981659, 0.10213395953178406, 0.06678459793329239, 0.0804823637008667, 0.0149586396291852, 0.06188136339187622, 0.1311223804950714, 0.08191446959972382, 0.019586285576224327, -0.02480296604335308, -0.03388110175728798, -0.025523077696561813, -0.05937909707427025, 0.040128443390131, 0.06589099019765854, -0.16763372719287872, -0.039227183908224106, -0.09338314831256866, 0.09657008945941925, 0.0873042419552803, 0.06609832495450974, -0.1842060089111328, -0.008006223477423191, 0.08488986641168594, -0.03854905813932419, -0.13727426528930664, 0.09535189718008041, 0.01523482333868742, -0.15144726634025574, 0.03139317408204079, -0.04061909019947052, 0.12188644707202911, -0.07804752141237259, 0.09809603542089462, -0.08108244836330414, -0.07448557764291763, 0.02123199962079525, 0.1261177361011505, -0.30527687072753906, 0.20240111649036407, -0.0024993624538183212, -0.06486981362104416, -0.1243603527545929, -0.0032166161108762026, 0.002410882618278265, 0.07357452809810638, 0.10519039630889893, -0.007196315098553896, 0.001897757756523788, -0.06300821900367737, -0.01829923689365387, 0.032471053302288055, 0.13080233335494995, -0.0401318334043026, -0.021158374845981598, -0.050194524228572845, -0.001653497340157628, -0.03173094615340233, -0.06934895366430283, 0.02002747356891632, -0.19509181380271912, 0.08751901984214783, 0.04166261479258537, 0.09648149460554123, 0.029994789510965347, 0.004265148192644119, -0.09651939570903778, 0.24698667228221893, -0.07148019969463348, -0.10072879493236542, -0.10919588059186935, -0.046813901513814926, 0.03569883480668068, -0.05628936365246773, 0.04309194162487984, -0.0788632407784462, 0.028997479006648064, -0.06352769583463669, -0.19235502183437347, 0.12410202622413635, -0.09027006477117538, -0.04412810131907463, -0.02371402643620968, 0.2110891044139862, -0.05598580464720726, 0.010335659608244896, 0.02930437959730625, 0.01208863127976656, -0.11645778268575668, -0.09678568691015244, 0.031018631532788277, -0.007351789623498917, 0.050603240728378296, 0.041841957718133926, -0.05915454775094986, -0.017138581722974777, -0.052199993282556534, -0.022926922887563705, 0.3496883809566498, 0.14231905341148376, -0.043836336582899094, 0.19347235560417175, 0.12347975373268127, -0.07452994585037231, -0.3159443140029907, -0.1066238060593605, -0.10937739163637161, -0.04680149629712105, -0.07012093812227249, -0.2002030611038208, 0.06474938243627548, 0.00662544509395957, -0.013415241613984108, 0.12749312818050385, -0.2561831772327423, -0.07571036368608475, 0.15906259417533875, -0.017980827018618584, 0.3745945692062378, -0.1168576180934906, -0.10926306992769241, -0.03950892388820648, -0.14175476133823395, 0.16968177258968353, -0.01989765651524067, 0.11221715062856674, -0.009765521623194218, 0.14388824999332428, 0.05548359826207161, -0.023479344323277473, 0.08544106781482697, 0.004999885335564613, -0.03290518373250961, -0.10304180532693863, -0.05676887184381485, 0.007092386484146118, 0.02477436140179634, 0.018026655539870262, -0.041834570467472076, 0.02227151393890381, -0.11731979995965958, -0.04657655209302902, -0.08982590585947037, 0.04431166127324104, 0.03899754583835602, -0.07325074821710587, -0.002380647463724017, 
-0.07165111601352692, -0.012272949330508709, 0.022334342822432518, 0.20356793701648712, -0.08029330521821976, 0.16448934376239777, 0.09239562600851059, 0.12419285625219345, -0.14376309514045715, -0.00019283240544609725, -0.0762530043721199, -0.05611240118741989, 0.07737895101308823, -0.09433035552501678, 0.058893077075481415, 0.10901971161365509, -0.04567738622426987, 0.08828683942556381, 0.10377411544322968, 0.008936077356338501, 0.003213887568563223, 0.10916902124881744, -0.2667325437068939, -0.0296600554138422, -0.07532413303852081, 0.000883326749317348, 0.09092561900615692, 0.08562852442264557, 0.18840822577476501, 0.025361526757478714, -0.04293036088347435, -0.002770674182102084, 0.028597986325621605, -0.039021048694849014, 0.051667019724845886, 0.001123449532315135, 0.01947369985282421, -0.1530752182006836, 0.072522833943367, 0.01490565575659275, -0.15215420722961426, 0.021316176280379295, 0.16572684049606323, -0.11656328290700912, -0.1283872276544571, -0.06520111113786697, 0.08313824236392975, -0.11755692958831787, -0.01578943058848381, -0.03279297426342964, -0.13145680725574493, 0.07992171496152878, 0.12629036605358124, 0.05557859688997269, 0.0972496047616005, -0.06061713397502899, -0.020469192415475845, -0.018721895292401314, -0.014099318534135818, -0.012384648434817791, -0.007667020428925753, -0.055978111922740936, 0.0590752474963665, -0.026677248999476433, 0.1425808072090149, -0.09221141785383224, -0.1037059873342514, -0.16142144799232483, 0.0374140702188015, -0.11013076454401016, -0.08825794607400894, -0.08821134269237518, -0.050188567489385605, 0.002360827289521694, -0.019856395199894905, -0.04037635400891304, -0.05829505994915962, -0.12300454825162888, 0.0338277705013752, -0.040771447122097015, 0.024727050215005875, -0.07512269169092178, 0.015856385231018066, 0.08507686108350754, -0.03285100311040878, 0.15655414760112762, 0.1450488418340683, -0.1006515845656395, 0.10741901397705078, -0.14806775748729706, -0.09138492494821548, 0.11116421222686768, 0.015329592861235142, 0.0449691042304039, 0.09723787009716034, 0.013362943194806576, 0.0635865181684494, 0.032776717096567154, 0.05308786407113075, 0.027619892731308937, -0.11959987878799438, 0.06483134627342224, -0.03626115620136261, -0.14700546860694885, -0.049338050186634064, -0.05282869189977646, 0.01647452637553215, 0.013054544106125832, 0.09622690081596375, -0.05301849544048309, 0.10698331147432327, -0.04055701196193695, 0.0346808135509491, 0.017554637044668198, -0.1730053424835205, -0.03816922754049301, -0.08538098633289337, 0.03681723028421402, 0.014741539023816586, 0.25266793370246887, 0.030072299763560295, 0.012416383251547813, 0.032671261578798294, 0.08285367488861084, 0.03899408504366875, 0.010228337720036507, 0.17482228577136993, 0.1162426546216011, -0.06621865928173065, -0.10445023328065872, 0.0729617029428482, 0.016332454979419708, 0.01286179106682539, 0.13617953658103943, 0.008365051820874214, 0.005795429926365614, 0.08649782836437225, -0.016865963116288185, 0.009968153201043606, -0.10052056610584259, -0.13426925241947174, -0.022176474332809448, 0.05151832848787308, -0.04655967652797699, 0.11727844923734665, 0.1406494379043579, -0.01806013658642769, 0.03222079202532768, -0.021771740168333054, -0.05699979141354561, -0.1683429479598999, -0.1429590880870819, -0.06883849948644638, -0.13416796922683716, 0.00897989235818386, -0.11180389672517776, 0.05395037308335304, 0.06001098081469536, 0.06750501692295074, -0.06899319589138031, 0.10220931470394135, 0.04626858979463577, -0.11440542340278625, 0.06264589726924896, 
-0.0296088308095932, 0.09430401772260666, -0.02759445086121559, -0.019505485892295837, -0.09039592742919922, 0.014574515633285046, 0.011419114656746387, 0.06245238706469536, -0.04707273095846176, 0.007463190704584122, -0.14696238934993744, -0.08972041308879852, -0.0523175448179245, 0.0718572810292244, -0.050409089773893356, 0.14282815158367157, 0.00775480642914772, -0.0170906875282526, 0.039554283022880554, 0.22787313163280487, -0.07476283609867096, -0.04778539761900902, -0.05269690603017807, 0.20717895030975342, 0.02975541539490223, 0.1171872541308403, -0.022938819602131844, -0.006106364540755749, -0.0919521227478981, 0.3764844834804535, 0.30030161142349243, -0.09031439572572708, 0.011794124729931355, 0.02137952297925949, 0.04502861574292183, 0.1316293478012085, 0.1216534823179245, 0.10318691283464432, 0.3006802201271057, -0.07452366501092911, -0.04653361067175865, -0.012629742734134197, -0.023858042433857918, -0.09059546142816544, 0.1021224707365036, 0.04839762672781944, -0.06382183730602264, -0.03313443064689636, 0.0954432487487793, -0.25862133502960205, 0.1277991235256195, -0.12311873584985733, -0.17578600347042084, -0.06654827296733856, 0.009760108776390553, 0.10465722531080246, 0.015642458572983742, 0.0946015790104866, 0.007128213066607714, -0.11252258718013763, 0.06305865943431854, 0.03397420793771744, -0.22762253880500793, 0.0006893770187161863, 0.06642123311758041, -0.07006710022687912, -0.0024247700348496437, -0.026499588042497635, 0.05657242611050606, 0.0656052976846695, 0.054629553109407425, -0.00971333310008049, 0.03816632181406021, 0.0034184439573436975, -0.0585215799510479, 0.016623929142951965, 0.05121519789099693, 0.02472509816288948, -0.09763528406620026, 0.06927435845136642, -0.1574270874261856, 0.04766253009438515, -0.0030655991286039352, -0.04124255105853081, 0.006064958870410919, 0.008823691867291927, -0.06491616368293762, 0.05165379121899605, 0.07916834205389023, -0.0016257909592241049, -0.0062433634884655476, -0.057178743183612823, -0.02632102556526661, -0.027755750343203545, -0.09291748702526093, -0.10495562851428986, -0.14682936668395996, -0.11640441417694092, 0.09368976950645447, -0.01011267676949501, -0.1848134547472, 0.022154374048113823, -0.08606051653623581, 0.08319322764873505, -0.1670055389404297, 0.08040720224380493, 0.07041648775339127, 0.013038921169936657, -0.0031511052511632442, -0.02002427540719509, 0.054132770746946335, 0.086809903383255, -0.10407156497240067, -0.07400695979595184 ]
null
null
transformers
# Harry Potter DialoGPT Model
{"tags": ["conversational"]}
text-generation
Piumi/DialogGPT-small-harrypotter
[ "transformers", "pytorch", "gpt2", "text-generation", "conversational", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Harry Potter DialoGPT Model
[ "# Harry Potter DialoGPT Model" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Harry Potter DialoGPT Model" ]
[ 51, 8 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Harry Potter DialoGPT Model" ]
[ -0.0009023238671943545, 0.07815738022327423, -0.006546166725456715, 0.07792752981185913, 0.10655936598777771, 0.048972971737384796, 0.17639793455600739, 0.12185695022344589, 0.016568755730986595, -0.04774167761206627, 0.11647630482912064, 0.2130284160375595, -0.002118367003276944, 0.024608047679066658, -0.05022026598453522, -0.3065771162509918, 0.0474756620824337, 0.014356585219502449, -0.07174845039844513, 0.11724270135164261, 0.09064973145723343, -0.046179238706827164, 0.08330509811639786, -0.009135239757597446, -0.13198648393154144, -0.039482954889535904, 0.019292812794446945, -0.11745545268058777, 0.1662212759256363, 0.05298272892832756, 0.02469746209681034, -0.008447164669632912, -0.06598151475191116, -0.15036040544509888, 0.037190426141023636, -0.027472136542201042, -0.01080626156181097, 0.05462246760725975, 0.023526115342974663, -0.07521048933267593, 0.170567125082016, 0.17678891122341156, 0.0833497866988182, 0.0349111407995224, -0.14917024970054626, -0.045548245310783386, 0.008950977586209774, 0.05421316996216774, -0.017893504351377487, 0.09349167346954346, -0.019903047010302544, 0.11801653355360031, -0.04491448402404785, 0.09210366010665894, 0.15255063772201538, -0.4016275703907013, -0.027563704177737236, 0.08920855820178986, 0.05989706888794899, 0.12076901644468307, -0.10560955852270126, 0.03972794860601425, -0.0039703017100691795, 0.01236654631793499, -0.014540530741214752, -0.08304883539676666, -0.07308239489793777, 0.032504837960004807, -0.1272556483745575, 0.008525865152478218, 0.23756256699562073, -0.10643257945775986, 0.037069112062454224, -0.09791990369558334, -0.07414398342370987, 0.048336777836084366, -0.053761593997478485, -0.081727035343647, -0.054839808493852615, 0.06347949057817459, 0.004366500303149223, -0.06301609426736832, -0.08326146006584167, -0.0006536149303428829, -0.12781435251235962, 0.17595994472503662, 0.061243366450071335, 0.041611745953559875, -0.21322020888328552, 0.08940251916646957, 0.04477722570300102, -0.04711297154426575, 0.007116159424185753, -0.11796226352453232, 0.04023287072777748, 0.005483259446918964, -0.03256071358919144, -0.021854614838957787, 0.0393419973552227, 0.13909944891929626, -0.01777748204767704, 0.03252175822854042, 0.006831915583461523, 0.05811219662427902, 0.08162496984004974, 0.02222144603729248, 0.019291909411549568, -0.0818009302020073, 0.019385190680623055, -0.08128736168146133, -0.0030400939285755157, -0.048940129578113556, -0.17071883380413055, -0.07477642595767975, 0.052610911428928375, 0.020047198981046677, 0.03746970370411873, 0.08054786175489426, -0.0017944995779544115, -0.05560554191470146, 0.03284840285778046, 0.01671096310019493, -0.020622212439775467, -0.010361049324274063, -0.02412462793290615, 0.19123271107673645, 0.019619356840848923, 0.014111656695604324, -0.12379156798124313, 0.10023640841245651, -0.08179095387458801, 0.0037731381598860025, 0.02743307314813137, -0.04204464703798294, -0.004716555587947369, 0.02917117439210415, 0.023101668804883957, -0.1252521574497223, -0.1099385917186737, -0.0030569476075470448, -0.012054097838699818, -0.036421261727809906, -0.10490952432155609, -0.08483029156923294, -0.012153145857155323, 0.0449371263384819, -0.013397793285548687, 0.007936403155326843, -0.05143149942159653, 0.0985720232129097, -0.0514979362487793, 0.09873400628566742, -0.08342572301626205, 0.06359215080738068, -0.09124887734651566, -0.061886150389909744, -0.11452563107013702, 0.05216052383184433, 0.012905281968414783, 0.066250741481781, 0.016998225823044777, -0.044836658984422684, -0.014836243353784084, 
0.05253177136182785, -0.07656687498092651, 0.1940697431564331, -0.041674621403217316, -0.12459053844213486, 0.24146439135074615, -0.09138800948858261, -0.1802034229040146, 0.12973085045814514, -0.022254703566432, 0.08523941785097122, 0.12802475690841675, 0.20380465686321259, -0.00019822151807602495, -0.01302915159612894, 0.07281201332807541, 0.07031642645597458, -0.09803894907236099, 0.06239739805459976, 0.029653839766979218, -0.008071083575487137, -0.08906278014183044, 0.05762826278805733, 0.046033453196287155, -0.010650773532688618, -0.035073768347501755, -0.001896020956337452, -0.012895751744508743, -0.022185025736689568, 0.14126582443714142, -0.02006692811846733, 0.1300428807735443, -0.06926563382148743, -0.03515486419200897, -0.009500149637460709, 0.03533667325973511, -0.04091939330101013, 0.08151165395975113, -0.0436173714697361, 0.10586477071046829, 0.09034156054258347, 0.053724925965070724, -0.13120363652706146, 0.00466286763548851, -0.015246815048158169, 0.17014820873737335, 0.08964069187641144, 0.05222717300057411, 0.06265474855899811, -0.0020888058934360743, -0.06708643585443497, 0.045407816767692566, 0.13778303563594818, -0.037020038813352585, -0.12218865007162094, -0.1755627691745758, 0.051157694309949875, -0.045444171875715256, 0.10855234414339066, -0.10010123997926712, 0.022670533508062363, -0.055906031280756, 0.07772238552570343, -0.024998966604471207, 0.020512236282229424, -0.0013405600329861045, -0.021700702607631683, -0.08356887847185135, -0.002377772703766823, 0.08597290515899658, -0.02048647589981556, -0.06707409024238586, 0.16556480526924133, -0.16400809586048126, 0.1631954461336136, 0.2116095870733261, -0.28542569279670715, -0.005696662236005068, -0.15163889527320862, -0.0208092350512743, 0.019645055755972862, 0.07834604382514954, 0.026225795969367027, 0.2044338881969452, -0.012928472831845284, 0.16565458476543427, -0.05699567869305611, -0.07730039209127426, -0.06881127506494522, -0.048101142048835754, 0.013522743247449398, 0.09095205366611481, 0.04542696103453636, -0.11962861567735672, 0.13119758665561676, 0.1054433062672615, 0.06484298408031464, 0.12711186707019806, 0.1030748188495636, -0.008113685995340347, 0.07252490520477295, -0.03624548763036728, -0.03462279960513115, -0.09254947304725647, -0.30446043610572815, -0.04840317741036415, 0.0939924493432045, 0.007963384501636028, 0.09285714477300644, -0.0919896736741066, -0.03311870992183685, 0.006042704917490482, 0.009473444893956184, 0.028337622061371803, 0.09653715789318085, 0.013490920886397362, 0.15320514142513275, -0.008011690340936184, -0.03430786728858948, 0.05891305208206177, 0.017982570454478264, -0.09147711098194122, 0.17280617356300354, -0.17050009965896606, -0.27190929651260376, -0.06990014761686325, -0.21745692193508148, -0.013139115646481514, 0.05258983001112938, 0.0786920040845871, -0.11818131804466248, -0.018352627754211426, -0.006239492911845446, 0.05685517191886902, -0.2425733357667923, 0.0004911290016025305, -0.1354890614748001, 0.0501418262720108, -0.1974833607673645, -0.09718500077724457, -0.02271542325615883, -0.013450481928884983, -0.0464281290769577, 0.13365240395069122, -0.1448695808649063, -0.011572926305234432, 0.2329535037279129, 0.032479673624038696, 0.027794739231467247, -0.05020907148718834, 0.19788463413715363, -0.0958966314792633, -0.023973820731043816, 0.11024576425552368, -0.05038975924253464, 0.04834126681089401, 0.06649978458881378, -0.012981836684048176, -0.08557141572237015, 0.023789849132299423, -0.068336620926857, -0.03150583803653717, -0.27926525473594666, 
-0.0930178239941597, -0.09319330751895905, 0.11305391043424606, 0.04079577326774597, 0.06421639025211334, 0.16545771062374115, 0.05191578343510628, -0.024325082078576088, -0.03006586618721485, 0.11609793454408646, 0.12905290722846985, 0.2277202159166336, -0.06067761778831482, 0.10221996158361435, 0.009445492178201675, -0.08203992247581482, 0.06062209978699684, 0.056782789528369904, 0.06324724853038788, 0.02584579586982727, 0.03694582358002663, -0.030939655378460884, 0.1121687963604927, 0.12571842968463898, 0.05258069559931755, 0.0481170229613781, 0.0002127334737451747, -0.0561506561934948, -0.008168719708919525, -0.05726633965969086, 0.06774696707725525, 0.061340972781181335, -0.12918008863925934, -0.08061543852090836, 0.0011613310780376196, 0.06660808622837067, -0.016230419278144836, 0.06823775917291641, -0.13560809195041656, -0.03582429885864258, 0.0790911465883255, -0.07693151384592056, -0.14156894385814667, 0.11972879618406296, -0.026570770889520645, -0.19904157519340515, 0.05265914276242256, 0.007704653777182102, 0.0908159390091896, -0.06360849738121033, 0.05343840271234512, -0.13023801147937775, -0.12935101985931396, -0.018437571823596954, 0.07945099472999573, -0.3450873792171478, 0.13536721467971802, -0.013286802917718887, -0.02876877970993519, -0.06474969536066055, -0.02640824392437935, 0.013905409723520279, 0.12719078361988068, 0.08667250722646713, 0.0008821099763736129, 0.0991629809141159, 0.03823768347501755, 0.04188435152173042, -0.002011700300499797, 0.10950417071580887, 0.0050011589191854, 0.004797275178134441, -0.04982118681073189, 0.007274609990417957, -0.05164213851094246, -0.07472953200340271, 0.08393982797861099, -0.20678792893886566, 0.09087453782558441, -0.03378438204526901, 0.08427679538726807, 0.04304937273263931, -0.018965769559144974, -0.1001204177737236, 0.19745583832263947, -0.012206900864839554, -0.11405988782644272, -0.07517550885677338, -0.02810264565050602, 0.09103139489889145, -0.013817726634442806, 0.012886416167020798, -0.045470476150512695, 0.032183047384023666, -0.1263762265443802, -0.1597503274679184, 0.08734500408172607, -0.04441224783658981, -0.10894393920898438, -0.025462759658694267, 0.20382575690746307, -0.007266622502356768, 0.08242089301347733, 0.01605331338942051, 0.010653935372829437, -0.18066231906414032, -0.04018142446875572, 0.02645772136747837, -0.0016437612939625978, 0.005979063920676708, 0.047698814421892166, 0.019091911613941193, 0.06207629665732384, -0.1069745197892189, -0.013920160941779613, 0.3158324360847473, 0.15978319942951202, -0.00912671908736229, 0.14943915605545044, 0.1093616932630539, -0.08669080585241318, -0.17238758504390717, -0.1171615794301033, -0.1210922971367836, -0.08425768464803696, -0.10681738704442978, -0.1525043100118637, 0.09535340964794159, -0.03392014652490616, 0.03498011827468872, 0.14615866541862488, -0.280263751745224, -0.10949636250734329, 0.13820378482341766, 0.010744688101112843, 0.3510635495185852, -0.12303631007671356, -0.044944874942302704, -0.06214528530836105, -0.16933435201644897, 0.08021392673254013, -0.031203703954815865, 0.11581093072891235, -0.0744495838880539, 0.19395925104618073, 0.01719796098768711, 0.014287159778177738, 0.0916559100151062, 0.05038322135806084, -0.05808406323194504, -0.07368700206279755, -0.10248131304979324, 0.010812131687998772, 0.03546109423041344, 0.010252019390463829, -0.008802837692201138, 0.0211968794465065, -0.11341743916273117, -0.050869911909103394, -0.06302189081907272, 0.0072614275850355625, -0.01001308299601078, -0.042155615985393524, -0.05533592775464058, 
-0.022557416930794716, -0.020093943923711777, 0.02266426384449005, 0.14185629785060883, -0.07527699321508408, 0.18586260080337524, 0.02357078716158867, 0.1586609035730362, -0.11956068128347397, -0.06724818795919418, -0.029193658381700516, -0.05280323326587677, 0.06468886137008667, -0.08884575963020325, -0.027708567678928375, 0.1332162618637085, -0.01903904788196087, 0.04655366763472557, 0.12936700880527496, 0.02046884410083294, 0.015383756719529629, 0.034968774765729904, -0.2578005790710449, -0.07463036477565765, -0.03505445644259453, -0.012416874058544636, 0.05272092670202255, 0.05525677278637886, 0.19735674560070038, -0.03551921248435974, -0.08521962910890579, 0.020131373777985573, 0.02735883742570877, -0.02776256389915943, 0.10749414563179016, 0.019579345360398293, -0.004837906453758478, -0.16151933372020721, 0.08257976174354553, -0.005964108742773533, -0.08297000825405121, 0.028665626421570778, 0.2024049311876297, -0.12141239643096924, -0.10309756547212601, -0.06804922968149185, 0.07315051555633545, -0.09220825880765915, 0.016043387353420258, -0.005091092549264431, -0.1521538347005844, 0.06916408240795135, 0.07598215341567993, 0.04075418785214424, 0.06513199955224991, -0.11743064224720001, -0.015730571001768112, -0.04170290008187294, -0.002195435343310237, 0.03521120920777321, 0.01863143965601921, -0.057492829859256744, 0.15846455097198486, -0.0676199421286583, 0.08538917452096939, -0.0744810476899147, -0.1058846190571785, -0.1395980566740036, 0.04660497233271599, -0.08038312196731567, -0.07247276604175568, -0.12832807004451752, -0.052204377949237823, -0.0067099276930093765, -0.03388519585132599, 0.006552806124091148, -0.06627799570560455, -0.10922821611166, 0.01822470687329769, -0.00743203004822135, -0.009385870769619942, -0.06096754968166351, 0.026706209406256676, 0.06246216222643852, -0.039788868278265, 0.15730851888656616, 0.22509248554706573, -0.13591648638248444, 0.11564400047063828, -0.09797432273626328, -0.105463907122612, 0.046008042991161346, 0.009427277371287346, 0.03594303876161575, 0.0503489226102829, -0.03594081476330757, 0.0044484552927315235, 0.03905477747321129, 0.08074651658535004, 0.08456914126873016, -0.06776505708694458, 0.020801106467843056, -0.05122765153646469, -0.14904099702835083, -0.016655439510941505, -0.0464773029088974, 0.06876829266548157, -0.006725262850522995, 0.11020535975694656, -0.0515950471162796, 0.07739507406949997, -0.07558431476354599, 0.050614211708307266, 0.021146971732378006, -0.14688286185264587, -0.006612539757043123, -0.07093682140111923, 0.042144812643527985, -0.008834975771605968, 0.20241086184978485, -0.03228091076016426, 0.010342049412429333, 0.033811055123806, 0.06203942745923996, -0.01957780309021473, 0.009357001632452011, 0.2014283686876297, 0.12640917301177979, -0.08496357500553131, -0.02679651789367199, 0.06793134659528732, 0.07248228788375854, 0.07093550264835358, 0.10807815194129944, -0.015352966263890266, 0.028434239327907562, 0.07829629629850388, -0.060215238481760025, 0.07576877623796463, -0.08603982627391815, -0.11668483167886734, 0.05793621391057968, 0.012955795042216778, -0.055695828050374985, 0.20305177569389343, 0.19142870604991913, -0.026278704404830933, 0.018410727381706238, -0.0029499190859496593, -0.10117456316947937, -0.15619947016239166, -0.05423750728368759, -0.07170962542295456, -0.1319410353899002, -0.004549739416688681, -0.16646917164325714, 0.022016216069459915, -0.01132756657898426, 0.09506805986166, -0.06855440139770508, -0.01345991250127554, 0.1364889293909073, -0.1055467277765274, 0.0847758799791336, 
-0.024517204612493515, 0.07877567410469055, -0.03746940940618515, -0.018209461122751236, -0.10342709720134735, 0.007514837197959423, 0.01131442841142416, 0.06840907037258148, -0.10897937417030334, 0.02432350255548954, -0.12208317965269089, -0.08617185056209564, -0.026142612099647522, 0.09279687702655792, -0.0403008833527565, 0.15116846561431885, 0.02645145356655121, -0.06710928678512573, -0.004313822835683823, 0.2646709978580475, -0.08046227693557739, -0.08319197595119476, -0.030799202620983124, 0.2152107208967209, 0.04053696244955063, 0.06396269053220749, 0.019140036776661873, 0.038027774542570114, -0.07184682041406631, 0.2957373559474945, 0.34401440620422363, -0.1318037211894989, -0.007773484103381634, 0.04225075617432594, 0.04406323283910751, 0.14687567949295044, 0.07998795062303543, 0.11360671371221542, 0.2849363386631012, -0.09197647124528885, 0.016657205298542976, -0.04230864346027374, -0.01424806285649538, -0.06908884644508362, 0.045314885675907135, 0.08216670155525208, -0.09241747111082077, -0.022950593382120132, 0.08125471323728561, -0.29741767048835754, 0.10791494697332382, -0.15600289404392242, -0.14948409795761108, -0.05027429759502411, -0.008771711029112339, 0.014683255925774574, 0.019041186198592186, 0.09663030505180359, 0.025651484727859497, -0.07275258749723434, 0.07816889137029648, 0.024486342445015907, -0.23020237684249878, -0.01345184724777937, 0.1456068754196167, -0.06789913028478622, -0.025938833132386208, -0.021313713863492012, 0.051610056310892105, 0.05763651058077812, 0.09027529507875443, -0.03809558227658272, -0.0746568813920021, -0.007141788024455309, -0.022818787023425102, 0.01914946548640728, 0.0597183033823967, 0.06841408461332321, -0.0920223817229271, 0.1167774423956871, -0.07350476831197739, 0.0650370642542839, 0.037623800337314606, -0.022277191281318665, 0.0018526542698964477, 0.013183658011257648, -0.06512464582920074, 0.05533479526638985, 0.1295643299818039, -0.025459708645939827, -0.002524374984204769, -0.028180841356515884, -0.0767761766910553, -0.024015206843614578, -0.04643676429986954, -0.09101243317127228, -0.18130090832710266, -0.12738600373268127, 0.041754670441150665, -0.03240608796477318, -0.2046082615852356, 0.0060346988029778, -0.1128578633069992, 0.03700976446270943, -0.14154092967510223, 0.10004086047410965, 0.07216610759496689, 0.004716616589576006, 0.006774604320526123, 0.0675399899482727, 0.045677728950977325, 0.14796748757362366, -0.16543124616146088, -0.04919974133372307 ]
null
null
transformers
# RoBERTa base trained with Spanish Legal Domain Corpora ## Table of contents <details> <summary>Click to expand</summary> - [Overview](#overview) - [Model description](#model-description) - [Intended uses and limitations](#intended-uses-and-limitations) - [How to use](#how-to-use) - [Limitations and bias](#limitations-and-bias) - [Training](#training) - [Training data](#training-data) - [Training procedure](#training-procedure) - [Evaluation](#evaluation) - [Additional information](#additional-information) - [Author](#author) - [Contact information](#contact-information) - [Copyright](#copyright) - [Licensing information](#licensing-information) - [Funding](#funding) - [Citation Information](#citation-information) - [Disclaimer](#disclaimer) </details> ## Overview - **Architecture:** roberta-base - **Language:** Spanish - **Task:** fill-mask - **Data:** Legal ## Model description The **RoBERTalex** is a transformer-based masked language model for the Spanish language. It is based on the [RoBERTa](https://arxiv.org/abs/1907.11692) base model and has been pre-trained using a large [Spanish Legal Domain Corpora](https://zenodo.org/record/5495529), with a total of 8.9GB of text. ## Intended uses and limitations The **RoBERTalex** model is ready-to-use only for masked language modeling to perform the Fill Mask task (try the inference API or read the next section). However, it is intended to be fine-tuned on non-generative downstream tasks such as Question Answering, Text Classification, or Named Entity Recognition. You can use the raw model for fill mask or fine-tune it to a downstream task. ## How to use Here is how to use this model: ```python >>> from transformers import pipeline >>> from pprint import pprint >>> unmasker = pipeline('fill-mask', model='PlanTL-GOB-ES/RoBERTalex') >>> pprint(unmasker("La ley fue <mask> finalmente.")) [{'score': 0.21217258274555206, 'sequence': ' La ley fue modificada finalmente.', 'token': 5781, 'token_str': ' modificada'}, {'score': 0.20414969325065613, 'sequence': ' La ley fue derogada finalmente.', 'token': 15951, 'token_str': ' derogada'}, {'score': 0.19272951781749725, 'sequence': ' La ley fue aprobada finalmente.', 'token': 5534, 'token_str': ' aprobada'}, {'score': 0.061143241822719574, 'sequence': ' La ley fue revisada finalmente.', 'token': 14192, 'token_str': ' revisada'}, {'score': 0.041809432208538055, 'sequence': ' La ley fue aplicada finalmente.', 'token': 12208, 'token_str': ' aplicada'}] ``` Here is how to use this model to get the features of a given text in PyTorch: ```python >>> from transformers import RobertaTokenizer, RobertaModel >>> tokenizer = RobertaTokenizer.from_pretrained('PlanTL-GOB-ES/RoBERTalex') >>> model = RobertaModel.from_pretrained('PlanTL-GOB-ES/RoBERTalex') >>> text = "Gracias a los datos legales se ha podido desarrollar este modelo del lenguaje." >>> encoded_input = tokenizer(text, return_tensors='pt') >>> output = model(**encoded_input) >>> print(output.last_hidden_state.shape) torch.Size([1, 16, 768]) ``` ## Limitations and bias At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated. 
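As a complement to the fill-mask examples above, the sketch below illustrates the downstream fine-tuning use case described under *Intended uses and limitations*. It is a minimal, illustrative sketch only: it assumes a generic binary text-classification setup with the Hugging Face `Trainer` API, and the dataset name (`my_legal_dataset`), label count, and hyperparameters are placeholders rather than part of the official RoBERTalex release.

```python
# Illustrative fine-tuning sketch: adapt RoBERTalex to a hypothetical
# binary text-classification task using the Hugging Face Trainer API.
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/RoBERTalex")
model = AutoModelForSequenceClassification.from_pretrained(
    "PlanTL-GOB-ES/RoBERTalex", num_labels=2)  # classification head is newly initialized

# "my_legal_dataset" is a placeholder; any dataset with "text" and "label" columns works.
dataset = load_dataset("my_legal_dataset")

def tokenize(batch):
    # Truncate/pad to the 512-token limit of the RoBERTa-base architecture.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=512)

tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="robertalex-finetuned",
    per_device_train_batch_size=16,
    num_train_epochs=3,
    learning_rate=2e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
)
trainer.train()
```

The same pattern applies to Named Entity Recognition or Question Answering by swapping in `AutoModelForTokenClassification` or `AutoModelForQuestionAnswering` together with the corresponding preprocessing.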
## Training data The [Spanish Legal Domain Corpora](https://zenodo.org/record/5495529) comprise multiple digital resources and amount to a total of 8.9GB of textual data. Part of the data has been obtained from [previous work](https://aclanthology.org/2020.lt4gov-1.6/). To obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including, among others, sentence splitting, language detection, filtering of badly formed sentences, and deduplication of repetitive content. During the process, document boundaries are kept. ### Training procedure The training corpus has been tokenized using the byte-level version of Byte-Pair Encoding (BPE) used in the original [RoBERTa](https://arxiv.org/abs/1907.11692) model, with a vocabulary size of 50,262 tokens. The **RoBERTalex** pre-training consists of masked language model training that follows the approach employed for the RoBERTa base model. The model was trained until convergence on 2 computing nodes, each one with 4 NVIDIA V100 GPUs with 16GB of VRAM. ## Evaluation Due to the lack of domain-specific evaluation data, the model was evaluated on general domain tasks, where it obtains reasonable performance. We fine-tuned the model on the following tasks: | Dataset | Metric | **RoBERTalex** | |--------------|----------|------------| | UD-POS | F1 | 0.9871 | | CoNLL-NERC | F1 | 0.8323 | | CAPITEL-POS | F1 | 0.9788 | | CAPITEL-NERC | F1 | 0.8394 | | STS | Combined | 0.7374 | | MLDoc | Accuracy | 0.9417 | | PAWS-X | F1 | 0.7304 | | XNLI | Accuracy | 0.7337 | ## Additional information ### Author Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected]) ### Contact information For further information, send an email to <[email protected]> ### Copyright Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022) ### Licensing information [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0) ### Funding This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL. ## Citation Information ``` @misc{gutierrezfandino2021legal, title={Spanish Legalese Language Model and Corpora}, author={Asier Gutiérrez-Fandiño and Jordi Armengol-Estapé and Aitor Gonzalez-Agirre and Marta Villegas}, year={2021}, eprint={2110.12201}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` ## Disclaimer The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models), or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models. Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables. 
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial. En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
{"language": ["es"], "license": "apache-2.0", "tags": ["legal", "spanish"], "datasets": ["legal_ES", "temu_legal"], "metrics": ["ppl"], "widget": [{"text": "La ley fue <mask> finalmente."}, {"text": "El Tribunal <mask> desestim\u00f3 el recurso de amparo."}, {"text": "Hay base legal dentro del marco <mask> actual."}]}
fill-mask
PlanTL-GOB-ES/RoBERTalex
[ "transformers", "pytorch", "roberta", "fill-mask", "legal", "spanish", "es", "dataset:legal_ES", "dataset:temu_legal", "arxiv:1907.11692", "arxiv:2110.12201", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[ "1907.11692", "2110.12201" ]
[ "es" ]
TAGS #transformers #pytorch #roberta #fill-mask #legal #spanish #es #dataset-legal_ES #dataset-temu_legal #arxiv-1907.11692 #arxiv-2110.12201 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
RoBERTa base trained with Spanish Legal Domain Corpora ====================================================== Table of contents ----------------- Click to expand * Overview * Model description * Intended uses and limitations * How to use * Limitations and bias * Training + Training data + Training procedure * Evaluation * Additional information + Author + Contact information + Copyright + Licensing information + Funding + Citation Information + Disclaimer Overview -------- * Architecture: roberta-base * Language: Spanish * Task: fill-mask * Data: Legal Model description ----------------- The RoBERTalex is a transformer-based masked language model for the Spanish language. It is based on the RoBERTa base model and has been pre-trained using a large Spanish Legal Domain Corpora, with a total of 8.9GB of text. Intended uses and limitations ----------------------------- The RoBERTalex model is ready-to-use only for masked language modeling to perform the Fill Mask task (try the inference API or read the next section). However, it is intended to be fine-tuned on non-generative downstream tasks such as Question Answering, Text Classification, or Named Entity Recognition. You can use the raw model for fill mask or fine-tune it to a downstream task. How to use ---------- Here is how to use this model: Here is how to use this model to get the features of a given text in PyTorch: Limitations and bias -------------------- At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated. Training data ------------- The Spanish Legal Domain Corpora corpora comprise multiple digital resources and it has a total of 8.9GB of textual data. Part of it has been obtained from previous work. To obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. ### Training procedure The training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original RoBERTA model with a vocabulary size of 50,262 tokens. The RoBERTalex pre-training consists of a masked language model training, that follows the approach employed for the RoBERTa base. The model was trained until convergence with 2 computing nodes, each one with 4 NVIDIA V100 GPUs of 16GB VRAM. Evaluation ---------- Due to the lack of domain-specific evaluation data, the model was evaluated on general domain tasks, where it obtains reasonable performance. 
We fine-tuned the model in the following task: Dataset: UD-POS, Metric: F1, RoBERtalex: 0.9871 Dataset: CoNLL-NERC, Metric: F1, RoBERtalex: 0.8323 Dataset: CAPITEL-POS, Metric: F1, RoBERtalex: 0.9788 Dataset: CAPITEL-NERC, Metric: F1, RoBERtalex: 0.8394 Dataset: STS, Metric: Combined, RoBERtalex: 0.7374 Dataset: MLDoc, Metric: Accuracy, RoBERtalex: 0.9417 Dataset: PAWS-X, Metric: F1, RoBERtalex: 0.7304 Dataset: XNLI, Metric: Accuracy, RoBERtalex: 0.7337 Additional information ---------------------- ### Author Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL) ### Contact information For further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL) ### Copyright Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022) ### Licensing information Apache License, Version 2.0 ### Funding This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL. Citing information ------------------ Disclaimer ---------- The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models. Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables. Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial. En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
[ "### Training procedure\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original RoBERTA model with a vocabulary size of 50,262 tokens.\n\n\nThe RoBERTalex pre-training consists of a masked language model training, that follows the approach employed for the RoBERTa base. The model was trained until convergence with 2 computing nodes, each one with 4 NVIDIA V100 GPUs of 16GB VRAM.\n\n\nEvaluation\n----------\n\n\nDue to the lack of domain-specific evaluation data, the model was evaluated on general domain tasks, where it obtains reasonable performance. We fine-tuned the model in the following task:\n\n\nDataset: UD-POS, Metric: F1, RoBERtalex: 0.9871\nDataset: CoNLL-NERC, Metric: F1, RoBERtalex: 0.8323\nDataset: CAPITEL-POS, Metric: F1, RoBERtalex: 0.9788\nDataset: CAPITEL-NERC, Metric: F1, RoBERtalex: 0.8394\nDataset: STS, Metric: Combined, RoBERtalex: 0.7374\nDataset: MLDoc, Metric: Accuracy, RoBERtalex: 0.9417\nDataset: PAWS-X, Metric: F1, RoBERtalex: 0.7304\nDataset: XNLI, Metric: Accuracy, RoBERtalex: 0.7337\n\n\nAdditional information\n----------------------", "### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)", "### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)", "### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)", "### Licensing information\n\n\nApache License, Version 2.0", "### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\nCiting information\n------------------\n\n\nDisclaimer\n----------\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos." ]
[ "TAGS\n#transformers #pytorch #roberta #fill-mask #legal #spanish #es #dataset-legal_ES #dataset-temu_legal #arxiv-1907.11692 #arxiv-2110.12201 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "### Training procedure\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original RoBERTA model with a vocabulary size of 50,262 tokens.\n\n\nThe RoBERTalex pre-training consists of a masked language model training, that follows the approach employed for the RoBERTa base. The model was trained until convergence with 2 computing nodes, each one with 4 NVIDIA V100 GPUs of 16GB VRAM.\n\n\nEvaluation\n----------\n\n\nDue to the lack of domain-specific evaluation data, the model was evaluated on general domain tasks, where it obtains reasonable performance. We fine-tuned the model in the following task:\n\n\nDataset: UD-POS, Metric: F1, RoBERtalex: 0.9871\nDataset: CoNLL-NERC, Metric: F1, RoBERtalex: 0.8323\nDataset: CAPITEL-POS, Metric: F1, RoBERtalex: 0.9788\nDataset: CAPITEL-NERC, Metric: F1, RoBERtalex: 0.8394\nDataset: STS, Metric: Combined, RoBERtalex: 0.7374\nDataset: MLDoc, Metric: Accuracy, RoBERtalex: 0.9417\nDataset: PAWS-X, Metric: F1, RoBERtalex: 0.7304\nDataset: XNLI, Metric: Accuracy, RoBERtalex: 0.7337\n\n\nAdditional information\n----------------------", "### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)", "### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)", "### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)", "### Licensing information\n\n\nApache License, Version 2.0", "### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\nCiting information\n------------------\n\n\nDisclaimer\n----------\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. 
Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos." ]
[ 88, 347, 28, 37, 22, 12, 402 ]
[ "passage: TAGS\n#transformers #pytorch #roberta #fill-mask #legal #spanish #es #dataset-legal_ES #dataset-temu_legal #arxiv-1907.11692 #arxiv-2110.12201 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n### Training procedure\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original RoBERTA model with a vocabulary size of 50,262 tokens.\n\n\nThe RoBERTalex pre-training consists of a masked language model training, that follows the approach employed for the RoBERTa base. The model was trained until convergence with 2 computing nodes, each one with 4 NVIDIA V100 GPUs of 16GB VRAM.\n\n\nEvaluation\n----------\n\n\nDue to the lack of domain-specific evaluation data, the model was evaluated on general domain tasks, where it obtains reasonable performance. We fine-tuned the model in the following task:\n\n\nDataset: UD-POS, Metric: F1, RoBERtalex: 0.9871\nDataset: CoNLL-NERC, Metric: F1, RoBERtalex: 0.8323\nDataset: CAPITEL-POS, Metric: F1, RoBERtalex: 0.9788\nDataset: CAPITEL-NERC, Metric: F1, RoBERtalex: 0.8394\nDataset: STS, Metric: Combined, RoBERtalex: 0.7374\nDataset: MLDoc, Metric: Accuracy, RoBERtalex: 0.9417\nDataset: PAWS-X, Metric: F1, RoBERtalex: 0.7304\nDataset: XNLI, Metric: Accuracy, RoBERtalex: 0.7337\n\n\nAdditional information\n----------------------### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)" ]
[ -0.0817701667547226, 0.15581193566322327, -0.004946072120219469, 0.03541235253214836, 0.10528713464736938, 0.02680639550089836, 0.053509291261434555, 0.1348845511674881, 0.003224223153665662, 0.1290067732334137, 0.004468121565878391, 0.04069529101252556, 0.0639968067407608, 0.10029981285333633, 0.04171249270439148, -0.1597130000591278, 0.01708150841295719, -0.08976732194423676, 0.008675380609929562, 0.07797674089670181, 0.11978453397750854, -0.07846008241176605, 0.05409454181790352, -0.03676457703113556, -0.005038455594331026, -0.020622611045837402, -0.016521526500582695, -0.060570377856492996, 0.04739748314023018, 0.07336222380399704, 0.05575926974415779, 0.006742686498910189, 0.06267336010932922, -0.2612248957157135, 0.006811437662690878, 0.0819866731762886, -0.007429656572639942, 0.04233287274837494, 0.06355641782283783, -0.06539514660835266, 0.1305464804172516, -0.13590580224990845, 0.06597986817359924, 0.04935825988650322, -0.12452720105648041, -0.22342374920845032, -0.1240280270576477, 0.0963626280426979, 0.08879595249891281, 0.06994682550430298, -0.03080791048705578, 0.06470908969640732, -0.044672880321741104, 0.07534168660640717, 0.08624476939439774, -0.20906677842140198, -0.04132590815424919, 0.10170675814151764, 0.036916010081768036, 0.09107360243797302, -0.12481798231601715, -0.00014089190517552197, 0.00634854007512331, 0.031314197927713394, 0.005219735205173492, -0.010812840424478054, 0.04961812496185303, 0.03364358842372894, -0.10886546969413757, -0.09885526448488235, 0.146596297621727, 0.016346722841262817, -0.07354997098445892, -0.1479564905166626, -0.0007783891633152962, -0.058241210877895355, -0.029901182278990746, -0.024209056049585342, 0.00879804790019989, -0.0066028013825416565, 0.00011350986460456625, -0.04599073529243469, -0.09534647315740585, -0.012433221563696861, -0.04877665638923645, 0.0696054995059967, 0.01241036131978035, 0.025455370545387268, -0.016200723126530647, 0.08396084606647491, 0.012137452140450478, -0.09095510840415955, -0.05039895325899124, -0.04691195487976074, -0.12232597172260284, -0.033404640853405, 0.012061579152941704, -0.0643809586763382, 0.05123301222920418, 0.1999155431985855, -0.018207378685474396, 0.052553173154592514, 0.008917227387428284, -0.014912037178874016, 0.09413322061300278, 0.15486600995063782, -0.022897599264979362, -0.14163969457149506, -0.0501396507024765, 0.020670969039201736, 0.007387232501059771, -0.012671710923314095, -0.00012705169501714408, 0.08083004504442215, 0.07606413960456848, 0.08399118483066559, 0.10114578902721405, 0.022029565647244453, -0.04500528424978256, -0.030605530366301537, 0.13489992916584015, -0.12308897078037262, 0.035987868905067444, 0.04096538946032524, -0.04296564310789108, -0.04817920923233032, 0.028057266026735306, 0.0008225539932027459, -0.03737051412463188, 0.05837105214595795, -0.0658019557595253, -0.04662974178791046, -0.05147995799779892, -0.10023659467697144, 0.03509419411420822, -0.030836090445518494, -0.06897735595703125, -0.09093759208917618, -0.12490948289632797, -0.09059906005859375, 0.049573346972465515, -0.06412065774202347, 0.002917703241109848, -0.01338744256645441, -0.02766522951424122, 0.060729146003723145, -0.007196941412985325, 0.04287610948085785, -0.05161145702004433, 0.051906973123550415, -0.0472879558801651, 0.015884699299931526, 0.051439523696899414, 0.02782200090587139, -0.06621649116277695, -0.008732927963137627, -0.13351914286613464, 0.09354177117347717, -0.17259982228279114, 0.02043512836098671, -0.16415487229824066, -0.03628138080239296, -0.00022028452076483518, 
-0.000686723506078124, 0.01073531061410904, 0.09163782000541687, -0.1312265247106552, -0.0282793827354908, 0.1179371327161789, -0.07000402361154556, -0.03074878640472889, 0.08537974953651428, -0.031178899109363556, 0.02035404182970524, 0.0638628751039505, 0.11172336339950562, 0.09977972507476807, -0.12415342777967453, -0.06636038422584534, -0.018724270164966583, -0.00521484250202775, 0.12208818644285202, 0.06962577253580093, -0.10834746062755585, 0.06354540586471558, 0.03480720520019531, -0.05759017914533615, -0.051567304879426956, -0.042996685951948166, -0.06570563465356827, 0.007005762774497271, -0.005711683537811041, 0.009556966833770275, 0.022657418623566628, -0.030157102271914482, -0.022715089842677116, -0.11547122895717621, 0.009655349887907505, 0.08656371384859085, -0.01272508967667818, 0.006235715467482805, -0.0667392909526825, 0.025905875489115715, -0.05509810894727707, -0.00887862779200077, -0.17693845927715302, -0.1734054535627365, 0.04684864729642868, -0.06414798647165298, 0.027264054864645004, 0.024247758090496063, 0.05628148838877678, 0.0019393081311136484, -0.039096344262361526, -0.009045890532433987, -0.034926898777484894, -0.0014343520160764456, -0.055264733731746674, -0.16132625937461853, -0.047448981553316116, -0.0591648668050766, 0.18191513419151306, -0.13148407638072968, -0.009155898354947567, 0.0761411041021347, 0.09118495136499405, 0.05943616107106209, -0.08245599269866943, 0.019526079297065735, 0.0005879568634554744, 0.0027019481640309095, -0.045255206525325775, 0.04013568535447121, 0.023288795724511147, -0.008568352088332176, 0.036976203322410583, -0.11157609522342682, -0.07194647192955017, 0.04733479395508766, 0.15944428741931915, -0.07906820625066757, -0.02939707599580288, -0.07570541650056839, 0.0034119209740310907, -0.07361096888780594, -0.04320310801267624, 0.16316668689250946, 0.038096047937870026, 0.05205429345369339, -0.04488575458526611, -0.06696400791406631, 0.021737899631261826, -0.012322385795414448, -0.053892385214567184, 0.1308804154396057, 0.012865051627159119, -0.08417905122041702, 0.06935620307922363, -0.025559622794389725, 0.08777495473623276, 0.16715088486671448, -0.0021101473830640316, -0.06744442880153656, -0.05904080346226692, 0.014802630059421062, 0.02405693382024765, 0.12406277656555176, -0.022107871249318123, 0.003411129117012024, 0.05971858277916908, 0.03556618466973305, 0.036797747015953064, -0.08224072307348251, 0.05614051967859268, 0.01738808862864971, -0.03459896892309189, 0.0014179452555254102, 0.008784309960901737, -0.004590901546180248, 0.08327218145132065, 0.02685258537530899, 0.04965359345078468, -0.04006841778755188, -0.03007582202553749, -0.10696950554847717, 0.15553605556488037, -0.1013946458697319, -0.17100653052330017, -0.15083810687065125, 0.034275904297828674, -0.09166066348552704, -0.026255199685692787, 0.024888668209314346, -0.058812569826841354, -0.08591607213020325, -0.09665320068597794, -0.011784246191382408, -0.0005226634093560278, -0.043911412358284, -0.008381561376154423, 0.0034322396386414766, 0.006708734203130007, -0.1506052166223526, -0.018145572394132614, -0.031024636700749397, -0.09282499551773071, 0.0001319727598456666, 0.04020131006836891, 0.1117272675037384, 0.10864455997943878, 0.08153638243675232, 0.008156313560903072, -0.03324640542268753, 0.22791150212287903, -0.11711633205413818, 0.08482086658477783, 0.07043972611427307, 0.07435058057308197, 0.036773938685655594, 0.14704617857933044, 0.011428164318203926, -0.05942932888865471, 0.02537146396934986, 0.08813458681106567, -0.01995520479977131, 
-0.2645518481731415, -0.10433231294155121, -0.03626282513141632, -0.023395473137497902, 0.04647638648748398, 0.07700203359127045, 0.06123203784227371, -0.03765193745493889, -0.036180611699819565, -0.016134344041347504, 0.041483260691165924, 0.08625826239585876, 0.024840399622917175, 0.026687633246183395, 0.08220356702804565, -0.05501200258731842, 0.026437310501933098, 0.11525829136371613, 0.022272272035479546, 0.26104915142059326, -0.01699727401137352, 0.11802873760461807, 0.10387931019067764, 0.043791308999061584, -0.052249081432819366, -0.035311970859766006, -0.002602178603410721, 0.033324725925922394, -0.01465697307139635, -0.07600440829992294, -0.03521599620580673, 0.03914541378617287, 0.09860725700855255, -0.058714162558317184, -0.03109261766076088, -0.07380503416061401, 0.1002163216471672, 0.18135526776313782, 0.026882026344537735, -0.1391616016626358, -0.07609201222658157, 0.0413794182240963, -0.044020045548677444, -0.045245297253131866, -0.05133547633886337, 0.02702781744301319, -0.16207480430603027, 0.052521366626024246, -0.024940812960267067, 0.06517502665519714, -0.06566189974546432, -0.04050599783658981, 0.031937964260578156, 0.012621340341866016, -0.011945975013077259, 0.05933786928653717, -0.1416989266872406, 0.1903551071882248, 0.025587452575564384, 0.09343381971120834, -0.04990166425704956, 0.0379723384976387, 0.002996944123879075, -0.0326760970056057, 0.15998883545398712, 0.030307017266750336, -0.08536826074123383, -0.06171409413218498, -0.10475868731737137, 0.013849896378815174, 0.10308320820331573, -0.09188967198133469, 0.09476592391729355, -0.0008755472954362631, -0.0400826632976532, -0.046218425035476685, -0.013267561793327332, -0.15266788005828857, -0.139113187789917, 0.07745037972927094, -0.04848460480570793, -0.0555267333984375, -0.06059321388602257, -0.047522637993097305, -0.02394464984536171, 0.1496192067861557, -0.1256571114063263, -0.046005912125110626, -0.16345427930355072, 0.009420649148523808, 0.11647612601518631, -0.08252432942390442, 0.03961288556456566, -0.03157676383852959, 0.10646312683820724, 0.0060875304043293, -0.04598482325673103, 0.0377059206366539, -0.07677783071994781, -0.1497088372707367, -0.06109343096613884, 0.13797445595264435, 0.08971933275461197, 0.05109450966119766, 0.0023665400221943855, 0.048477232456207275, 0.014383767731487751, -0.10675659030675888, 0.037053681910037994, 0.16195939481258392, 0.05708819255232811, 0.06174081191420555, -0.03698933124542236, -0.17958591878414154, -0.07020839303731918, -0.0482160821557045, 0.07436919957399368, 0.16183720529079437, -0.07677608728408813, 0.1512119621038437, 0.11431955546140671, -0.13742974400520325, -0.1959005892276764, -0.001651137019507587, 0.09430983662605286, -0.004992700647562742, 0.009282096289098263, -0.13679102063179016, 0.01509733684360981, 0.06031365320086479, -0.03342512622475624, 0.027397215366363525, -0.2713141441345215, -0.10775186121463776, 0.017872599884867668, 0.03130767494440079, -0.05684817582368851, -0.17078500986099243, -0.08745650202035904, -0.007989495992660522, -0.2118600457906723, 0.10475175827741623, -0.032590411603450775, 0.0637902319431305, 0.014698809012770653, -0.02111097238957882, 0.034562867134809494, -0.06289738416671753, 0.1457868069410324, 0.07015817612409592, 0.02733604423701763, -0.04090062156319618, -0.017797669395804405, 0.09338181465864182, -0.046666014939546585, 0.0900009274482727, 0.06539544463157654, 0.02609720453619957, -0.17937107384204865, -0.023825524374842644, -0.11715415865182877, 0.013711675070226192, -0.06798829883337021, 
-0.047261811792850494, -0.05869359150528908, 0.10720104724168777, 0.10475825518369675, -0.03893456235527992, 0.038438525050878525, -0.06899545341730118, 0.07888206839561462, 0.10667431354522705, 0.05386337265372276, 0.03398750722408295, -0.14301101863384247, -0.02084256522357464, -0.052876025438308716, -0.005508586764335632, -0.15900655090808868, 0.0464850515127182, 0.09476473927497864, 0.023180855438113213, 0.16222001612186432, -0.015583032742142677, -0.12289425730705261, -0.011632504872977734, 0.10115020722150803, -0.06897548586130142, -0.11930947005748749, 0.004316335543990135, -0.006915505975484848, -0.11707377433776855, -0.009400737471878529, 0.1460016667842865, 0.021888451650738716, -0.03016410954296589, -0.02199132926762104, 0.09571944922208786, 0.038436368107795715, 0.1048157662153244, 0.036594290286302567, 0.05092648044228554, -0.05861024558544159, 0.12594081461429596, 0.10593640059232712, -0.11329906433820724, 0.007346665486693382, 0.06369749456644058, -0.08225317299365997, -0.05505312979221344, 0.00370913278311491, 0.06947223097085953, -0.04935641959309578, -0.023774374276399612, -0.04445130005478859, -0.03937894478440285, 0.029355473816394806, 0.059881482273340225, 0.0006622897926717997, 0.033446166664361954, 0.003102442715317011, 0.021643320098519325, -0.040195267647504807, 0.08358433842658997, 0.046937860548496246, -0.0015953948022797704, -0.040317602455616, 0.09735589474439621, 0.006013783626258373, -0.013904813677072525, 0.00927289854735136, -0.029385201632976532, -0.09746915847063065, -0.023506972938776016, -0.027013881132006645, -0.007156051695346832, -0.02762863226234913, -0.012523145414888859, -0.024166179820895195, -0.02442130818963051, -0.03452618792653084, 0.0344957672059536, -0.0711999237537384, -0.0679459497332573, -0.02640436589717865, 0.09410129487514496, -0.1579333245754242, -0.0218590646982193, 0.033802714198827744, -0.06435689330101013, 0.09258290380239487, 0.027814792469143867, 0.034729890525341034, 0.032442983239889145, -0.10668647289276123, 0.026121536269783974, -0.03906816989183426, 0.005978951696306467, 0.03927826136350632, -0.15460750460624695, 0.019761396571993828, -0.0017028892179951072, -0.019612757489085197, 0.04721997678279877, 0.0035580135881900787, -0.10021815448999405, 0.0481911227107048, -0.013874362222850323, -0.03595929965376854, -0.04516350105404854, 0.041723333299160004, 0.06777457147836685, 0.054284341633319855, 0.10825379937887192, -0.07717656344175339, 0.01692383550107479, -0.16584418714046478, -0.025319494307041168, -0.019861219450831413, 0.003119016531854868, -0.02200651727616787, -0.0004794250999111682, 0.0593268983066082, 0.0058288974687457085, 0.11128599941730499, 0.020638853311538696, 0.06024763360619545, 0.04059991240501404, 0.014590378850698471, -0.05255218595266342, 0.010094823315739632, 0.018434839323163033, 0.01661701127886772, 0.00864590797573328, 0.055960651487112045, -0.00462383683770895, -0.033764731138944626, 0.025028934702277184, 0.11600765585899353, 0.09969226270914078, 0.2134595811367035, 0.026788635179400444, 0.04401158541440964, -0.13344082236289978, -0.060320932418107986, 0.056123919785022736, -0.07644347101449966, 0.09561294317245483, -0.07692761719226837, 0.04011430963873863, 0.1016453206539154, -0.1925770789384842, 0.10614512115716934, -0.0612974688410759, -0.08181639760732651, -0.05241255834698677, -0.12454117834568024, -0.04302782937884331, -0.038576241582632065, 0.02031397446990013, -0.09870226681232452, 0.06844841688871384, 0.03169897198677063, 0.007716941647231579, 0.015306542627513409, 0.03800341486930847, 
-0.13073088228702545, -0.05438397452235222, 0.06218688189983368, 0.010688167065382004, 0.026637377217411995, 0.029253752902150154, 0.05517525598406792, -0.0020201837178319693, 0.04987719655036926, 0.08897405862808228, 0.06467790901660919, 0.017314745113253593, 0.04635869711637497, 0.0003128830576315522, -0.06329984217882156, 0.0008868470322340727, -0.010224675759673119, -0.0015640039928257465, 0.19066545367240906, 0.08953255414962769, 0.006002672947943211, 0.017427200451493263, 0.2837325632572174, -0.02754145860671997, -0.0744563564658165, -0.174173504114151, 0.09148993343114853, 0.06635144352912903, 0.04020453244447708, 0.011741846799850464, -0.14182506501674652, 0.005781699437648058, 0.0762937143445015, 0.15885840356349945, 0.0011723877396434546, -0.0021711639128625393, 0.031992241740226746, -0.0024482551962137222, 0.02752595953643322, 0.06209433823823929, 0.005728105083107948, 0.2564447522163391, -0.06117694824934006, 0.05453784763813019, -0.012777749449014664, -0.01586451195180416, -0.04168548434972763, 0.13695204257965088, -0.021045278757810593, -0.0023797741159796715, -0.04225769266486168, 0.10364289581775665, -0.03546614199876785, -0.25035834312438965, 0.04091886430978775, -0.11907441169023514, -0.1364067792892456, 0.010388543829321861, 0.07573523372411728, 0.034128401428461075, 0.07383744418621063, 0.04961858317255974, -0.014777081087231636, 0.2009187787771225, 0.003284571459516883, -0.047170303761959076, -0.11221493780612946, 0.08535216748714447, 0.018508249893784523, 0.24089853465557098, -0.008558070287108421, 0.05213554948568344, 0.11628789454698563, -0.02898155152797699, -0.13140727579593658, 0.05399345979094505, 0.04891705885529518, -0.0716615840792656, 0.06374751776456833, 0.11246487498283386, -0.011415972374379635, 0.07861998677253723, 0.07286915183067322, -0.025785431265830994, 0.07043259590864182, 0.06923253834247589, -0.02541421540081501, -0.08623749017715454, 0.07975955307483673, -0.0972580537199974, 0.11152659356594086, 0.13320539891719818, -0.027434708550572395, 0.018893323838710785, -0.055328916758298874, 0.027881508693099022, -0.00780705688521266, 0.1551288366317749, -0.026404166594147682, -0.14385415613651276, 0.06624584645032883, -0.06382916867733002, 0.0781833603978157, -0.2136792540550232, -0.09569697082042694, 0.07389989495277405, -0.02689097635447979, -0.02620307169854641, 0.1253930628299713, 0.019807608798146248, 0.0004900608328171074, -0.04250858724117279, -0.06744953989982605, 0.019194887951016426, 0.09996761381626129, -0.08858244121074677, -0.030508462339639664 ]
null
null
transformers
# GPT2-base (gpt2-base-bne) trained with data from the National Library of Spain (BNE) ## Table of Contents <details> <summary>Click to expand</summary> - [Overview](#overview) - [Model description](#model-description) - [Intended uses and limitations](#intended-uses-and-limitations) - [How to Use](#how-to-use) - [Limitations and bias](#limitations-and-bias) - [Training](#training) - [Training data](#training-data) - [Training procedure](#training-procedure) - [Additional information](#additional-information) - [Author](#author) - [Contact information](#contact-information) - [Copyright](#copyright) - [Licensing information](#licensing-information) - [Funding](#funding) - [Citation Information](#citation-information) - [Disclaimer](#disclaimer) </details> ## Overview - **Architecture:** gpt2-base - **Language:** Spanish - **Task:** text-generation - **Data:** BNE ## Model description **GPT2-base-bne** is a transformer-based model for the Spanish language. It is based on the [GPT-2](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf) model and has been pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text processed for this work, compiled from the web crawlings performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019. ## Intended uses and limitations You can use the raw model for text generation or fine-tune it to a downstream task. ## How to Use Here is how to use this model: You can use this model directly with a pipeline for text generation. Since the generation relies on some randomness, we set a seed for reproducibility: ```python >>> from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline, set_seed >>> tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/gpt2-base-bne") >>> model = AutoModelForCausalLM.from_pretrained("PlanTL-GOB-ES/gpt2-base-bne") >>> generator = pipeline('text-generation', tokenizer=tokenizer, model=model) >>> set_seed(42) >>> generator("La Biblioteca Nacional de España es una entidad pública y sus fines son", num_return_sequences=5) [{'generated_text': 'La Biblioteca Nacional de España es una entidad pública y sus fines son difundir la cultura y el arte hispánico, así como potenciar las publicaciones de la Biblioteca y colecciones de la Biblioteca Nacional de España para su difusión e inquisición. '}, {'generated_text': 'La Biblioteca Nacional de España es una entidad pública y sus fines son diversos. '}, {'generated_text': 'La Biblioteca Nacional de España es una entidad pública y sus fines son la publicación, difusión y producción de obras de arte español, y su patrimonio intelectual es el que tiene la distinción de Patrimonio de la Humanidad. '}, {'generated_text': 'La Biblioteca Nacional de España es una entidad pública y sus fines son los de colaborar en el mantenimiento de los servicios bibliotecarios y mejorar la calidad de la información de titularidad institucional y en su difusión, acceso y salvaguarda para la sociedad. '}, {'generated_text': 'La Biblioteca Nacional de España es una entidad pública y sus fines son la conservación, enseñanza y difusión del patrimonio bibliográfico en su lengua específica y/o escrita. 
'}] ``` Here is how to use this model to get the features of a given text in PyTorch: ```python >>> from transformers import AutoTokenizer, GPT2Model >>> tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/gpt2-base-bne") >>> model = GPT2Model.from_pretrained("PlanTL-GOB-ES/gpt2-base-bne") >>> text = "La Biblioteca Nacional de España es una entidad pública y sus fines son" >>> encoded_input = tokenizer(text, return_tensors='pt') >>> output = model(**encoded_input) >>> print(output.last_hidden_state.shape) torch.Size([1, 14, 768]) ``` ## Limitations and bias At the time of submission, no measures have been taken to estimate the bias and toxicity embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated. Nevertheless, here's an example of how the model can have biased predictions: ```python >>> from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline, set_seed >>> tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/gpt2-base-bne") >>> model = AutoModelForCausalLM.from_pretrained("PlanTL-GOB-ES/gpt2-base-bne") >>> generator = pipeline('text-generation', tokenizer=tokenizer, model=model) >>> set_seed(42) >>> generator("El hombre se dedica a", num_return_sequences=5) [{'generated_text': 'El hombre se dedica a comprar armas a sus amigos, pero les cuenta la historia de las ventajas de ser "buenos y regulares en la vida" e ir "bien" por los pueblos. '}, {'generated_text': 'El hombre se dedica a la venta de todo tipo de juguetes durante todo el año y los vende a través de Internet con la intención de alcanzar una mayor rentabilidad. '}, {'generated_text': 'El hombre se dedica a la venta ambulante en plena Plaza Mayor. '}, {'generated_text': 'El hombre se dedica a los toros y él se dedica a los servicios religiosos. '}, {'generated_text': 'El hombre se dedica a la caza y a la tala de pinos. '}] >>> set_seed(42) >>> generator("La mujer se dedica a", num_return_sequences=5) [{'generated_text': 'La mujer se dedica a comprar vestidos de sus padres, como su madre, y siempre le enseña el último que ha hecho en poco menos de un año para ver si le da tiempo. '}, {'generated_text': 'La mujer se dedica a la venta ambulante y su pareja vende su cuerpo desde que tenía uso del automóvil. '}, {'generated_text': 'La mujer se dedica a la venta ambulante en plena ola de frío. '}, {'generated_text': 'La mujer se dedica a limpiar los suelos y paredes en pueblos con mucha humedad. '}, {'generated_text': 'La mujer se dedica a la prostitución en varios locales de alterne clandestinos en Barcelona. '}] ``` ## Training ### Training Data The [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019. To obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text. 
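The preprocessing pipeline itself is not distributed with this card. Purely as an illustrative sketch of the global deduplication step just described (the normalization rule, hashing choice, and document granularity below are assumptions, not the actual BNE pipeline):

```python
import hashlib

def normalize(text: str) -> str:
    # Assumed light normalization before hashing; the real pipeline's rules are not published here.
    return " ".join(text.lower().split())

def global_deduplicate(documents):
    """Keep the first occurrence of each normalized document and drop exact repeats."""
    seen = set()
    unique_docs = []
    for doc in documents:
        digest = hashlib.sha1(normalize(doc).encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique_docs.append(doc)
    return unique_docs

corpus = [
    "La Biblioteca Nacional de España es una entidad pública.",
    "La Biblioteca Nacional de España es una entidad pública.",  # exact repeat, dropped
    "Sus fines son la conservación y difusión del patrimonio bibliográfico.",
]
print(len(global_deduplicate(corpus)))  # prints 2
```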
Some of the statistics of the corpus:

| Corpora | Number of documents | Number of tokens | Size (GB) |
|---------|---------------------|------------------|-----------|
| BNE     | 201,080,084         | 135,733,450,668  | 570GB     |

### Training Procedure
The pretraining objective used for this architecture is next token prediction.
The configuration of the **GPT2-base-bne** model is as follows:
- gpt2-base: 12-layer, 768-hidden, 12-heads, 117M parameters.

The training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original [GPT-2](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf) model with a vocabulary size of 50,262 tokens.

The GPT2-base-bne pre-training consists of an autoregressive language model training that follows the approach of GPT-2.

The training lasted a total of 3 days with 16 computing nodes, each with 4 NVIDIA V100 GPUs of 16GB VRAM.

## Additional information

### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected])

### Contact information
For further information, send an email to <[email protected]>

### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)

### Licensing information
This work is licensed under an [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)

### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.

### Citation information
If you use this model, please cite our [paper](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6405):
```
@article{,
  abstract = {We want to thank the National Library of Spain for such a large effort on the data gathering and the Future of Computing Center, a Barcelona Supercomputing Center and IBM initiative (2020). This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.},
  author = {Asier Gutiérrez Fandiño and Jordi Armengol Estapé and Marc Pàmies and Joan Llop Palao and Joaquin Silveira Ocampo and Casimiro Pio Carrino and Carme Armentano Oller and Carlos Rodriguez Penagos and Aitor Gonzalez Agirre and Marta Villegas},
  doi = {10.26342/2022-68-3},
  issn = {1135-5948},
  journal = {Procesamiento del Lenguaje Natural},
  keywords = {Artificial intelligence,Benchmarking,Data processing.,MarIA,Natural language processing,Spanish language modelling,Spanish language resources,Tractament del llenguatge natural (Informàtica),Àrees temàtiques de la UPC::Informàtica::Intel·ligència artificial::Llenguatge natural},
  publisher = {Sociedad Española para el Procesamiento del Lenguaje Natural},
  title = {MarIA: Spanish Language Models},
  volume = {68},
  url = {https://upcommons.upc.edu/handle/2117/367156#.YyMTB4X9A-0.mendeley},
  year = {2022},
}
```

### Disclaimer
<details>
<summary>Click to expand</summary>

The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.

When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.

In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.

Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.

Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.

En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
</details>
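As a complement to the training-procedure section above, the stated architecture and vocabulary can be sanity-checked against the published checkpoint. This is only a minimal sketch; the commented values restate the figures quoted in this card (12 layers, 768 hidden size, 12 heads, a 50,262-token BPE vocabulary), so verify them locally rather than taking them from this snippet:

```python
from transformers import AutoConfig, AutoTokenizer

config = AutoConfig.from_pretrained("PlanTL-GOB-ES/gpt2-base-bne")
# Per this card: 12 layers, 768 hidden size, 12 attention heads (~117M parameters).
print(config.n_layer, config.n_embd, config.n_head)

tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/gpt2-base-bne")
# Per this card: byte-level BPE with a vocabulary of 50,262 tokens.
print(len(tokenizer), config.vocab_size)
```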
{"language": ["es"], "license": "apache-2.0", "tags": ["national library of spain", "spanish", "bne", "gpt2-base-bne"], "datasets": ["bne"], "widget": [{"text": "El modelo del lenguaje GPT es capaz de"}, {"text": "La Biblioteca Nacional de Espa\u00f1a es una entidad p\u00fablica y sus fines son"}]}
text-generation
PlanTL-GOB-ES/gpt2-base-bne
[ "transformers", "pytorch", "gpt2", "text-generation", "national library of spain", "spanish", "bne", "gpt2-base-bne", "es", "dataset:bne", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "es" ]
TAGS #transformers #pytorch #gpt2 #text-generation #national library of spain #spanish #bne #gpt2-base-bne #es #dataset-bne #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
GPT2-base (gpt2-base-bne) trained with data from the National Library of Spain (BNE) ==================================================================================== Table of Contents ----------------- Click to expand * Overview * Model description * Intended uses and limitations * How to Use * Limitations and bias * Training + Training data + Training procedure * Additional information + Author + Contact information + Copyright + Licensing information + Funding + Citation Information + Disclaimer Overview -------- * Architecture: gpt2-base * Language: Spanish * Task: text-generation * Data: BNE Model description ----------------- GPT2-base-bne is a transformer-based model for the Spanish language. It is based on the GPT-2 model and has been pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text processed for this work, compiled from the web crawlings performed by the National Library of Spain (Biblioteca Nacional de España) from 2009 to 2019. Intended uses and limitations ----------------------------- You can use the raw model for text generation or fine-tune it to a downstream task. How to Use ---------- Here is how to use this model: You can use this model directly with a pipeline for text generation. Since the generation relies on some randomness, we set a seed for reproducibility: Here is how to use this model to get the features of a given text in PyTorch: Limitations and bias -------------------- At the time of submission, no measures have been taken to estimate the bias and toxicity embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated. Nevertheless, here's an example of how the model can have biased predictions: Training -------- ### Training Data The National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019. To obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text. Some of the statistics of the corpus: ### Training Procedure The pretraining objective used for this architecture is next token prediction. The configuration of the GPT2-base-bne model is as follows: * gpt2-base: 12-layer, 768-hidden, 12-heads, 117M parameters. The training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original GPT-2 model with a vocabulary size of 50,262 tokens. The GPT2-base-bne pre-training consists of an autoregressive language model training that follows the approach of the GPT-2. The training lasted a total of 3 days with 16 computing nodes each one with 4 NVIDIA V100 GPUs of 16GB VRAM. 
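To make the next-token-prediction objective above concrete, here is a minimal sketch of computing the causal language-modelling loss for this checkpoint with the Transformers API. It is illustrative only and is not the actual pre-training code, which is not part of this card:

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/gpt2-base-bne")
model = AutoModelForCausalLM.from_pretrained("PlanTL-GOB-ES/gpt2-base-bne")

text = "La Biblioteca Nacional de España es una entidad pública y sus fines son"
inputs = tokenizer(text, return_tensors="pt")

# For causal language modelling, the labels are the input ids themselves;
# the model shifts them internally so each position predicts the next token.
with torch.no_grad():
    outputs = model(**inputs, labels=inputs["input_ids"])

print(outputs.loss)             # average next-token cross-entropy on this sample
print(torch.exp(outputs.loss))  # the corresponding perplexity
```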
Additional information ---------------------- ### Author Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL) ### Contact information For further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL) ### Copyright Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022) ### Licensing information This work is licensed under a Apache License, Version 2.0 ### Funding This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL. information If you use this model, please cite our paper: ### Disclaimer Click to expand The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence. In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models. Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables. Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial. En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
[ "### Training Data\n\n\nThe National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.\n\n\nTo obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text.\n\n\nSome of the statistics of the corpus:", "### Training Procedure\n\n\nThe pretraining objective used for this architecture is next token prediction.\nThe configuration of the GPT2-base-bne model is as follows:\n\n\n* gpt2-base: 12-layer, 768-hidden, 12-heads, 117M parameters.\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original GPT-2 model with a vocabulary size of 50,262 tokens.\n\n\nThe GPT2-base-bne pre-training consists of an autoregressive language model training that follows the approach of the GPT-2.\n\n\nThe training lasted a total of 3 days with 16 computing nodes each one with 4 NVIDIA V100 GPUs of 16GB VRAM.\n\n\nAdditional information\n----------------------", "### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)", "### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)", "### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)", "### Licensing information\n\n\nThis work is licensed under a Apache License, Version 2.0", "### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\ninformation\nIf you use this model, please cite our paper:", "### Disclaimer\n\n\n\nClick to expand\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. 
Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos." ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #national library of spain #spanish #bne #gpt2-base-bne #es #dataset-bne #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n", "### Training Data\n\n\nThe National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.\n\n\nTo obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text.\n\n\nSome of the statistics of the corpus:", "### Training Procedure\n\n\nThe pretraining objective used for this architecture is next token prediction.\nThe configuration of the GPT2-base-bne model is as follows:\n\n\n* gpt2-base: 12-layer, 768-hidden, 12-heads, 117M parameters.\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original GPT-2 model with a vocabulary size of 50,262 tokens.\n\n\nThe GPT2-base-bne pre-training consists of an autoregressive language model training that follows the approach of the GPT-2.\n\n\nThe training lasted a total of 3 days with 16 computing nodes each one with 4 NVIDIA V100 GPUs of 16GB VRAM.\n\n\nAdditional information\n----------------------", "### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)", "### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)", "### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)", "### Licensing information\n\n\nThis work is licensed under a Apache License, Version 2.0", "### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\ninformation\nIf you use this model, please cite our paper:", "### Disclaimer\n\n\n\nClick to expand\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. 
Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos." ]
[ 88, 160, 178, 28, 37, 22, 19, 46, 364 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #national library of spain #spanish #bne #gpt2-base-bne #es #dataset-bne #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n### Training Data\n\n\nThe National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.\n\n\nTo obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text.\n\n\nSome of the statistics of the corpus:### Training Procedure\n\n\nThe pretraining objective used for this architecture is next token prediction.\nThe configuration of the GPT2-base-bne model is as follows:\n\n\n* gpt2-base: 12-layer, 768-hidden, 12-heads, 117M parameters.\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original GPT-2 model with a vocabulary size of 50,262 tokens.\n\n\nThe GPT2-base-bne pre-training consists of an autoregressive language model training that follows the approach of the GPT-2.\n\n\nThe training lasted a total of 3 days with 16 computing nodes each one with 4 NVIDIA V100 GPUs of 16GB VRAM.\n\n\nAdditional information\n----------------------### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)" ]
[ -0.06871765851974487, 0.1431153267621994, -0.0050664967857301235, 0.09142608940601349, 0.12023372203111649, 0.04339389503002167, 0.02550130896270275, 0.11181235313415527, -0.06482136994600296, 0.07737304270267487, 0.010574680753052235, 0.04214124009013176, 0.092653788626194, 0.07342973351478577, 0.15314966440200806, -0.21467016637325287, 0.027861006557941437, -0.09482840448617935, -0.08879109472036362, 0.013176003471016884, 0.0653769001364708, -0.036037687212228775, 0.08304036408662796, -0.04331923648715019, -0.061427123844623566, -0.017833659425377846, -0.03344760462641716, -0.06582251191139221, 0.11473148316144943, 0.08064000308513641, 0.0023335327859967947, -0.03954674303531647, 0.054750651121139526, -0.10248734056949615, 0.006491953507065773, 0.10038772970438004, -0.04177132621407509, 0.05850595608353615, 0.09917233139276505, -0.0068885874934494495, 0.11550062894821167, -0.0750812441110611, -0.00039093944360502064, 0.04791848361492157, -0.15789252519607544, -0.1955442726612091, -0.09609239548444748, -0.01384451799094677, -0.012757902964949608, 0.0013088216073811054, 0.0037097774911671877, -0.0050301202572882175, -0.006445324048399925, 0.00017941791156772524, 0.08296465873718262, -0.2540564239025116, -0.017770441249012947, 0.1421041190624237, 0.013251892291009426, 0.09503811597824097, -0.054047178477048874, 0.017995022237300873, 0.013568691909313202, 0.04639480635523796, 0.004396507982164621, -0.008431969210505486, -0.019773021340370178, -0.018345503136515617, -0.0586758516728878, -0.11457511782646179, 0.126384899020195, -0.04923226311802864, -0.08761687576770782, -0.05912118777632713, -0.049179453402757645, -0.030529357492923737, 0.05261300504207611, 0.0006327718729153275, 0.045767031610012054, 0.01654699817299843, 0.0639350563287735, -0.091950424015522, -0.10172683000564575, -0.0746804028749466, -0.12658850848674774, 0.10349500924348831, 0.04071768373250961, 0.04127686098217964, 0.029379302635788918, 0.17134997248649597, -0.053598079830408096, -0.04127173870801926, -0.04551316425204277, -0.0418362095952034, -0.09409154951572418, 0.04514797404408455, -0.005839236546307802, -0.0635354146361351, -0.023161713033914566, 0.10984019935131073, -0.06449692696332932, 0.00819824356585741, 0.0530274473130703, 0.010960920713841915, 0.0551602765917778, 0.06344611197710037, -0.08641083538532257, -0.06456156820058823, 0.021507395431399345, -0.07352127879858017, 0.015589525923132896, -0.012304100207984447, -0.06509912014007568, -0.05016845464706421, -0.003951650112867355, 0.045876625925302505, 0.042735423892736435, 0.034500688314437866, -0.0025773204397410154, -0.06494202464818954, 0.20423246920108795, -0.05877025052905083, -0.010608955286443233, -0.0002783258387353271, -0.08603440970182419, -0.10780201107263565, 0.04876978322863579, -0.06760425120592117, -0.07763991504907608, 0.0007453652797266841, -0.08350133150815964, -0.08707191050052643, -0.10112468153238297, -0.10190065950155258, 0.026383478194475174, -0.08781113475561142, -0.042789485305547714, -0.12325851619243622, -0.20740187168121338, -0.008896972984075546, 0.04698747768998146, -0.06954154372215271, 0.03970353305339813, 0.0023848828859627247, 0.015338977798819542, -0.017643773928284645, -0.03783353045582771, 0.07935898005962372, -0.05215425416827202, 0.05064942687749863, 0.008092022500932217, 0.05527852848172188, -0.041582588106393814, 0.004529189318418503, -0.061522457748651505, 0.01243991032242775, -0.13008931279182434, 0.09868333488702774, -0.034272536635398865, -0.0061083221808075905, -0.08476075530052185, -0.04224163293838501, 
-0.08899551630020142, 0.04225906729698181, 0.05381770431995392, 0.11351732909679413, -0.14538519084453583, -0.04246716946363449, 0.22788098454475403, -0.06792482733726501, -0.0014972236240282655, 0.08336333930492401, -0.011784919537603855, 0.06535401195287704, 0.09415820986032486, 0.09964434057474136, 0.13306118547916412, -0.053217992186546326, -0.02033320814371109, -0.01709894649684429, -0.08058365434408188, -0.016653673723340034, 0.05183485895395279, -0.07100635766983032, 0.04844564199447632, 0.08924216032028198, -0.03416403383016586, 0.05207201465964317, -0.0016441619955003262, -0.012587425298988819, 0.040361952036619186, -0.019038522616028786, -0.004933557007461786, -0.028952009975910187, 0.008622504770755768, -0.020604072138667107, -0.05919986590743065, -0.04652165248990059, 0.058845289051532745, -0.035025667399168015, 0.024527134373784065, -0.03138411045074463, 0.03432108089327812, -0.08289248496294022, 0.03360477089881897, -0.11638244986534119, -0.11595925688743591, 0.06146177276968956, -0.09958896785974503, 0.10683556646108627, 0.01946152001619339, 0.05969104915857315, 0.09851907193660736, -0.038553155958652496, -0.008167735300958157, -0.01258634403347969, -0.07982955873012543, -0.020240753889083862, -0.06967992335557938, -0.007143472321331501, -0.0401880145072937, -0.024268750101327896, -0.038069549947977066, 0.028021425008773804, 0.013706400990486145, 0.051791250705718994, -0.011376584880053997, -0.07077813148498535, 0.008624621666967869, -0.026946838945150375, -0.015387238003313541, -0.08788729459047318, 0.04447991028428078, 0.018729303032159805, 0.05282408371567726, 0.03454668074846268, -0.1428939402103424, -0.011318388395011425, 0.08663157373666763, 0.11528085917234421, -0.03194545581936836, -0.04022102430462837, -0.028934532776474953, -0.015373233705759048, -0.01009406615048647, -0.05150556564331055, 0.24648424983024597, -0.017987042665481567, 0.08717238903045654, -0.09837104380130768, -0.015073069371283054, 0.04701856151223183, 0.0007430592668242753, -0.027700120583176613, 0.02659761533141136, 0.02782638743519783, -0.04423271119594574, 0.07390505820512772, -0.08063257485628128, 0.059338875114917755, 0.19342492520809174, 0.08542267978191376, -0.054293546825647354, -0.04787307232618332, 0.03106546588242054, 0.0392296276986599, 0.1305467039346695, -0.09625063836574554, -0.0045334710739552975, 0.03256645053625107, 0.0956202819943428, 0.12222079187631607, -0.11280438303947449, 0.03775998204946518, -0.01087119895964861, -0.07189592719078064, 0.06592713296413422, 0.003245746251195669, -0.057475700974464417, 0.08708502352237701, 0.05551819130778313, 0.07255280017852783, 0.0028001407627016306, -0.026461895555257797, -0.0653291791677475, 0.1812533438205719, -0.1696130633354187, -0.2902522385120392, -0.21191251277923584, 0.004246749449521303, -0.10350948572158813, 0.09579433500766754, 0.019210796803236008, -0.09934036433696747, -0.05953214317560196, -0.032011717557907104, 0.1302861124277115, -0.0267997644841671, -0.003938681911677122, -0.12487231940031052, 0.036279212683439255, -0.02019413560628891, -0.17965078353881836, 0.017581988126039505, -0.013342377729713917, -0.15043605864048004, 0.019192714244127274, 0.0263898354023695, 0.07459541410207748, 0.05880706012248993, 0.0253120306879282, -0.039760977029800415, -0.012334809638559818, 0.13059519231319427, -0.062058985233306885, 0.04661179333925247, 0.1329835206270218, 0.09639543294906616, 0.028719190508127213, 0.05180125683546066, -0.0034461517352610826, -0.09486262500286102, 0.02833177149295807, -0.0037752690259367228, 
-0.0688442587852478, -0.23121553659439087, -0.08586375415325165, -0.05159490928053856, -0.005681742914021015, 0.06818479299545288, 0.08107024431228638, 0.045077454298734665, 0.05144108086824417, -0.038565460592508316, 0.05714262276887894, 0.029442189261317253, 0.09481032937765121, 0.035447925329208374, -0.03228512033820152, 0.028041528537869453, -0.0600583516061306, 0.009003766812384129, 0.13988158106803894, 0.17314653098583221, 0.23722122609615326, -0.09319731593132019, 0.1539449542760849, 0.03817599266767502, 0.057672400027513504, 0.014779358170926571, 0.09958896785974503, 0.0008799913339316845, 0.03660517558455467, -0.059089742600917816, -0.018470369279384613, -0.10358325392007828, 0.036903660744428635, 0.03396686166524887, -0.05124514922499657, -0.031193329021334648, -0.11708880960941315, 0.057205069810152054, 0.046803127974271774, -0.03698096051812172, -0.201080784201622, -0.10438787192106247, 0.0445900522172451, 0.011492074467241764, -0.126817524433136, -0.0064216251485049725, 0.1267738789319992, -0.11465746909379959, -0.0019074419979006052, -0.03107302449643612, 0.04294176772236824, -0.15669362246990204, -0.008769326843321323, 0.016727976500988007, 0.13270792365074158, -0.0026493824552744627, 0.0868704691529274, -0.13626088201999664, 0.09825273603200912, 0.018673885613679886, 0.05682414025068283, -0.064936563372612, 0.048103220760822296, -0.02459411509335041, -0.09904475510120392, 0.11005676537752151, 0.03619271144270897, -0.04838579148054123, -0.0217695664614439, -0.07143866270780563, 0.033094678074121475, 0.07383528351783752, -0.06324680149555206, 0.052252281457185745, 0.03926602005958557, 0.02020835131406784, -0.04678983986377716, -0.035027697682380676, -0.0581197552382946, -0.1382681429386139, 0.060660604387521744, -0.12978017330169678, 0.026968223974108696, -0.07792560756206512, -0.0013827331131324172, -0.09210895746946335, 0.2080032229423523, -0.17847084999084473, -0.04806568846106529, -0.08304712176322937, 0.022612927481532097, 0.11099541932344437, -0.059174545109272, 0.07329666614532471, -0.032941434532403946, 0.027593817561864853, -0.02659277245402336, -0.09055979549884796, 0.06772786378860474, -0.05469086393713951, -0.08556708693504333, -0.08004555851221085, 0.08403675258159637, 0.07166600972414017, -0.004234124906361103, -0.010279408656060696, -0.024431852623820305, 0.016958272084593773, -0.12042642384767532, 0.019734181463718414, 0.14587602019309998, 0.06172710657119751, 0.04503129795193672, -0.08016147464513779, -0.10291005671024323, 0.0070000807754695415, -0.02828962355852127, 0.08079510182142258, 0.20563974976539612, -0.026275161653757095, 0.14844201505184174, 0.1728287935256958, -0.12733057141304016, -0.19907115399837494, -0.003370358143001795, 0.04929988458752632, 0.04077184200286865, -0.03308824449777603, -0.24519333243370056, -0.008755810558795929, 0.11612808704376221, 0.0027388310991227627, 0.12295185029506683, -0.2417667806148529, -0.1058511957526207, 0.0026936919894069433, 0.07123852521181107, 0.1534213423728943, -0.0844496414065361, -0.04992764815688133, -0.03505539149045944, -0.09036213159561157, 0.14935240149497986, -0.009040532633662224, 0.12249685823917389, -0.02225271426141262, -0.02658686600625515, 0.022704027593135834, -0.052534203976392746, 0.17844049632549286, 0.027317749336361885, 0.06496081501245499, -0.036878906190395355, 0.09503301233053207, 0.22816994786262512, 0.00004096922930330038, 0.06937869638204575, 0.08960483968257904, 0.022605067119002342, -0.12463918328285217, -0.05908869951963425, -0.07994484901428223, 0.017098477110266685, 
-0.025148747488856316, -0.06829681247472763, -0.035828087478876114, 0.1141141802072525, 0.04972819238901138, 0.00543356453999877, -0.07390103489160538, -0.011652343906462193, -0.020556990057229996, 0.125536248087883, 0.047389186918735504, 0.03190656751394272, -0.059879545122385025, 0.005512109026312828, 0.015143663622438908, 0.04872436821460724, -0.10062902420759201, 0.0010775860864669085, 0.0960676446557045, -0.03475041687488556, 0.04044080153107643, 0.026910487562417984, -0.17369768023490906, 0.03120003454387188, 0.1379823088645935, -0.07620593905448914, -0.06830967962741852, 0.02206178568303585, -0.07796435803174973, -0.01664593070745468, 0.060838356614112854, 0.13956451416015625, 0.04115265607833862, -0.03708244487643242, -0.00660324189811945, 0.04421130195260048, -0.042439911514520645, 0.12437527626752853, 0.004989928565919399, -0.006329611875116825, -0.06832676380872726, 0.18568043410778046, 0.06331707537174225, -0.0931299701333046, -0.030905921012163162, 0.09970146417617798, -0.09113548696041107, -0.046887822449207306, -0.08070824295282364, -0.03535907715559006, -0.07499001920223236, -0.04026223346590996, -0.026953281834721565, -0.022662818431854248, -0.0036090072244405746, -0.04904840141534805, -0.004425022751092911, 0.05221220478415489, -0.03800789266824722, 0.08670022338628769, -0.015227044932544231, 0.013740120455622673, 0.013826626352965832, 0.019904043525457382, -0.03726952150464058, 0.11630033701658249, 0.006022902671247721, -0.011745266616344452, -0.016151389107108116, -0.039521023631095886, -0.09353422373533249, -0.010734571143984795, -0.06173736974596977, -0.03997232764959335, -0.06665437668561935, -0.018825212493538857, -0.03604119271039963, 0.014650348573923111, 0.007174577098339796, -0.005492695141583681, -0.022784363478422165, -0.04440867155790329, -0.08475261926651001, 0.05259048566222191, -0.057627756148576736, -0.022996827960014343, 0.019112827256321907, -0.06418488919734955, 0.12701985239982605, 0.02782292477786541, 0.03282683715224266, 0.06971561908721924, -0.08969046920537949, 0.04883967712521553, 0.02246314287185669, 0.04173773154616356, -0.003675969084724784, -0.04184851422905922, 0.05376233533024788, -0.008620747365057468, -0.03455040603876114, -0.01710248924791813, 0.057678014039993286, -0.10918433964252472, 0.022280653938651085, -0.006361717823892832, -0.011878639459609985, -0.05856290087103844, 0.03102813847362995, 0.044592130929231644, 0.042114049196243286, -0.0004809653910342604, -0.03661031648516655, -0.03574333339929581, -0.13251039385795593, -0.03395730257034302, 0.0034043595660477877, -0.027701249346137047, 0.023336485028266907, -0.0011221562745049596, 0.046815428882837296, 0.06307357549667358, 0.23133158683776855, 0.052087899297475815, -0.022845353931188583, -0.028031932190060616, -0.0016925239469856024, -0.036858897656202316, 0.008623114787042141, 0.04133528470993042, 0.029746176674962044, 0.023234589025378227, -0.050186142325401306, 0.10150711238384247, 0.000019393195543671027, -0.016761228442192078, 0.10302510857582092, 0.03146601840853691, 0.18308310210704803, 0.0500834695994854, 0.002914851764217019, -0.1369452327489853, -0.08165054023265839, 0.10537467151880264, -0.015354220755398273, 0.01229859795421362, -0.06790375709533691, -0.1008516475558281, 0.12801973521709442, -0.1642635613679886, 0.1135232076048851, 0.042607955634593964, -0.06283271312713623, -0.015133158303797245, -0.23219230771064758, 0.019736262038350105, -0.06484928727149963, 0.019918181002140045, -0.09990456700325012, 0.00541356997564435, 0.03216192126274109, 0.009085850790143013, 
-0.04740464687347412, 0.07402867823839188, -0.05713709071278572, -0.11082577705383301, -0.0022887790109962225, 0.041469983756542206, 0.10325047373771667, 0.08577131479978561, -0.0023224875330924988, -0.01359615009278059, 0.09209033846855164, 0.08572148531675339, 0.06755400449037552, 0.0786600187420845, 0.07891560345888138, 0.0187972579151392, -0.00919022224843502, -0.037139520049095154, 0.013472399674355984, -0.056284099817276, 0.23506388068199158, 0.052575163543224335, -0.08643228560686111, 0.03965046629309654, 0.16952084004878998, -0.002636804711073637, -0.03319408744573593, -0.07300601154565811, 0.14263004064559937, 0.044741686433553696, 0.05458270385861397, 0.03129898011684418, -0.099904865026474, -0.049554646015167236, 0.15282540023326874, 0.260729044675827, -0.03467000648379326, -0.04333869367837906, 0.06487523019313812, -0.026855699717998505, 0.055248234421014786, 0.15666760504245758, 0.03779375180602074, 0.3498046398162842, -0.05891825631260872, 0.054109904915094376, 0.028281157836318016, 0.04109678044915199, -0.006099865771830082, 0.09055110812187195, -0.09581069648265839, 0.028880825266242027, -0.04188844561576843, 0.022576026618480682, -0.0948135256767273, -0.259864866733551, 0.07065550982952118, -0.050663452595472336, -0.08559960126876831, 0.011101649142801762, -0.03874317184090614, 0.0006026778137311339, 0.08220051974058151, 0.029012344777584076, 0.02534910850226879, 0.15046626329421997, -0.006156908348202705, -0.13588286936283112, -0.09664924442768097, 0.029754867777228355, 0.02962205559015274, 0.20549553632736206, -0.04281226918101311, 0.053436506539583206, 0.06697642803192139, 0.018641801550984383, -0.16677744686603546, 0.027351319789886475, -0.043661829084157944, -0.02838188223540783, 0.04541391506791115, 0.026350107043981552, 0.0002678664168342948, 0.008667019195854664, 0.027690600603818893, 0.0070258500054478645, 0.04378726705908775, 0.12444482743740082, 0.02477787435054779, -0.12133929878473282, 0.07794646918773651, -0.1120523139834404, 0.12784211337566376, 0.09253107011318207, 0.008581473492085934, -0.0012845139717683196, -0.07561269402503967, 0.033559706062078476, -0.0032173285726457834, 0.1452609747648239, 0.00042674416908994317, -0.15106244385242462, -0.002629638183861971, -0.03258927911520004, 0.0343567319214344, -0.17672500014305115, -0.05770396068692207, 0.05022265389561653, -0.04598896577954292, -0.039423517882823944, 0.11966270953416824, -0.01562744751572609, 0.04791583865880966, -0.05703986436128616, 0.033238355070352554, -0.04456246644258499, 0.03527852147817612, -0.1370314359664917, -0.14570990204811096 ]
null
null
transformers
# GPT2-large trained with data from the National Library of Spain (BNE) ## Table of Contents <details> <summary>Click to expand</summary> - [Overview](#overview) - [Model description](#model-description) - [Intended uses and limitations](#intended-use) - [How to use](#how-to-use) - [Limitations and bias](#limitations-and-bias) - [Training](#training) - [Training data](#training-data) - [Training procedure](#training-procedure) - [Additional Information](#additional-information) - [Author](#author) - [Contact information](#contact-information) - [Copyright](#copyright) - [Licensing information](#licensing-information) - [Funding](#funding) - [Disclaimer](#disclaimer) </details> ## Overview - **Architecture:** gpt2-large - **Language:** Spanish - **Task:** text-generation - **Data:** BNE ## Model description **GPT2-large-bne** is a transformer-based model for the Spanish language. It is based on the [GPT-2](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf) model and has been pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text processed for this work, compiled from the web crawlings performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019. ## Intended uses and limitations You can use the raw model for text generation or fine-tune it to a downstream task. ## How to use Here is how to use this model: You can use this model directly with a pipeline for text generation. Since the generation relies on some randomness, we set a seed for reproducibility: ```python >>> from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline, set_seed >>> tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/gpt2-large-bne") >>> model = AutoModelForCausalLM.from_pretrained("PlanTL-GOB-ES/gpt2-large-bne") >>> generator = pipeline('text-generation', tokenizer=tokenizer, model=model) >>> set_seed(42) >>> generator("La Biblioteca Nacional de España es una entidad pública y sus fines son", num_return_sequences=5) [{'generated_text': 'La Biblioteca Nacional de España es una entidad pública y sus fines son servir como herramienta básica en la difusión de la cultura. '}, {'generated_text': 'La Biblioteca Nacional de España es una entidad pública y sus fines son el desarrollo de la educación, la cultura y el conocimiento, promoviendo actividades a través de Internet con la información que recibe del acceso a los fondos que en ella se almacenan. '}, {'generated_text': 'La Biblioteca Nacional de España es una entidad pública y sus fines son la publicación y difusión cultural. '}, {'generated_text': 'La Biblioteca Nacional de España es una entidad pública y sus fines son preservar y difundir los fondos y colecciones de la Biblioteca Nacional, así como servir de punto de encuentro para toda la comunidad científica, la academia y para la sociedad civil. 
'}, {'generated_text': 'La Biblioteca Nacional de España es una entidad pública y sus fines son la conservación, estudio y difusión del Patrimonio Bibliográfico en cualquiera de sus formas así como la formación y perfeccionamiento de los especialistas e investigadores en el campo de la información y de las bibliotecas.'}] ``` Here is how to use this model to get the features of a given text in PyTorch: ```python >>> from transformers import AutoTokenizer, GPT2Model >>> tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/gpt2-large-bne") >>> model = GPT2Model.from_pretrained("PlanTL-GOB-ES/gpt2-large-bne") >>> text = "La Biblioteca Nacional de España es una entidad pública y sus fines son" >>> encoded_input = tokenizer(text, return_tensors='pt') >>> output = model(**encoded_input) >>> print(output.last_hidden_state.shape) torch.Size([1, 14, 1280]) ``` ## Limitations and bias At the time of submission, no measures have been taken to estimate the bias and toxicity embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated. Nevertheless, here's an example of how the model can have biased predictions: ```python >>> from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline, set_seed >>> tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/gpt2-large-bne") >>> model = AutoModelForCausalLM.from_pretrained("PlanTL-GOB-ES/gpt2-large-bne") >>> generator = pipeline('text-generation', tokenizer=tokenizer, model=model) >>> set_seed(42) >>> generator("El hombre se dedica a", num_return_sequences=5) [{'generated_text': 'El hombre se dedica a comprar móviles a sus padres, pero les paga por ellos y luego les devuelve la pasta a ella. '}, {'generated_text': 'El hombre se dedica a la venta ambulante ilegal en la zona de la Alameda, con puestos del rastro callejero o de supermercados a los que luego roba. '}, {'generated_text': 'El hombre se dedica a la venta ambulante en el Paseo de Melilla. '}, {'generated_text': 'El hombre se dedica a los tatuajes y los dibujos en el cuerpo con su apariencia física y no da a basto en las tareas domésticas. '}, {'generated_text': 'El hombre se dedica a la caza indiscriminada de animales. '}] >>> set_seed(42) >>> generator("La mujer se dedica a", num_return_sequences=5) [{'generated_text': 'La mujer se dedica a comprar móviles a sus padres, pero les paga por ellos y luego no paga la factura." '}, {'generated_text': 'La mujer se dedica a la venta ambulante y su pareja vende cupones en el mercadillo navideño. '}, {'generated_text': 'La mujer se dedica a la venta al por mayor de perfumes, cosmética, complementos, y otros bienes de consumo. '}, {'generated_text': 'La mujer se dedica a los servicios sexuales y se aprovecha de los servicios religiosos. '}, {'generated_text': 'La mujer se dedica a la prostitución y tiene dos hijas del matrimonio y la propia familia de la víctima. '}] ``` ## Training ### Training data The [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019. 
To obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including, among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication is then applied across the corpus, resulting in 570GB of text.

Some of the statistics of the corpus:

| Corpora | Number of documents | Number of tokens | Size (GB) |
|---------|---------------------|------------------|-----------|
| BNE     | 201,080,084         | 135,733,450,668  | 570GB     |

### Training procedure
The pretraining objective used for this architecture is next token prediction.
The configuration of the **GPT2-large-bne** model is as follows:
- gpt2-large: 36-layer, 1280-hidden, 20-heads, 774M parameters.

The training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original [GPT-2](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf) model with a vocabulary size of 50,262 tokens.

The GPT2-large-bne pre-training consists of an autoregressive language model training that follows the approach of GPT-2.

The training lasted a total of 10 days with 32 computing nodes, each with 4 NVIDIA V100 GPUs of 16GB VRAM.

## Additional information

### Author
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected])

### Contact information
For further information, send an email to <[email protected]>

### Copyright
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)

### Licensing information
This work is licensed under an [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0)

### Funding
This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.

### Citation information
If you use this model, please cite our [paper](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6405):
```
@article{,
  abstract = {We want to thank the National Library of Spain for such a large effort on the data gathering and the Future of Computing Center, a Barcelona Supercomputing Center and IBM initiative (2020). This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.},
  author = {Asier Gutiérrez Fandiño and Jordi Armengol Estapé and Marc Pàmies and Joan Llop Palao and Joaquin Silveira Ocampo and Casimiro Pio Carrino and Carme Armentano Oller and Carlos Rodriguez Penagos and Aitor Gonzalez Agirre and Marta Villegas},
  doi = {10.26342/2022-68-3},
  issn = {1135-5948},
  journal = {Procesamiento del Lenguaje Natural},
  keywords = {Artificial intelligence,Benchmarking,Data processing.,MarIA,Natural language processing,Spanish language modelling,Spanish language resources,Tractament del llenguatge natural (Informàtica),Àrees temàtiques de la UPC::Informàtica::Intel·ligència artificial::Llenguatge natural},
  publisher = {Sociedad Española para el Procesamiento del Lenguaje Natural},
  title = {MarIA: Spanish Language Models},
  volume = {68},
  url = {https://upcommons.upc.edu/handle/2117/367156#.YyMTB4X9A-0.mendeley},
  year = {2022},
}
```

### Disclaimer
<details>
<summary>Click to expand</summary>

The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.

When third parties deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.

In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.

Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.

Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.

En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
</details>
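The card notes that the raw model can be fine-tuned on a downstream task. The following is only a hedged sketch of what a minimal causal-LM fine-tuning setup with the Transformers `Trainer` might look like; the toy dataset, output directory, and hyperparameters are placeholder assumptions, not recommendations from the model authors:

```python
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForCausalLM,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

model_name = "PlanTL-GOB-ES/gpt2-large-bne"
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 tokenizers have no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# Toy in-memory dataset; replace with your own downstream corpus.
raw = Dataset.from_dict({"text": [
    "La Biblioteca Nacional de España conserva el patrimonio bibliográfico.",
    "El modelo del lenguaje GPT es capaz de generar texto en español.",
]})
tokenized = raw.map(lambda x: tokenizer(x["text"], truncation=True, max_length=128),
                    remove_columns=["text"])

args = TrainingArguments(output_dir="gpt2-large-bne-finetuned",
                         per_device_train_batch_size=1,
                         num_train_epochs=1,
                         logging_steps=1)

trainer = Trainer(model=model,
                  args=args,
                  train_dataset=tokenized,
                  data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False))
trainer.train()
```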
{"language": ["es"], "license": "apache-2.0", "tags": ["national library of spain", "spanish", "bne", "gpt2-large-bne"], "datasets": ["bne"], "widget": [{"text": "El modelo del lenguaje GPT es capaz de"}, {"text": "La Biblioteca Nacional de Espa\u00f1a es una entidad p\u00fablica y sus fines son"}]}
text-generation
PlanTL-GOB-ES/gpt2-large-bne
[ "transformers", "pytorch", "gpt2", "text-generation", "national library of spain", "spanish", "bne", "gpt2-large-bne", "es", "dataset:bne", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "es" ]
TAGS #transformers #pytorch #gpt2 #text-generation #national library of spain #spanish #bne #gpt2-large-bne #es #dataset-bne #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
GPT2-large trained with data from the National Library of Spain (BNE) ===================================================================== Table of Contents ----------------- Click to expand * Overview * Model description * Intended uses and limitations * How to use * Limitations and bias * Training + Training data + Training procedure * Additional Information + Author + Contact information + Copyright + Licensing information + Funding + Disclaimer Overview -------- * Architecture: gpt2-large * Language: Spanish * Task: text-generation * Data: BNE Model description ----------------- GPT2-large-bne is a transformer-based model for the Spanish language. It is based on the GPT-2 model and has been pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text processed for this work, compiled from the web crawlings performed by the National Library of Spain (Biblioteca Nacional de España) from 2009 to 2019. Intended uses and limitations ----------------------------- You can use the raw model for text generation or fine-tune it to a downstream task. How to use ---------- Here is how to use this model: You can use this model directly with a pipeline for text generation. Since the generation relies on some randomness, we set a seed for reproducibility: Here is how to use this model to get the features of a given text in PyTorch: Limitations and bias -------------------- At the time of submission, no measures have been taken to estimate the bias and toxicity embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated. Nevertheless, here's an example of how the model can have biased predictions: Training -------- ### Training data The National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019. To obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text. Some of the statistics of the corpus: ### Training procedure The pretraining objective used for this architecture is next token prediction. The configuration of the GPT2-large-bne model is as follows: * gpt2-large: 36-layer, 1280-hidden, 20-heads, 774M parameters. The training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original GPT-2 model with a vocabulary size of 50,262 tokens. The GPT2-large-bne pre-training consists of an autoregressive language model training that follows the approach of the GPT-2. The training lasted a total of 10 days with 32 computing nodes each one with 4 NVIDIA V100 GPUs of 16GB VRAM. 
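The "How to use" passage above refers to a text-generation pipeline with a fixed seed and to feature extraction in PyTorch, but the actual snippets were stripped from this processed text. Below is a hedged reconstruction of that usual pattern, not the original code: the prompt is taken from this card's widget example, and the generation arguments are illustrative assumptions.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline, set_seed

model_id = "PlanTL-GOB-ES/gpt2-large-bne"

# Text generation through a pipeline; a fixed seed makes the sampling reproducible.
set_seed(42)
generator = pipeline("text-generation", model=model_id)
print(generator("El modelo del lenguaje GPT es capaz de",
                max_length=30, num_return_sequences=2, do_sample=True))

# Extracting hidden-state features for a sentence in PyTorch.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
inputs = tokenizer("La Biblioteca Nacional de España es una entidad pública.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)
features = outputs.hidden_states[-1]  # shape: (batch, sequence_length, 1280)
```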
Additional information ---------------------- ### Author Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL) ### Contact information For further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL) ### Copyright Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022) ### Licensing information This work is licensed under a Apache License, Version 2.0 ### Funding This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL. information If you use this model, please cite our paper: ### Disclaimer Click to expand The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence. In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models. Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables. Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial. En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
[ "### Training data\n\n\nThe National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.\n\n\nTo obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text.\n\n\nSome of the statistics of the corpus:", "### Training procedure\n\n\nThe pretraining objective used for this architecture is next token prediction.\nThe configuration of the GPT2-large-bne model is as follows:\n\n\n* gpt2-large: 36-layer, 1280-hidden, 20-heads, 774M parameters.\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original GPT-2 model with a vocabulary size of 50,262 tokens.\n\n\nThe GPT2-large-bne pre-training consists of an autoregressive language model training that follows the approach of the GPT-2.\n\n\nThe training lasted a total of 10 days with 32 computing nodes each one with 4 NVIDIA V100 GPUs of 16GB VRAM.\n\n\nAdditional information\n----------------------", "### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)", "### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)", "### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)", "### Licensing information\n\n\nThis work is licensed under a Apache License, Version 2.0", "### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\ninformation\nIf you use this model, please cite our paper:", "### Disclaimer\n\n\n\nClick to expand\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. 
Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos." ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #national library of spain #spanish #bne #gpt2-large-bne #es #dataset-bne #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n", "### Training data\n\n\nThe National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.\n\n\nTo obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text.\n\n\nSome of the statistics of the corpus:", "### Training procedure\n\n\nThe pretraining objective used for this architecture is next token prediction.\nThe configuration of the GPT2-large-bne model is as follows:\n\n\n* gpt2-large: 36-layer, 1280-hidden, 20-heads, 774M parameters.\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original GPT-2 model with a vocabulary size of 50,262 tokens.\n\n\nThe GPT2-large-bne pre-training consists of an autoregressive language model training that follows the approach of the GPT-2.\n\n\nThe training lasted a total of 10 days with 32 computing nodes each one with 4 NVIDIA V100 GPUs of 16GB VRAM.\n\n\nAdditional information\n----------------------", "### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)", "### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)", "### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)", "### Licensing information\n\n\nThis work is licensed under a Apache License, Version 2.0", "### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\ninformation\nIf you use this model, please cite our paper:", "### Disclaimer\n\n\n\nClick to expand\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. 
Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos." ]
[ 89, 160, 181, 28, 37, 22, 19, 46, 364 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #national library of spain #spanish #bne #gpt2-large-bne #es #dataset-bne #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n### Training data\n\n\nThe National Library of Spain (Biblioteca Nacional de España) crawls all .es domains once a year. The training corpus consists of 59TB of WARC files from these crawls, carried out from 2009 to 2019.\n\n\nTo obtain a high-quality training corpus, the corpus has been preprocessed with a pipeline of operations, including among others, sentence splitting, language detection, filtering of bad-formed sentences, and deduplication of repetitive contents. During the process, document boundaries are kept. This resulted in 2TB of Spanish clean corpus. Further global deduplication among the corpus is applied, resulting in 570GB of text.\n\n\nSome of the statistics of the corpus:### Training procedure\n\n\nThe pretraining objective used for this architecture is next token prediction.\nThe configuration of the GPT2-large-bne model is as follows:\n\n\n* gpt2-large: 36-layer, 1280-hidden, 20-heads, 774M parameters.\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original GPT-2 model with a vocabulary size of 50,262 tokens.\n\n\nThe GPT2-large-bne pre-training consists of an autoregressive language model training that follows the approach of the GPT-2.\n\n\nThe training lasted a total of 10 days with 32 computing nodes each one with 4 NVIDIA V100 GPUs of 16GB VRAM.\n\n\nAdditional information\n----------------------### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)" ]
[ -0.05026979744434357, 0.15240886807441711, -0.004754639696329832, 0.09374962747097015, 0.11760669946670532, 0.04528358578681946, 0.03753715381026268, 0.1193586140871048, -0.042925458401441574, 0.07635702192783356, 0.01811828464269638, 0.04039091616868973, 0.07795456051826477, 0.07999788969755173, 0.1514284461736679, -0.21235275268554688, 0.016294049099087715, -0.11296916007995605, -0.07936906069517136, 0.010583272203803062, 0.06403569877147675, -0.03537118807435036, 0.08925394713878632, -0.04650704562664032, -0.027486778795719147, -0.002194220433011651, -0.0388156920671463, -0.0708073228597641, 0.11637275665998459, 0.09731104969978333, -0.006939675658941269, -0.03715161979198456, 0.05769982561469078, -0.11786248534917831, 0.005475328303873539, 0.10425512492656708, -0.045270826667547226, 0.046432819217443466, 0.0945645123720169, -0.016584614291787148, 0.13885651528835297, -0.08420762419700623, 0.005866005085408688, 0.040946897119283676, -0.1575048416852951, -0.1873933970928192, -0.10011454671621323, -0.001884165802039206, 0.013388526625931263, 0.011919278651475906, 0.007144481875002384, -0.0018442973960191011, 0.0016622240655124187, -0.0014766913373023272, 0.09067302942276001, -0.25493496656417847, -0.0032775565050542355, 0.12538759410381317, 0.003769990522414446, 0.10894905775785446, -0.04621101915836334, 0.025627899914979935, 0.01501783449202776, 0.038411837071180344, -0.011169572360813618, -0.010334620252251625, -0.016448650509119034, -0.006121733691543341, -0.06262379139661789, -0.14392919838428497, 0.1218208447098732, -0.041175179183483124, -0.0938703790307045, -0.0761488750576973, -0.04207789897918701, -0.03352360054850578, 0.058762021362781525, -0.008198241703212261, 0.03814355283975601, 0.017395634204149246, 0.06870134174823761, -0.09055260568857193, -0.11561717092990875, -0.07303771376609802, -0.11909908056259155, 0.10843811929225922, 0.04960593208670616, 0.041190482676029205, 0.03760916367173195, 0.17665088176727295, -0.017658088356256485, -0.03189932554960251, -0.039115648716688156, -0.026377659291028976, -0.08116278797388077, 0.04076676070690155, -0.005317381117492914, -0.09282458573579788, -0.027729075402021408, 0.10305260121822357, -0.0479859933257103, 0.007297519128769636, 0.07893400639295578, 0.002753780223429203, 0.07937119156122208, 0.04622136428952217, -0.07548920065164566, -0.028651412576436996, 0.004553073551505804, -0.08950167149305344, 0.015081057325005531, -0.01813146285712719, -0.07086644321680069, -0.0403669998049736, -0.003037277841940522, 0.061633653938770294, 0.0253842044621706, 0.019029710441827774, 0.006012760568410158, -0.06154770404100418, 0.21870532631874084, -0.059633299708366394, -0.0013394581619650126, -0.00800869520753622, -0.09043097496032715, -0.09523726999759674, 0.043397076427936554, -0.0537324883043766, -0.07574230432510376, 0.0022093909792602062, -0.07519847899675369, -0.08759140223264694, -0.09246761351823807, -0.09007540345191956, 0.026203028857707977, -0.09000450372695923, -0.051088232547044754, -0.13428714871406555, -0.18250447511672974, -0.005941063165664673, 0.03960657864809036, -0.06459125131368637, 0.04660387337207794, -0.00889066606760025, 0.01832687109708786, 0.0033121665474027395, -0.03916528820991516, 0.08431203663349152, -0.05470328778028488, 0.04050203040242195, -0.018280712887644768, 0.05299031361937523, -0.035124752670526505, -0.003975709900259972, -0.04576023295521736, 0.002064889995381236, -0.10821277648210526, 0.09826374053955078, -0.033363230526447296, -0.02061241865158081, -0.09925807267427444, -0.036947380751371384, 
-0.06266870349645615, 0.047815125435590744, 0.03540195897221565, 0.1171427071094513, -0.12889046967029572, -0.04098908230662346, 0.21616090834140778, -0.056696567684412, -0.0031496162991970778, 0.09931755065917969, -0.01352094579488039, 0.06239256635308266, 0.10105712711811066, 0.08326258510351181, 0.11347555369138718, -0.09135506302118301, -0.023912392556667328, -0.011497369967401028, -0.08933661133050919, 0.000625383632723242, 0.059309717267751694, -0.07570123672485352, 0.07635481655597687, 0.07178405672311783, -0.022820468991994858, 0.042175307869911194, -0.011942549608647823, -0.018129954114556313, 0.05134507268667221, -0.014732899144291878, -0.015064278617501259, -0.040061790496110916, 0.00826692022383213, -0.009762777015566826, -0.05006178095936775, -0.05671205744147301, 0.06873796135187149, -0.04099150747060776, 0.021022522822022438, -0.026228807866573334, 0.005900878459215164, -0.06449968367815018, 0.023322036489844322, -0.10131191462278366, -0.1258334517478943, 0.06490007787942886, -0.10556621849536896, 0.1083010584115982, 0.04416411742568016, 0.063385009765625, 0.1007079929113388, -0.031051253899931908, -0.011353219859302044, -0.004655081313103437, -0.08585004508495331, -0.030294764786958694, -0.06023213267326355, -0.012046873569488525, -0.034507136791944504, 0.0009160980116575956, -0.06027218699455261, 0.01682218536734581, 0.030751517042517662, 0.04273439198732376, -0.014285827986896038, -0.06422831118106842, 0.009621569886803627, -0.03539396822452545, -0.01893094927072525, -0.09039697796106339, 0.038445647805929184, 0.006914676167070866, 0.032953571528196335, 0.03771747648715973, -0.15204186737537384, -0.005257452838122845, 0.06425856053829193, 0.12880119681358337, -0.03661288321018219, -0.037189796566963196, -0.03601720184087753, -0.013291276060044765, -0.0321035236120224, -0.04961700364947319, 0.2690207064151764, -0.019961638376116753, 0.08302948623895645, -0.09881285578012466, -0.03543972223997116, 0.052641138434410095, -0.01743876188993454, -0.03906587138772011, 0.03417874872684479, 0.0045016189105808735, -0.058909349143505096, 0.06213942542672157, -0.09288586676120758, 0.0712745413184166, 0.20449155569076538, 0.09453835338354111, -0.06793326884508133, -0.047447819262742996, 0.03729024901986122, 0.04765233024954796, 0.1032615602016449, -0.04539289325475693, -0.009107470512390137, 0.03113236278295517, 0.10274834930896759, 0.12066806107759476, -0.11302012950181961, 0.02768796868622303, -0.010090185329318047, -0.08788010478019714, 0.08780041337013245, 0.013279051519930363, -0.07038235664367676, 0.09008995443582535, 0.058068204671144485, 0.08419431000947952, -0.0038086925633251667, -0.027206633239984512, -0.06931670010089874, 0.17230384051799774, -0.15323953330516815, -0.3173302412033081, -0.2302337884902954, 0.008044262416660786, -0.10142426192760468, 0.09318889677524567, 0.025297999382019043, -0.11600185185670853, -0.06721177697181702, -0.04976143315434456, 0.11407486349344254, -0.036185696721076965, -0.0030511105433106422, -0.12458253651857376, 0.0444469228386879, -0.028587890788912773, -0.1747952103614807, 0.014431859366595745, -0.009389265440404415, -0.16179810464382172, 0.008823168464004993, 0.03074600361287594, 0.0911569595336914, 0.0554170086979866, 0.03930060192942619, -0.027982857078313828, -0.014277813956141472, 0.12376172840595245, -0.06888400763273239, 0.03922753036022186, 0.14183920621871948, 0.08402015268802643, 0.02945121005177498, 0.028721366077661514, -0.007622412405908108, -0.10333484411239624, 0.03463960438966751, 0.00623211357742548, -0.06632915884256363, 
-0.23069877922534943, -0.0906851515173912, -0.026875874027609825, -0.01551577728241682, 0.07705049961805344, 0.09561469405889511, 0.07642550021409988, 0.03181770443916321, -0.04838837310671806, 0.045745301991701126, 0.03165370970964432, 0.09402169287204742, 0.008785075508058071, -0.022845249623060226, 0.013153762556612492, -0.07873545587062836, -0.008216969668865204, 0.15685032308101654, 0.16704922914505005, 0.22932471334934235, -0.08165797591209412, 0.1498299539089203, 0.0375872403383255, 0.037944477051496506, 0.010832840576767921, 0.07755875587463379, 0.0022515184246003628, 0.028329994529485703, -0.06673946231603622, -0.02172650210559368, -0.10393386334180832, 0.04308485984802246, 0.043696142733097076, -0.07492844015359879, -0.040243975818157196, -0.11559414118528366, 0.06109510734677315, 0.04598844796419144, -0.031397443264722824, -0.1927647441625595, -0.1042882576584816, 0.04663492739200592, -0.007087984587997198, -0.11391914635896683, -0.024462120607495308, 0.1337338089942932, -0.13976871967315674, 0.006867147516459227, -0.026141762733459473, 0.037919044494628906, -0.18938015401363373, -0.01238720677793026, 0.012461823411285877, 0.11801896244287491, 0.0030352568719536066, 0.09101366251707077, -0.12017392367124557, 0.10807110369205475, 0.01714480295777321, 0.05746617168188095, -0.0823751762509346, 0.04102868214249611, -0.03917061164975166, -0.11186310648918152, 0.11699487268924713, 0.03560064360499382, -0.021531162783503532, -0.024013711139559746, -0.07581046223640442, 0.03435002639889717, 0.08614872395992279, -0.0907459706068039, 0.06026410311460495, 0.040992774069309235, 0.01450326293706894, -0.05050196871161461, -0.023097027093172073, -0.051403552293777466, -0.15403296053409576, 0.06171563267707825, -0.12189815938472748, 0.029624534770846367, -0.06475908309221268, -0.0025620642118155956, -0.10576137155294418, 0.19262541830539703, -0.17458392679691315, -0.047256484627723694, -0.08645449578762054, 0.023350711911916733, 0.10564656555652618, -0.06043200194835663, 0.06306841969490051, -0.021324483677744865, 0.0285699013620615, -0.023403190076351166, -0.07943373173475266, 0.057228971272706985, -0.04978909716010094, -0.09160475432872772, -0.07464064657688141, 0.09053550660610199, 0.090077243745327, -0.009845406748354435, -0.008522865362465382, -0.029212936758995056, 0.01948988437652588, -0.12410266697406769, 0.021159010007977486, 0.16527226567268372, 0.07871589064598083, 0.04980763792991638, -0.06183752045035362, -0.07427098602056503, -0.011330800130963326, -0.052690036594867706, 0.09408123046159744, 0.23251569271087646, -0.022182229906320572, 0.16205735504627228, 0.15977273881435394, -0.13695460557937622, -0.18183641135692596, -0.03443370386958122, 0.05984801799058914, 0.025742048397660255, -0.03786369785666466, -0.2433372586965561, -0.036800939589738846, 0.1281345933675766, 0.0019975006580352783, 0.10516589134931564, -0.2763316333293915, -0.10147873312234879, -0.017202842980623245, 0.06591801345348358, 0.15161268413066864, -0.09262897819280624, -0.05164407193660736, -0.053866416215896606, -0.09487462043762207, 0.15273776650428772, 0.01107505802065134, 0.11773620545864105, -0.00695409718900919, -0.016527408733963966, 0.020731810480356216, -0.04223811998963356, 0.17360225319862366, 0.04859995096921921, 0.07670540362596512, -0.040081508457660675, 0.10034530609846115, 0.21148426830768585, -0.00013196180225349963, 0.09071415662765503, 0.0829203724861145, 0.01943316124379635, -0.13395890593528748, -0.06255576759576797, -0.06658486276865005, 0.015447013080120087, -0.017148440703749657, 
-0.06998321413993835, -0.04059137776494026, 0.10058964043855667, 0.055061422288417816, -0.0005208924412727356, -0.07356186211109161, -0.011084317229688168, -0.029779896140098572, 0.1045878455042839, 0.04119761660695076, 0.038592834025621414, -0.058926235884428024, -0.013361210934817791, 0.008334682323038578, 0.03475499898195267, -0.10666430741548538, 0.0032878911588340998, 0.0770869106054306, -0.04570889472961426, 0.052641261368989944, 0.019342558458447456, -0.16680943965911865, 0.03833845257759094, 0.13086044788360596, -0.09790503233671188, -0.07832977175712585, 0.02655072696506977, -0.06347502768039703, -0.005208979826420546, 0.0671858862042427, 0.15092049539089203, 0.03615264594554901, -0.03964695706963539, -0.01804320700466633, 0.045854583382606506, -0.03421584889292717, 0.11989205330610275, -0.001304465695284307, -0.01183212362229824, -0.058457233011722565, 0.19361023604869843, 0.06441062688827515, -0.11138270050287247, -0.019721589982509613, 0.10903993993997574, -0.08761528134346008, -0.04642530530691147, -0.08072925359010696, -0.026592610403895378, -0.09877883642911911, -0.04034333676099777, -0.020087048411369324, 0.004609501920640469, 0.006410811096429825, -0.05950980633497238, -0.009091655723750591, 0.06415051221847534, -0.033305563032627106, 0.09208982437849045, -0.006715934257954359, -0.008390000090003014, 0.009710900485515594, 0.007626314647495747, -0.05305904522538185, 0.13482804596424103, 0.009435809217393398, -0.016197793185710907, -0.013943426311016083, -0.04290316253900528, -0.08921193331480026, -0.015448306687176228, -0.025794275104999542, -0.04422976076602936, -0.05628831312060356, -0.019809629768133163, -0.036911871284246445, 0.016907012090086937, 0.013840842060744762, -0.004905350971966982, -0.019307412207126617, -0.04344499856233597, -0.08462762832641602, 0.0647195354104042, -0.06234291195869446, -0.02903350442647934, 0.013780606910586357, -0.05839891359210014, 0.13442565500736237, 0.04046367108821869, 0.03459004685282707, 0.07112381607294083, -0.09658529609441757, 0.0618610642850399, 0.03082975558936596, 0.040365491062402725, -0.009687489829957485, -0.04290330410003662, 0.042950790375471115, -0.006840602960437536, -0.0261110607534647, -0.008003785274922848, 0.04905006289482117, -0.09439919888973236, 0.020928574725985527, -0.0021285649854689837, -0.012479688040912151, -0.06588495522737503, 0.027966057881712914, 0.05966849625110626, 0.04301411285996437, 0.00790236797183752, -0.04585111513733864, -0.033597566187381744, -0.12134940922260284, -0.03181019797921181, 0.005907449405640364, -0.016896700486540794, 0.011873128823935986, -0.01258335541933775, 0.042770400643348694, 0.058457206934690475, 0.22921238839626312, 0.05062779784202576, 0.0001075022664736025, -0.03130367398262024, 0.0065985615365207195, -0.036949027329683304, 0.00867715198546648, 0.046467240899801254, 0.040320638567209244, 0.025336211547255516, -0.05956758186221123, 0.07925805449485779, 0.0000031253982797352364, -0.004427515901625156, 0.09736698120832443, 0.014617347158491611, 0.19245228171348572, 0.04751111567020416, 0.004350693896412849, -0.13856011629104614, -0.07183639705181122, 0.10156144201755524, -0.022666912525892258, 0.011818441562354565, -0.064702607691288, -0.11568935215473175, 0.12799938023090363, -0.16723300516605377, 0.10469035059213638, 0.04562909156084061, -0.05879883095622063, -0.014473604038357735, -0.24310828745365143, 0.018791737034916878, -0.06354428082704544, 0.030509455129504204, -0.09121465682983398, 0.04498932510614395, 0.009795056656002998, 0.010578629560768604, 
-0.03943028301000595, 0.06587274372577667, -0.09002408385276794, -0.1153038889169693, -0.00959704164415598, 0.03407140448689461, 0.10326044261455536, 0.08050142973661423, 0.01842876709997654, -0.012830497696995735, 0.08745047450065613, 0.08290690928697586, 0.07057639211416245, 0.09605112671852112, 0.07353438436985016, 0.004414842929691076, -0.013833468779921532, -0.03952214866876602, 0.014385691843926907, -0.04901036247611046, 0.21532027423381805, 0.0650462806224823, -0.08633432537317276, 0.042484693229198456, 0.18189619481563568, -0.006134168244898319, -0.03914229944348335, -0.07461608946323395, 0.13681554794311523, 0.04719873517751694, 0.0676741823554039, 0.02261306717991829, -0.09099218249320984, -0.05424122139811516, 0.13381427526474, 0.24695686995983124, -0.04127691686153412, -0.04021850973367691, 0.05354335159063339, -0.023088209331035614, 0.06793387979269028, 0.1569834053516388, 0.022853396832942963, 0.36508503556251526, -0.0618683323264122, 0.059076108038425446, 0.02595043182373047, 0.053651824593544006, -0.0017844088142737746, 0.13391225039958954, -0.1142493337392807, 0.02624138817191124, -0.029387176036834717, 0.03356747329235077, -0.10501065105199814, -0.26164862513542175, 0.048118144273757935, -0.055557504296302795, -0.07700920104980469, 0.04114442318677902, -0.02247839793562889, 0.0030550456140190363, 0.09402500092983246, 0.03555808588862419, 0.018203966319561005, 0.15079498291015625, 0.011499341577291489, -0.1548863798379898, -0.09234704077243805, 0.023935982957482338, 0.034720391035079956, 0.19323106110095978, -0.03203394263982773, 0.07244793325662613, 0.06344138830900192, -0.006799878552556038, -0.16674014925956726, 0.02139546163380146, -0.04055069759488106, -0.023134011775255203, 0.051267631351947784, 0.027071679010987282, 0.004884720779955387, -0.01097396481782198, 0.024828944355249405, -0.0033785991836339235, 0.04758553206920624, 0.15359841287136078, 0.043309420347213745, -0.11366364359855652, 0.07311384379863739, -0.10923103243112564, 0.125111922621727, 0.07274945080280304, 0.010633735917508602, 0.005115963984280825, -0.07066251337528229, 0.035094309598207474, -0.009397318586707115, 0.14131595194339752, 0.009368146769702435, -0.15229415893554688, -0.010159177705645561, -0.07284726202487946, 0.038870569318532944, -0.17497915029525757, -0.061864983290433884, 0.04810395464301109, -0.043329354375600815, -0.04160218685865402, 0.11221959441900253, -0.0134041216224432, 0.04091164097189903, -0.05248963460326195, 0.043871987611055374, -0.03225832059979439, 0.035789791494607925, -0.12910065054893494, -0.13859213888645172 ]
null
null
transformers
# Biomedical-clinical language model for Spanish ## Table of contents <details> <summary>Click to expand</summary> - [Model description](#model-description) - [Intended uses and limitations](#intended-use) - [How to use](#how-to-use) - [Limitations and bias](#limitations-and-bias) - [Training](#training) - [Evaluation](#evaluation) - [Additional information](#additional-information) - [Author](#author) - [Contact information](#contact-information) - [Copyright](#copyright) - [Licensing information](#licensing-information) - [Funding](#funding) - [Citation information](#citation-information) - [Disclaimer](#disclaimer) </details> ## Model description Biomedical pretrained language model for Spanish. This model is a [RoBERTa-based](https://github.com/pytorch/fairseq/tree/master/examples/roberta) model trained on a **biomedical-clinical** corpus in Spanish collected from several sources. ## Intended uses and limitations The model is ready-to-use only for masked language modelling to perform the Fill Mask task (try the inference API or read the next section). However, it is intended to be fine-tuned on downstream tasks such as Named Entity Recognition or Text Classification. ## How to use ```python from transformers import AutoTokenizer, AutoModelForMaskedLM tokenizer = AutoTokenizer.from_pretrained("BSC-TeMU/roberta-base-biomedical-es") model = AutoModelForMaskedLM.from_pretrained("BSC-TeMU/roberta-base-biomedical-es") from transformers import pipeline unmasker = pipeline('fill-mask', model="BSC-TeMU/roberta-base-biomedical-es") unmasker("El único antecedente personal a reseñar era la <mask> arterial.") ``` ``` # Output [ { "sequence": " El único antecedente personal a reseñar era la hipertensión arterial.", "score": 0.9855039715766907, "token": 3529, "token_str": " hipertensión" }, { "sequence": " El único antecedente personal a reseñar era la diabetes arterial.", "score": 0.0039140828885138035, "token": 1945, "token_str": " diabetes" }, { "sequence": " El único antecedente personal a reseñar era la hipotensión arterial.", "score": 0.002484665485098958, "token": 11483, "token_str": " hipotensión" }, { "sequence": " El único antecedente personal a reseñar era la Hipertensión arterial.", "score": 0.0023484621196985245, "token": 12238, "token_str": " Hipertensión" }, { "sequence": " El único antecedente personal a reseñar era la presión arterial.", "score": 0.0008009297889657319, "token": 2267, "token_str": " presión" } ] ``` ## Limitations and bias At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated. ## Training The training corpus has been tokenized using a byte version of [Byte-Pair Encoding (BPE)](https://github.com/openai/gpt-2) used in the original [RoBERTA](https://github.com/pytorch/fairseq/tree/master/examples/roberta) model with a vocabulary size of 52,000 tokens. The pretraining consists of a masked language model training at the subword level following the approach employed for the RoBERTa base model with the same hyperparameters as in the original work. The training lasted a total of 48 hours with 16 NVIDIA V100 GPUs of 16GB DDRAM, using Adam optimizer with a peak learning rate of 0.0005 and an effective batch size of 2,048 sentences. 
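As a concrete illustration of the masked-language-model objective described in this Training section, here is a minimal sketch of RoBERTa-style dynamic masking with the published tokenizer. It is a sketch only: the Hub id comes from this card's metadata, and the 15% masking probability is RoBERTa's usual value, which the card does not state explicitly.

```python
from transformers import AutoTokenizer, DataCollatorForLanguageModeling

tokenizer = AutoTokenizer.from_pretrained("PlanTL-GOB-ES/roberta-base-biomedical-clinical-es")
print(len(tokenizer))  # BPE vocabulary; the card reports 52,000 tokens

# Dynamic masking as used for RoBERTa-style pretraining (the 15% rate is an assumption here).
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15)
features = [tokenizer("El paciente presenta hipertensión arterial.")]
batch = collator(features)

print(batch["input_ids"][0])  # some positions replaced by tokenizer.mask_token_id
print(batch["labels"][0])     # -100 everywhere except at the masked positions
```

During pretraining the model is optimized to recover the masked tokens, which is why the ready-to-use checkpoint only supports the Fill Mask task out of the box.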
The training corpus is composed of several biomedical corpora in Spanish, collected from publicly available corpora and crawlers, and a real-world clinical corpus collected from more than 278K clinical documents and notes. To obtain a high-quality training corpus while retaining the idiosyncrasies of the clinical language, a cleaning pipeline has been applied only to the biomedical corpora, keeping the clinical corpus uncleaned. Essentially, the cleaning operations used are: - data parsing in different formats - sentence splitting - language detection - filtering of ill-formed sentences - deduplication of repetitive contents - keep the original document boundaries Then, the biomedical corpora are concatenated and further global deduplication among the biomedical corpora have been applied. Eventually, the clinical corpus is concatenated to the cleaned biomedical corpus resulting in a medium-size biomedical-clinical corpus for Spanish composed of more than 1B tokens. The table below shows some basic statistics of the individual cleaned corpora: | Name | No. tokens | Description | |-----------------------------------------------------------------------------------------|-------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | [Medical crawler](https://zenodo.org/record/4561970) | 745,705,946 | Crawler of more than 3,000 URLs belonging to Spanish biomedical and health domains. | | Clinical cases misc. | 102,855,267 | A miscellany of medical content, essentially clinical cases. Note that a clinical case report is a scientific publication where medical practitioners share patient cases and it is different from a clinical note or document. | | Clinical notes/documents | 91,250,080 | Collection of more than 278K clinical documents, including discharge reports, clinical course notes and X-ray reports, for a total of 91M tokens. | | [Scielo](https://github.com/PlanTL-SANIDAD/SciELO-Spain-Crawler) | 60,007,289 | Publications written in Spanish crawled from the Spanish SciELO server in 2017. | | [BARR2_background](https://temu.bsc.es/BARR2/downloads/background_set.raw_text.tar.bz2) | 24,516,442 | Biomedical Abbreviation Recognition and Resolution (BARR2) containing Spanish clinical case study sections from a variety of clinical disciplines. | | Wikipedia_life_sciences | 13,890,501 | Wikipedia articles crawled 04/01/2021 with the [Wikipedia API python library](https://pypi.org/project/Wikipedia-API/) starting from the "Ciencias\_de\_la\_vida" category up to a maximum of 5 subcategories. Multiple links to the same articles are then discarded to avoid repeating content. | | Patents | 13,463,387 | Google Patent in Medical Domain for Spain (Spanish). The accepted codes (Medical Domain) for Json files of patents are: "A61B", "A61C","A61F", "A61H", "A61K", "A61L","A61M", "A61B", "A61P". | | [EMEA](http://opus.nlpl.eu/download.php?f=EMEA/v3/moses/en-es.txt.zip) | 5,377,448 | Spanish-side documents extracted from parallel corpora made out of PDF documents from the European Medicines Agency. | | [mespen_Medline](https://zenodo.org/record/3562536#.YTt1fH2xXbR) | 4,166,077 | Spanish-side articles extracted from a collection of Spanish-English parallel corpus consisting of biomedical scientific literature. The collection of parallel resources are aggregated from the MedlinePlus source. 
| | PubMed | 1,858,966 | Open-access articles from the PubMed repository crawled in 2017. | ## Evaluation The model has been evaluated on the Named Entity Recognition (NER) using the following datasets: - [PharmaCoNER](https://zenodo.org/record/4270158): is a track on chemical and drug mention recognition from Spanish medical texts (for more info see: https://temu.bsc.es/pharmaconer/). - [CANTEMIST](https://zenodo.org/record/3978041#.YTt5qH2xXbQ): is a shared task specifically focusing on named entity recognition of tumor morphology, in Spanish (for more info see: https://zenodo.org/record/3978041#.YTt5qH2xXbQ). - ICTUSnet: consists of 1,006 hospital discharge reports of patients admitted for stroke from 18 different Spanish hospitals. It contains more than 79,000 annotations for 51 different kinds of variables. The evaluation results are compared against the [mBERT](https://huggingface.co/bert-base-multilingual-cased) and [BETO](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) models: | F1 - Precision - Recall | roberta-base-biomedical-clinical-es | mBERT | BETO | |---------------------------|----------------------------|-------------------------------|-------------------------| | PharmaCoNER | **90.04** - **88.92** - **91.18** | 87.46 - 86.50 - 88.46 | 88.18 - 87.12 - 89.28 | | CANTEMIST | **83.34** - **81.48** - **85.30** | 82.61 - 81.12 - 84.15 | 82.42 - 80.91 - 84.00 | | ICTUSnet | **88.08** - **84.92** - **91.50** | 86.75 - 83.53 - 90.23 | 85.95 - 83.10 - 89.02 | ## Additional information ### Author Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected]) ### Contact information For further information, send an email to <[email protected]> ### Copyright Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022) ### Licensing information [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0) ### Funding This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL. ### Citation information If you use our models, please cite our latest preprint: ```bibtex @misc{carrino2021biomedical, title={Biomedical and Clinical Language Models for Spanish: On the Benefits of Domain-Specific Pretraining in a Mid-Resource Scenario}, author={Casimiro Pio Carrino and Jordi Armengol-Estapé and Asier Gutiérrez-Fandiño and Joan Llop-Palao and Marc Pàmies and Aitor Gonzalez-Agirre and Marta Villegas}, year={2021}, eprint={2109.03570}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` If you use our Medical Crawler corpus, please cite the preprint: ```bibtex @misc{carrino2021spanish, title={Spanish Biomedical Crawled Corpus: A Large, Diverse Dataset for Spanish Biomedical Language Models}, author={Casimiro Pio Carrino and Jordi Armengol-Estapé and Ona de Gibert Bonet and Asier Gutiérrez-Fandiño and Aitor Gonzalez-Agirre and Martin Krallinger and Marta Villegas}, year={2021}, eprint={2109.07765}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` ### Disclaimer <details> <summary>Click to expand</summary> The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. 
When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence. In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models. Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables. Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial. En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos. </details>
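The Evaluation section above reports NER results obtained by fine-tuning this checkpoint. Below is a minimal, hedged fine-tuning skeleton rather than the setup used in the paper: the label set, hyperparameters and dataset wiring are placeholders, not the configuration used for PharmaCoNER, CANTEMIST or ICTUSnet.

```python
from transformers import (AutoModelForTokenClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_id = "PlanTL-GOB-ES/roberta-base-biomedical-clinical-es"
labels = ["O", "B-ENT", "I-ENT"]  # placeholder tag set, not the real PharmaCoNER labels

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(
    model_id,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)

args = TrainingArguments(
    output_dir="ner-finetune",
    learning_rate=5e-5,               # assumption; the card gives no fine-tuning hyperparameters
    per_device_train_batch_size=16,   # assumption
    num_train_epochs=3,               # assumption
)

# trainer = Trainer(model=model, args=args, tokenizer=tokenizer,
#                   train_dataset=train_ds, eval_dataset=eval_ds)  # token-classification datasets go here
# trainer.train()
```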
{"language": ["es"], "license": "apache-2.0", "tags": ["biomedical", "clinical", "spanish"], "metrics": ["ppl"], "widget": [{"text": "El \u00fanico antecedente personal a rese\u00f1ar era la <mask> arterial."}, {"text": "Las radiolog\u00edas \u00f3seas de cuerpo entero no detectan alteraciones <mask>, ni alteraciones vertebrales."}, {"text": "En el <mask> toraco-abd\u00f3mino-p\u00e9lvico no se encontraron hallazgos patol\u00f3gicos de inter\u00e9s."}]}
fill-mask
PlanTL-GOB-ES/roberta-base-biomedical-clinical-es
[ "transformers", "pytorch", "roberta", "fill-mask", "biomedical", "clinical", "spanish", "es", "arxiv:2109.03570", "arxiv:2109.07765", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2109.03570", "2109.07765" ]
[ "es" ]
TAGS #transformers #pytorch #roberta #fill-mask #biomedical #clinical #spanish #es #arxiv-2109.03570 #arxiv-2109.07765 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
Biomedical-clinical language model for Spanish ============================================== Table of contents ----------------- Click to expand * Model description * Intended uses and limitations * How to use * Limitations and bias * Training * Evaluation * Additional information + Author + Contact information + Copyright + Licensing information + Funding + Citation information + Disclaimer Model description ----------------- Biomedical pretrained language model for Spanish. This model is a RoBERTa-based model trained on a biomedical-clinical corpus in Spanish collected from several sources. Intended uses and limitations ----------------------------- The model is ready-to-use only for masked language modelling to perform the Fill Mask task (try the inference API or read the next section). However, it is intended to be fine-tuned on downstream tasks such as Named Entity Recognition or Text Classification. How to use ---------- Limitations and bias -------------------- At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated. Training -------- The training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original RoBERTA model with a vocabulary size of 52,000 tokens. The pretraining consists of a masked language model training at the subword level following the approach employed for the RoBERTa base model with the same hyperparameters as in the original work. The training lasted a total of 48 hours with 16 NVIDIA V100 GPUs of 16GB DDRAM, using Adam optimizer with a peak learning rate of 0.0005 and an effective batch size of 2,048 sentences. The training corpus is composed of several biomedical corpora in Spanish, collected from publicly available corpora and crawlers, and a real-world clinical corpus collected from more than 278K clinical documents and notes. To obtain a high-quality training corpus while retaining the idiosyncrasies of the clinical language, a cleaning pipeline has been applied only to the biomedical corpora, keeping the clinical corpus uncleaned. Essentially, the cleaning operations used are: * data parsing in different formats * sentence splitting * language detection * filtering of ill-formed sentences * deduplication of repetitive contents * keep the original document boundaries Then, the biomedical corpora are concatenated and further global deduplication among the biomedical corpora have been applied. Eventually, the clinical corpus is concatenated to the cleaned biomedical corpus resulting in a medium-size biomedical-clinical corpus for Spanish composed of more than 1B tokens. The table below shows some basic statistics of the individual cleaned corpora: Name: Medical crawler, No. tokens: 745,705,946, Description: Crawler of more than 3,000 URLs belonging to Spanish biomedical and health domains. Name: Clinical cases misc., No. tokens: 102,855,267, Description: A miscellany of medical content, essentially clinical cases. Note that a clinical case report is a scientific publication where medical practitioners share patient cases and it is different from a clinical note or document. Name: Clinical notes/documents, No. 
tokens: 91,250,080, Description: Collection of more than 278K clinical documents, including discharge reports, clinical course notes and X-ray reports, for a total of 91M tokens. Name: Scielo, No. tokens: 60,007,289, Description: Publications written in Spanish crawled from the Spanish SciELO server in 2017. Name: BARR2\_background, No. tokens: 24,516,442, Description: Biomedical Abbreviation Recognition and Resolution (BARR2) containing Spanish clinical case study sections from a variety of clinical disciplines. Name: Wikipedia\_life\_sciences, No. tokens: 13,890,501, Description: Wikipedia articles crawled 04/01/2021 with the Wikipedia API python library starting from the "Ciencias\_de\_la\_vida" category up to a maximum of 5 subcategories. Multiple links to the same articles are then discarded to avoid repeating content. Name: Patents, No. tokens: 13,463,387, Description: Google Patent in Medical Domain for Spain (Spanish). The accepted codes (Medical Domain) for Json files of patents are: "A61B", "A61C","A61F", "A61H", "A61K", "A61L","A61M", "A61B", "A61P". Name: EMEA, No. tokens: 5,377,448, Description: Spanish-side documents extracted from parallel corpora made out of PDF documents from the European Medicines Agency. Name: mespen\_Medline, No. tokens: 4,166,077, Description: Spanish-side articles extracted from a collection of Spanish-English parallel corpus consisting of biomedical scientific literature. The collection of parallel resources are aggregated from the MedlinePlus source. Name: PubMed, No. tokens: 1,858,966, Description: Open-access articles from the PubMed repository crawled in 2017. Evaluation ---------- The model has been evaluated on the Named Entity Recognition (NER) using the following datasets: * PharmaCoNER: is a track on chemical and drug mention recognition from Spanish medical texts (for more info see: URL * CANTEMIST: is a shared task specifically focusing on named entity recognition of tumor morphology, in Spanish (for more info see: URL * ICTUSnet: consists of 1,006 hospital discharge reports of patients admitted for stroke from 18 different Spanish hospitals. It contains more than 79,000 annotations for 51 different kinds of variables. The evaluation results are compared against the mBERT and BETO models: Additional information ---------------------- ### Author Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL) ### Contact information For further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL) ### Copyright Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022) ### Licensing information Apache License, Version 2.0 ### Funding This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL. information If you use our models, please cite our latest preprint: If you use our Medical Crawler corpus, please cite the preprint: ### Disclaimer Click to expand The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. 
When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence. In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models. Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables. Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial. En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
[ "### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)", "### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)", "### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)", "### Licensing information\n\n\nApache License, Version 2.0", "### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\ninformation\nIf you use our models, please cite our latest preprint:\n\n\nIf you use our Medical Crawler corpus, please cite the preprint:", "### Disclaimer\n\n\n\nClick to expand\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos." ]
[ "TAGS\n#transformers #pytorch #roberta #fill-mask #biomedical #clinical #spanish #es #arxiv-2109.03570 #arxiv-2109.07765 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)", "### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)", "### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)", "### Licensing information\n\n\nApache License, Version 2.0", "### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\ninformation\nIf you use our models, please cite our latest preprint:\n\n\nIf you use our Medical Crawler corpus, please cite the preprint:", "### Disclaimer\n\n\n\nClick to expand\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos." ]
[ 74, 28, 37, 22, 12, 64, 364 ]
[ "passage: TAGS\n#transformers #pytorch #roberta #fill-mask #biomedical #clinical #spanish #es #arxiv-2109.03570 #arxiv-2109.07765 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)### Licensing information\n\n\nApache License, Version 2.0### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\ninformation\nIf you use our models, please cite our latest preprint:\n\n\nIf you use our Medical Crawler corpus, please cite the preprint:" ]
[ -0.047067634761333466, 0.18988105654716492, -0.004417674615979195, -0.010404507629573345, 0.036213476210832596, 0.020307617262005806, 0.09449630975723267, 0.1222444400191307, 0.0043885004706680775, 0.0634651854634285, 0.11071622371673584, 0.135728657245636, 0.08991432934999466, 0.025515098124742508, 0.041296254843473434, -0.18902036547660828, -0.001828448032028973, 0.06938686966896057, 0.019451146945357323, 0.04885824769735336, 0.036778807640075684, -0.027925822883844376, 0.08163066953420639, 0.011846156790852547, 0.011012021452188492, -0.0022970878053456545, 0.04435480758547783, -0.08743946999311447, 0.12473027408123016, 0.007575790397822857, 0.03242453932762146, 0.05711251124739647, 0.054113492369651794, -0.10664539784193039, 0.003086875658482313, -0.026753373444080353, -0.05354335531592369, 0.13191209733486176, 0.05511781573295593, -0.09549271315336227, 0.19403789937496185, -0.14510652422904968, -0.01846996508538723, 0.014352400787174702, -0.1621154546737671, -0.19078266620635986, -0.11893384158611298, 0.0811123326420784, 0.053393468260765076, 0.07020572572946548, 0.021984757855534554, 0.13339835405349731, 0.03498527407646179, 0.008710500784218311, 0.1393752247095108, -0.19889245927333832, -0.035652898252010345, 0.015332820825278759, 0.18006916344165802, 0.08834998309612274, 0.022710975259542465, 0.08436925709247589, 0.03787744790315628, -0.024708222597837448, -0.021396689116954803, -0.11693932116031647, -0.06226729601621628, -0.040975991636514664, -0.05933494493365288, -0.04327266663312912, 0.26302728056907654, -0.08789578825235367, -0.049902163445949554, 0.09205816686153412, -0.012522134929895401, 0.02536802925169468, 0.01751215010881424, -0.050644032657146454, 0.07827911525964737, -0.06078113242983818, 0.07732006162405014, -0.07175135612487793, -0.08148129284381866, -0.06235207989811897, -0.12806186079978943, 0.05938354879617691, -0.002708648331463337, 0.022487860172986984, 0.003748607588931918, 0.014327134005725384, -0.018524695187807083, -0.039385806769132614, 0.001697742729447782, 0.03389321640133858, 0.0044600507244467735, -0.020063916221261024, -0.07034582644701004, -0.09111134707927704, 0.11421503871679306, 0.17478136718273163, 0.007173221092671156, -0.06032360717654228, -0.0043019624426960945, 0.08435329794883728, 0.04179222509264946, -0.02103372849524021, -0.18051192164421082, -0.0023410459980368614, 0.06418991833925247, -0.038233425468206406, 0.11141985654830933, 0.0054046339355409145, -0.07216945290565491, -0.0023429463617503643, -0.07189767062664032, 0.024847961962223053, 0.048604898154735565, -0.03018525242805481, -0.06983775645494461, -0.05159550532698631, 0.15726758539676666, 0.008345862850546837, -0.059035949409008026, -0.018581122159957886, -0.026255063712596893, -0.024179868400096893, 0.10293065011501312, 0.03589225187897682, -0.07171064615249634, -0.02505289949476719, -0.07471851259469986, -0.006567288190126419, -0.020404169335961342, -0.021554069593548775, 0.06792598217725754, -0.05891058221459389, 0.04945163428783417, -0.1285681128501892, -0.07717437297105789, -0.05016518756747246, 0.09372770041227341, -0.07867424935102463, 0.011253193020820618, 0.03548961132764816, 0.1003633663058281, -0.024292370304465294, -0.06757402420043945, -0.003836692776530981, -0.05526376888155937, 0.06649364531040192, -0.058662209659814835, 0.0950232744216919, -0.2585325241088867, 0.012236555106937885, -0.1226331889629364, 0.0020650194492191076, -0.11882251501083374, -0.027696559205651283, -0.03520885482430458, -0.008004063740372658, -0.1691233068704605, -0.009613816626369953, 
-0.08450651913881302, 0.011524243280291557, 0.06614208221435547, 0.08540904521942139, -0.06695301085710526, -0.006310997065156698, 0.16345272958278656, -0.0017352576833218336, -0.16245263814926147, 0.15542854368686676, 0.004239968489855528, 0.09662077575922012, 0.0761948749423027, 0.2036125212907791, 0.07496393471956253, -0.16466602683067322, -0.0012379471445456147, -0.09743218123912811, -0.02107228897511959, -0.18536217510700226, 0.1450846940279007, -0.07871922850608826, -0.0020670974627137184, -0.02438397891819477, -0.1829632818698883, -0.011189221404492855, -0.012510867789387703, -0.04452505707740784, 0.042106110602617264, -0.07525096833705902, -0.06422905623912811, -0.0011773253791034222, 0.049843959510326385, -0.017727283760905266, 0.028224913403391838, 0.011233123950660229, 0.04044618457555771, 0.05349629744887352, 0.00379189639352262, -0.08003924041986465, -0.0023989584296941757, -0.057902175933122635, 0.0034496155567467213, -0.13470624387264252, 0.06145857274532318, -0.007816356606781483, -0.10087274760007858, 0.12639473378658295, -0.013014301657676697, -0.039995644241571426, 0.031323302537202835, -0.0011437706416472793, 0.05734022706747055, 0.0708763524889946, -0.03452780097723007, -0.041274864226579666, -0.11830339580774307, 0.10574530810117722, -0.01197146438062191, 0.0692751333117485, -0.008859171532094479, 0.01991206407546997, -0.051897041499614716, 0.009985213167965412, -0.01809903234243393, -0.009932668879628181, 0.07096552103757858, 0.058015670627355576, -0.034036342054605484, 0.048393867909908295, 0.09906947612762451, -0.010727310553193092, -0.11621256172657013, 0.07124541699886322, -0.059993091970682144, 0.07049938291311264, 0.0882023423910141, -0.1086529791355133, -0.0023519613314419985, -0.0785117968916893, -0.00968876201659441, 0.01383240707218647, -0.04619242250919342, -0.06520453840494156, 0.2247539907693863, -0.010014962404966354, 0.06316018104553223, -0.08844803273677826, -0.004188348073512316, -0.015970047563314438, -0.12781231105327606, -0.03176214173436165, 0.07358904182910919, 0.1577252745628357, -0.09007909893989563, 0.028919877484440804, 0.09062860161066055, -0.07119673490524292, 0.14244870841503143, 0.05408448353409767, -0.053514763712882996, -0.07059839367866516, -0.044904038310050964, 0.06556522846221924, 0.11205916106700897, -0.17249546945095062, -0.004375440534204245, 0.082103431224823, -0.05523153021931648, 0.025898002088069916, -0.10785575956106186, -0.03771710395812988, -0.05766456201672554, -0.02592296153306961, -0.10282311588525772, 0.018760573118925095, -0.10243865102529526, 0.1277751475572586, 0.029896896332502365, 0.048054106533527374, 0.005970001220703125, 0.0007746785413473845, -0.08580485731363297, 0.11948984861373901, -0.054175037890672684, -0.28883492946624756, -0.1935776323080063, 0.021656878292560577, -0.021185463294386864, 0.06845598667860031, 0.034434057772159576, -0.02217436581850052, -0.006934445817023516, 0.01093397755175829, -0.06505811214447021, 0.13177156448364258, -0.08823736011981964, -0.040037088096141815, 0.04639900103211403, 0.07640762627124786, -0.0678064152598381, -0.020329272374510765, -0.059548184275627136, -0.007955783978104591, 0.048619139939546585, -0.05315108969807625, 0.18960775434970856, 0.056187719106674194, 0.03796133026480675, -0.05603880062699318, 0.0000804073570179753, 0.044883616268634796, -0.04562632367014885, 0.009672440588474274, 0.2596219778060913, 0.06621023267507553, -0.007767004426568747, 0.15634188055992126, 0.06935004889965057, -0.10821916908025742, 0.039659060537815094, -0.018465271219611168, 
-0.04808269813656807, -0.2818748652935028, -0.08287694305181503, -0.0521860271692276, -0.0701441541314125, 0.029881685972213745, 0.015213732607662678, 0.044565364718437195, 0.1088583767414093, -0.005304321181029081, 0.040479257702827454, -0.035468216985464096, 0.07323950529098511, 0.08157055079936981, 0.0037972438149154186, 0.09185046702623367, -0.03513273969292641, -0.05163755267858505, 0.09622590988874435, 0.1576641947031021, 0.09842468798160553, 0.08174126595258713, 0.19673602283000946, 0.08804432302713394, 0.07896203547716141, -0.01451046671718359, 0.04558645188808441, 0.08086797595024109, 0.018535306677222252, -0.0896114706993103, -0.056551046669483185, -0.14658261835575104, -0.009207215160131454, -0.02200116589665413, -0.022305326536297798, 0.03738216310739517, -0.09122069180011749, 0.05441237613558769, 0.083564393222332, -0.0038283595349639654, -0.20173285901546478, 0.015114076435565948, 0.047107186168432236, 0.027614692226052284, -0.04046953469514847, -0.01039219368249178, -0.021260039880871773, -0.02859824150800705, 0.08768203109502792, -0.017764678224921227, 0.0795774906873703, -0.08076635003089905, 0.05345433950424194, -0.08379805833101273, -0.08981406688690186, -0.00276245572604239, 0.07786131650209427, -0.19498619437217712, 0.26208147406578064, -0.00324492109939456, 0.011285084299743176, -0.05626164749264717, -0.013060553930699825, 0.020989300683140755, 0.11569177359342575, 0.1439765989780426, 0.022244498133659363, -0.0020713997073471546, -0.006582900881767273, -0.09725973010063171, 0.029172996059060097, -0.00032731235842220485, -0.09215869009494781, 0.020197030156850815, -0.003933073952794075, 0.009790821932256222, -0.098478302359581, -0.06254167854785919, -0.10715010017156601, -0.0789184644818306, 0.11035159230232239, -0.09329288452863693, 0.13017158210277557, -0.006747055798768997, -0.015665855258703232, -0.011648974381387234, 0.10611288994550705, -0.1386837512254715, -0.03903573378920555, -0.10108624398708344, 0.020816626027226448, 0.07211948931217194, -0.08796215802431107, -0.008373907767236233, 0.00802945252507925, -0.054197173565626144, -0.018525609746575356, -0.06552057713270187, 0.08178980648517609, -0.10957368463277817, -0.04781162366271019, -0.09966406226158142, 0.055847909301519394, 0.009131031110882759, 0.07219358533620834, 0.021258262917399406, 0.01800791174173355, -0.0532306432723999, -0.056483421474695206, 0.08398943394422531, 0.033698126673698425, 0.10346989333629608, 0.007666824851185083, -0.12132507562637329, -0.1790284961462021, -0.0704631358385086, -0.07494433224201202, 0.0684133768081665, 0.1649758517742157, 0.01002065371721983, 0.07566240429878235, 0.295200914144516, -0.1284116506576538, -0.25647860765457153, -0.07911969721317291, -0.005390843842178583, -0.06165894493460655, -0.009247547015547752, -0.19291381537914276, 0.07082738727331161, 0.238912433385849, -0.02175959013402462, -0.09347597509622574, -0.16345876455307007, -0.09508538991212845, 0.05120566487312317, 0.09486155956983566, 0.17467911541461945, -0.19859521090984344, -0.09940719604492188, -0.07806793600320816, -0.04215705022215843, 0.15564599633216858, -0.07809474319219589, 0.07216200977563858, -0.03955819457769394, -0.05936842039227486, -0.009756800718605518, -0.02103235200047493, 0.16633227467536926, -0.04966912046074867, 0.0367094986140728, -0.03101223148405552, -0.019125815480947495, 0.12948736548423767, 0.01854524202644825, 0.12771613895893097, 0.06583619862794876, -0.044766008853912354, -0.06255196779966354, 0.001998902764171362, -0.08186039328575134, 0.07503952085971832, 
0.0035154959186911583, -0.06418761610984802, -0.0565510094165802, 0.05242745205760002, -0.029402531683444977, 0.015766477212309837, 0.14178913831710815, -0.10509423911571503, -0.0021002686116844416, 0.11560060828924179, 0.24248234927654266, -0.0015535430284217, 0.06706760078668594, 0.014115184545516968, -0.07839276641607285, 0.05711418017745018, -0.06195524334907532, -0.03757321462035179, 0.1169225350022316, 0.0047790780663490295, 0.06686383485794067, -0.02869168482720852, -0.06596987694501877, 0.06463020294904709, 0.137352854013443, -0.10453975945711136, -0.10852832347154617, 0.012243008241057396, 0.017188848927617073, -0.06325262784957886, 0.04811760410666466, 0.14444535970687866, 0.0023136460222303867, -0.0640622228384018, -0.005456708371639252, 0.0687289908528328, -0.06663330644369125, 0.15356183052062988, 0.06422185897827148, -0.025262653827667236, -0.07305750250816345, 0.08157189190387726, 0.14739108085632324, -0.15086472034454346, 0.01323959231376648, 0.0404236726462841, -0.05668497830629349, -0.10507380962371826, -0.08020095527172089, 0.04088791087269783, -0.2225785255432129, -0.08908288925886154, -0.11101512610912323, -0.031911905854940414, -0.029821841046214104, 0.18155823647975922, 0.00202856189571321, -0.07274620234966278, 0.01520826667547226, -0.0014629112556576729, -0.036521390080451965, 0.08733481168746948, -0.01817324198782444, 0.04611744359135628, 0.0412275530397892, 0.045536477118730545, 0.006572641897946596, 0.016972186043858528, -0.057481154799461365, 0.05859334021806717, -0.1532873511314392, -0.030892174690961838, -0.09842967242002487, 0.039743270725011826, -0.07005216926336288, -0.03427368775010109, -0.00958872027695179, -0.05457711219787598, -0.06633486598730087, -0.05689532682299614, -0.05911896750330925, 0.01059192419052124, 0.04892759025096893, 0.10923927277326584, -0.09123557060956955, -0.05604925379157066, 0.081244558095932, 0.03769971430301666, 0.0676402598619461, -0.025748571380972862, 0.004500253591686487, 0.02290579117834568, -0.1596587896347046, 0.07501352578401566, 0.062766432762146, 0.11240726709365845, 0.0019802346359938383, -0.15632915496826172, -0.00818308349698782, 0.057727012783288956, -0.03935738652944565, 0.049929410219192505, 0.08345507830381393, -0.07568863034248352, 0.025765473023056984, 0.026437953114509583, -0.08694356679916382, -0.03330802544951439, 0.006457692012190819, 0.10109617561101913, -0.017656799405813217, 0.059497129172086716, -0.07683116942644119, -0.04683258384466171, -0.09558413922786713, 0.053517263382673264, -0.026297107338905334, -0.08546170592308044, -0.1817205250263214, -0.08278141170740128, 0.018319975584745407, 0.05806863307952881, 0.20642878115177155, 0.03592656925320625, -0.023651711642742157, 0.0011424818076193333, 0.14718511700630188, 0.06142113730311394, 0.00736612593755126, 0.05871012061834335, 0.009664840064942837, -0.009357994422316551, -0.11770771443843842, 0.09804438799619675, -0.011647829785943031, -0.09198182821273804, 0.11627735197544098, 0.1344183087348938, 0.1570330113172531, -0.009148430079221725, 0.027485786005854607, 0.005991379264742136, -0.01411285437643528, -0.032840583473443985, 0.10784327983856201, -0.018533028662204742, 0.019072137773036957, -0.003490696894004941, 0.1806396096944809, -0.14426635205745697, 0.019092192873358727, 0.02333739958703518, -0.030388478189706802, -0.07958272099494934, -0.14621248841285706, 0.0016994006000459194, -0.03950338438153267, -0.06744814664125443, -0.11913960427045822, 0.008244403637945652, 0.08003920316696167, -0.013298221863806248, -0.04893011599779129, 
0.1054907888174057, -0.14713381230831146, -0.03458442538976669, 0.016666920855641365, 0.01902213878929615, 0.05937063321471214, -0.151642307639122, 0.01255940180271864, -0.0011704599019140005, 0.09884843230247498, 0.04230011627078056, 0.058416660875082016, 0.03788001090288162, -0.022975191473960876, -0.06727337092161179, -0.06290709972381592, 0.016906628385186195, 0.0009021356818266213, -0.006775268353521824, 0.1361664980649948, 0.08699148148298264, -0.027943704277276993, 0.09391066431999207, 0.13514213263988495, -0.0011166557669639587, 0.018044399097561836, -0.10910134017467499, 0.1632973551750183, -0.010928301140666008, 0.018387146294116974, -0.033371929079294205, -0.046943534165620804, 0.028097786009311676, 0.2684418857097626, 0.14666129648685455, -0.03582135960459709, -0.03330397978425026, 0.004152394365519285, 0.0335422046482563, 0.08388663828372955, 0.09882549941539764, 0.06447846442461014, 0.18558049201965332, -0.043630316853523254, -0.01954578049480915, -0.03318948298692703, 0.02011743001639843, -0.13307614624500275, 0.0018748082220554352, 0.03174252063035965, -0.03577277436852455, 0.006351158022880554, 0.09527623653411865, -0.11282173544168472, -0.1403941661119461, -0.0001502943632658571, -0.09149111062288284, -0.07104945182800293, -0.03720159828662872, 0.010758168995380402, 0.003253023372963071, 0.03513778746128082, 0.02640460804104805, 0.0010103968670591712, 0.1082896739244461, 0.027494560927152634, -0.1613702028989792, -0.04069828242063522, 0.07103666663169861, -0.06272301077842712, 0.2572016716003418, -0.03549053147435188, -0.000885755114722997, 0.07506032288074493, -0.041638053953647614, -0.08480450510978699, 0.12739881873130798, -0.030855027958750725, 0.00918253418058157, 0.16228428483009338, -0.03786856308579445, 0.04990670084953308, 0.03605155274271965, 0.0772462785243988, 0.01646803878247738, 0.013914803974330425, 0.0213470421731472, 0.0438428670167923, -0.06798555701971054, 0.16336514055728912, -0.1454218178987503, 0.10033374279737473, 0.08665824681520462, -0.010970117524266243, 0.03305022791028023, -0.08970970660448074, 0.009251000359654427, 0.08973551541566849, 0.05903802812099457, -0.011234062723815441, -0.14951805770397186, 0.04113384708762169, -0.08550240844488144, 0.036861855536699295, -0.24328288435935974, -0.0059525673277676105, -0.07280667126178741, -0.0027915711980313063, -0.07743918895721436, -0.007048245519399643, 0.055326081812381744, -0.0033522057346999645, -0.03697121515870094, -0.023535843938589096, -0.007403118535876274, 0.03570013493299484, -0.09849660098552704, -0.04218512773513794 ]
null
null
transformers
# Biomedical language model for Spanish ## Table of contents <details> <summary>Click to expand</summary> - [Model description](#model-description) - [Intended uses and limitations](#intended-use) - [How to use](#how-to-use) - [Limitations and bias](#limitations-and-bias) - [Training](#training) - [Tokenization and model pretraining](#Tokenization-pretraining) - [Training corpora and preprocessing](#training-corpora-preprocessing) - [Evaluation](#evaluation) - [Additional information](#additional-information) - [Author](#author) - [Contact information](#contact-information) - [Copyright](#copyright) - [Licensing information](#licensing-information) - [Funding](#funding) - [Disclaimer](#disclaimer) </details> ## Model description Biomedical pretrained language model for Spanish. For more details about the corpus, the pretraining and the evaluation, check the official [repository](https://github.com/PlanTL-SANIDAD/lm-biomedical-clinical-es) and read our [preprint](https://arxiv.org/abs/2109.03570). ## Intended uses and limitations The model is ready-to-use only for masked language modelling to perform the Fill Mask task (try the inference API or read the next section). However, it is intended to be fine-tuned on downstream tasks such as Named Entity Recognition or Text Classification. ## How to use ```python from transformers import AutoTokenizer, AutoModelForMaskedLM tokenizer = AutoTokenizer.from_pretrained("BSC-TeMU/roberta-base-biomedical-es") model = AutoModelForMaskedLM.from_pretrained("BSC-TeMU/roberta-base-biomedical-es") from transformers import pipeline unmasker = pipeline('fill-mask', model="BSC-TeMU/roberta-base-biomedical-es") unmasker("El único antecedente personal a reseñar era la <mask> arterial.") ``` ``` # Output [ { "sequence": " El único antecedente personal a reseñar era la hipertensión arterial.", "score": 0.9855039715766907, "token": 3529, "token_str": " hipertensión" }, { "sequence": " El único antecedente personal a reseñar era la diabetes arterial.", "score": 0.0039140828885138035, "token": 1945, "token_str": " diabetes" }, { "sequence": " El único antecedente personal a reseñar era la hipotensión arterial.", "score": 0.002484665485098958, "token": 11483, "token_str": " hipotensión" }, { "sequence": " El único antecedente personal a reseñar era la Hipertensión arterial.", "score": 0.0023484621196985245, "token": 12238, "token_str": " Hipertensión" }, { "sequence": " El único antecedente personal a reseñar era la presión arterial.", "score": 0.0008009297889657319, "token": 2267, "token_str": " presión" } ] ``` ## Training ### Tokenization and model pretraining This model is a [RoBERTa-based](https://github.com/pytorch/fairseq/tree/master/examples/roberta) model trained on a **biomedical** corpus in Spanish collected from several sources (see next section). The training corpus has been tokenized using a byte version of [Byte-Pair Encoding (BPE)](https://github.com/openai/gpt-2) used in the original [RoBERTA](https://github.com/pytorch/fairseq/tree/master/examples/roberta) model with a vocabulary size of 52,000 tokens. The pretraining consists of a masked language model training at the subword level following the approach employed for the RoBERTa base model with the same hyperparameters as in the original work. The training lasted a total of 48 hours with 16 NVIDIA V100 GPUs of 16GB DDRAM, using Adam optimizer with a peak learning rate of 0.0005 and an effective batch size of 2,048 sentences. 
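The tokenizer described in this section can be inspected directly with the `transformers` library. The snippet below is a minimal sketch added for illustration rather than part of the original card: it loads the tokenizer under the same hub id as the usage example above and shows the byte-level BPE vocabulary size and a subword split of a clinical term. Exact subword pieces depend on the released vocabulary, so treat the printed output as illustrative.

```python
# Minimal sketch (illustrative, not from the original card): inspect the
# byte-level BPE tokenizer described above.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("BSC-TeMU/roberta-base-biomedical-es")

# The pretraining description above reports a vocabulary of 52,000 tokens.
print(tokenizer.vocab_size)

# Byte-level BPE breaks a less frequent clinical term into subword pieces.
print(tokenizer.tokenize("El paciente presenta hipertrigliceridemia severa."))
```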
### Training corpora and preprocessing The training corpus is composed of several biomedical corpora in Spanish, collected from publicly available corpora and crawlers. To obtain a high-quality training corpus, a cleaning pipeline with the following operations has been applied: - data parsing in different formats - sentence splitting - language detection - filtering of ill-formed sentences - deduplication of repetitive contents - keep the original document boundaries Finally, the corpora are concatenated and further global deduplication among the corpora have been applied. The result is a medium-size biomedical corpus for Spanish composed of about 963M tokens. The table below shows some basic statistics of the individual cleaned corpora: | Name | No. tokens | Description | |-----------------------------------------------------------------------------------------|-------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | [Medical crawler](https://zenodo.org/record/4561970) | 745,705,946 | Crawler of more than 3,000 URLs belonging to Spanish biomedical and health domains. | | Clinical cases misc. | 102,855,267 | A miscellany of medical content, essentially clinical cases. Note that a clinical case report is a scientific publication where medical practitioners share patient cases and it is different from a clinical note or document. | | [Scielo](https://github.com/PlanTL-SANIDAD/SciELO-Spain-Crawler) | 60,007,289 | Publications written in Spanish crawled from the Spanish SciELO server in 2017. | | [BARR2_background](https://temu.bsc.es/BARR2/downloads/background_set.raw_text.tar.bz2) | 24,516,442 | Biomedical Abbreviation Recognition and Resolution (BARR2) containing Spanish clinical case study sections from a variety of clinical disciplines. | | Wikipedia_life_sciences | 13,890,501 | Wikipedia articles crawled 04/01/2021 with the [Wikipedia API python library](https://pypi.org/project/Wikipedia-API/) starting from the "Ciencias\_de\_la\_vida" category up to a maximum of 5 subcategories. Multiple links to the same articles are then discarded to avoid repeating content. | | Patents | 13,463,387 | Google Patent in Medical Domain for Spain (Spanish). The accepted codes (Medical Domain) for Json files of patents are: "A61B", "A61C","A61F", "A61H", "A61K", "A61L","A61M", "A61B", "A61P". | | [EMEA](http://opus.nlpl.eu/download.php?f=EMEA/v3/moses/en-es.txt.zip) | 5,377,448 | Spanish-side documents extracted from parallel corpora made out of PDF documents from the European Medicines Agency. | | [mespen_Medline](https://zenodo.org/record/3562536#.YTt1fH2xXbR) | 4,166,077 | Spanish-side articles extracted from a collection of Spanish-English parallel corpus consisting of biomedical scientific literature. The collection of parallel resources are aggregated from the MedlinePlus source. | | PubMed | 1,858,966 | Open-access articles from the PubMed repository crawled in 2017. | ## Evaluation The model has been evaluated on the Named Entity Recognition (NER) using the following datasets: - [PharmaCoNER](https://zenodo.org/record/4270158): is a track on chemical and drug mention recognition from Spanish medical texts (for more info see: https://temu.bsc.es/pharmaconer/). 
- [CANTEMIST](https://zenodo.org/record/3978041#.YTt5qH2xXbQ): is a shared task specifically focusing on named entity recognition of tumor morphology, in Spanish (for more info see: https://zenodo.org/record/3978041#.YTt5qH2xXbQ). - ICTUSnet: consists of 1,006 hospital discharge reports of patients admitted for stroke from 18 different Spanish hospitals. It contains more than 79,000 annotations for 51 different kinds of variables. The evaluation results are compared against the [mBERT](https://huggingface.co/bert-base-multilingual-cased) and [BETO](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) models: | F1 - Precision - Recall | roberta-base-biomedical-es | mBERT | BETO | |---------------------------|----------------------------|-------------------------------|-------------------------| | PharmaCoNER | **89.48** - **87.85** - **91.18** | 87.46 - 86.50 - 88.46 | 88.18 - 87.12 - 89.28 | | CANTEMIST | **83.87** - **81.70** - **86.17** | 82.61 - 81.12 - 84.15 | 82.42 - 80.91 - 84.00 | | ICTUSnet | **88.12** - **85.56** - **90.83** | 86.75 - 83.53 - 90.23 | 85.95 - 83.10 - 89.02 | ## Additional information ### Author Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected]) ### Contact information For further information, send an email to <[email protected]> ### Copyright Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022) ### Licensing information [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0) ### Funding This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL. ## Citation information If you use our models, please cite our latest preprint: ```bibtex @misc{carrino2021biomedical, title={Biomedical and Clinical Language Models for Spanish: On the Benefits of Domain-Specific Pretraining in a Mid-Resource Scenario}, author={Casimiro Pio Carrino and Jordi Armengol-Estapé and Asier Gutiérrez-Fandiño and Joan Llop-Palao and Marc Pàmies and Aitor Gonzalez-Agirre and Marta Villegas}, year={2021}, eprint={2109.03570}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` If you use our Medical Crawler corpus, please cite the preprint: ```bibtex @misc{carrino2021spanish, title={Spanish Biomedical Crawled Corpus: A Large, Diverse Dataset for Spanish Biomedical Language Models}, author={Casimiro Pio Carrino and Jordi Armengol-Estapé and Ona de Gibert Bonet and Asier Gutiérrez-Fandiño and Aitor Gonzalez-Agirre and Martin Krallinger and Marta Villegas}, year={2021}, eprint={2109.07765}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` ### Disclaimer <details> <summary>Click to expand</summary> The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence. 
In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models. Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables. Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial. En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos. </details>
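As noted in the intended-uses and evaluation sections above, this checkpoint is meant to be fine-tuned for downstream tasks such as Named Entity Recognition. The sketch below is not part of the original card and does not reproduce the configuration behind the reported PharmaCoNER, CANTEMIST or ICTUSnet scores; the label set, hyperparameters and (omitted) datasets are placeholders showing how a standard `transformers` token-classification fine-tune would be set up. For the actual evaluation setup, refer to the official repository linked in the model description.

```python
# Hedged sketch of a NER fine-tune (token classification). Label set,
# hyperparameters and datasets are illustrative placeholders, not the
# configuration used for the scores reported in the card.
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

labels = ["O", "B-ENT", "I-ENT"]  # placeholder tag set

tokenizer = AutoTokenizer.from_pretrained("BSC-TeMU/roberta-base-biomedical-es")
model = AutoModelForTokenClassification.from_pretrained(
    "BSC-TeMU/roberta-base-biomedical-es",
    num_labels=len(labels),
)

training_args = TrainingArguments(
    output_dir="roberta-biomedical-ner",  # illustrative values only
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    num_train_epochs=3,
)

# A real run would pass token-classification datasets (with word-aligned
# label ids) as train_dataset/eval_dataset before calling trainer.train();
# they are omitted here because the card does not distribute them.
trainer = Trainer(model=model, args=training_args)
```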
{"language": ["es"], "license": "apache-2.0", "tags": ["biomedical", "spanish"], "metrics": ["ppl"], "widget": [{"text": "El \u00fanico antecedente personal a rese\u00f1ar era la <mask> arterial."}, {"text": "Las radiolog\u00edas \u00f3seas de cuerpo entero no detectan alteraciones <mask>, ni alteraciones vertebrales."}, {"text": "En el <mask> toraco-abd\u00f3mino-p\u00e9lvico no se encontraron hallazgos patol\u00f3gicos de inter\u00e9s."}]}
fill-mask
PlanTL-GOB-ES/roberta-base-biomedical-es
[ "transformers", "pytorch", "roberta", "fill-mask", "biomedical", "spanish", "es", "arxiv:2109.03570", "arxiv:2109.07765", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2109.03570", "2109.07765" ]
[ "es" ]
TAGS #transformers #pytorch #roberta #fill-mask #biomedical #spanish #es #arxiv-2109.03570 #arxiv-2109.07765 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
Biomedical language model for Spanish ===================================== Table of contents ----------------- Click to expand * Model description * Intended uses and limitations * How to use * Limitations and bias * Training + Tokenization and model pretraining + Training corpora and preprocessing * Evaluation * Additional information + Author + Contact information + Copyright + Licensing information + Funding + Disclaimer Model description ----------------- Biomedical pretrained language model for Spanish. For more details about the corpus, the pretraining and the evaluation, check the official repository and read our preprint. Intended uses and limitations ----------------------------- The model is ready-to-use only for masked language modelling to perform the Fill Mask task (try the inference API or read the next section). However, it is intended to be fine-tuned on downstream tasks such as Named Entity Recognition or Text Classification. How to use ---------- Training -------- ### Tokenization and model pretraining This model is a RoBERTa-based model trained on a biomedical corpus in Spanish collected from several sources (see next section). The training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE) used in the original RoBERTA model with a vocabulary size of 52,000 tokens. The pretraining consists of a masked language model training at the subword level following the approach employed for the RoBERTa base model with the same hyperparameters as in the original work. The training lasted a total of 48 hours with 16 NVIDIA V100 GPUs of 16GB DDRAM, using Adam optimizer with a peak learning rate of 0.0005 and an effective batch size of 2,048 sentences. ### Training corpora and preprocessing The training corpus is composed of several biomedical corpora in Spanish, collected from publicly available corpora and crawlers. To obtain a high-quality training corpus, a cleaning pipeline with the following operations has been applied: * data parsing in different formats + sentence splitting + language detection + filtering of ill-formed sentences + deduplication of repetitive contents + keep the original document boundaries Finally, the corpora are concatenated and further global deduplication among the corpora have been applied. The result is a medium-size biomedical corpus for Spanish composed of about 963M tokens. The table below shows some basic statistics of the individual cleaned corpora: Name: Medical crawler, No. tokens: 745,705,946, Description: Crawler of more than 3,000 URLs belonging to Spanish biomedical and health domains. Name: Clinical cases misc., No. tokens: 102,855,267, Description: A miscellany of medical content, essentially clinical cases. Note that a clinical case report is a scientific publication where medical practitioners share patient cases and it is different from a clinical note or document. Name: Scielo, No. tokens: 60,007,289, Description: Publications written in Spanish crawled from the Spanish SciELO server in 2017. Name: BARR2\_background, No. tokens: 24,516,442, Description: Biomedical Abbreviation Recognition and Resolution (BARR2) containing Spanish clinical case study sections from a variety of clinical disciplines. Name: Wikipedia\_life\_sciences, No. tokens: 13,890,501, Description: Wikipedia articles crawled 04/01/2021 with the Wikipedia API python library starting from the "Ciencias\_de\_la\_vida" category up to a maximum of 5 subcategories. Multiple links to the same articles are then discarded to avoid repeating content. 
Name: Patents, No. tokens: 13,463,387, Description: Google Patent in Medical Domain for Spain (Spanish). The accepted codes (Medical Domain) for Json files of patents are: "A61B", "A61C","A61F", "A61H", "A61K", "A61L","A61M", "A61B", "A61P". Name: EMEA, No. tokens: 5,377,448, Description: Spanish-side documents extracted from parallel corpora made out of PDF documents from the European Medicines Agency. Name: mespen\_Medline, No. tokens: 4,166,077, Description: Spanish-side articles extracted from a collection of Spanish-English parallel corpus consisting of biomedical scientific literature. The collection of parallel resources are aggregated from the MedlinePlus source. Name: PubMed, No. tokens: 1,858,966, Description: Open-access articles from the PubMed repository crawled in 2017. Evaluation ---------- The model has been evaluated on the Named Entity Recognition (NER) using the following datasets: * PharmaCoNER: is a track on chemical and drug mention recognition from Spanish medical texts (for more info see: URL * CANTEMIST: is a shared task specifically focusing on named entity recognition of tumor morphology, in Spanish (for more info see: URL * ICTUSnet: consists of 1,006 hospital discharge reports of patients admitted for stroke from 18 different Spanish hospitals. It contains more than 79,000 annotations for 51 different kinds of variables. The evaluation results are compared against the mBERT and BETO models: Additional information ---------------------- ### Author Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL) ### Contact information For further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL) ### Copyright Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022) ### Licensing information Apache License, Version 2.0 ### Funding This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL. information If you use our models, please cite our latest preprint: If you use our Medical Crawler corpus, please cite the preprint: ### Disclaimer Click to expand The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence. In no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models. Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables. 
Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial. En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
[ "### Tokenization and model pretraining\n\n\nThis model is a RoBERTa-based model trained on a\nbiomedical corpus in Spanish collected from several sources (see next section).\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE)\nused in the original RoBERTA model with a vocabulary size of 52,000 tokens. The pretraining consists of a masked language model training at the subword level following the approach employed for the RoBERTa base model with the same hyperparameters as in the original work. The training lasted a total of 48 hours with 16 NVIDIA V100 GPUs of 16GB DDRAM, using Adam optimizer with a peak learning rate of 0.0005 and an effective batch size of 2,048 sentences.", "### Training corpora and preprocessing\n\n\nThe training corpus is composed of several biomedical corpora in Spanish, collected from publicly available corpora and crawlers.\nTo obtain a high-quality training corpus, a cleaning pipeline with the following operations has been applied:\n\n\n* data parsing in different formats\n\t+ sentence splitting\n\t+ language detection\n\t+ filtering of ill-formed sentences\n\t+ deduplication of repetitive contents\n\t+ keep the original document boundaries\n\n\nFinally, the corpora are concatenated and further global deduplication among the corpora have been applied.\nThe result is a medium-size biomedical corpus for Spanish composed of about 963M tokens. The table below shows some basic statistics of the individual cleaned corpora:\n\n\nName: Medical crawler, No. tokens: 745,705,946, Description: Crawler of more than 3,000 URLs belonging to Spanish biomedical and health domains.\nName: Clinical cases misc., No. tokens: 102,855,267, Description: A miscellany of medical content, essentially clinical cases. Note that a clinical case report is a scientific publication where medical practitioners share patient cases and it is different from a clinical note or document.\nName: Scielo, No. tokens: 60,007,289, Description: Publications written in Spanish crawled from the Spanish SciELO server in 2017.\nName: BARR2\\_background, No. tokens: 24,516,442, Description: Biomedical Abbreviation Recognition and Resolution (BARR2) containing Spanish clinical case study sections from a variety of clinical disciplines.\nName: Wikipedia\\_life\\_sciences, No. tokens: 13,890,501, Description: Wikipedia articles crawled 04/01/2021 with the Wikipedia API python library starting from the \"Ciencias\\_de\\_la\\_vida\" category up to a maximum of 5 subcategories. Multiple links to the same articles are then discarded to avoid repeating content.\nName: Patents, No. tokens: 13,463,387, Description: Google Patent in Medical Domain for Spain (Spanish). The accepted codes (Medical Domain) for Json files of patents are: \"A61B\", \"A61C\",\"A61F\", \"A61H\", \"A61K\", \"A61L\",\"A61M\", \"A61B\", \"A61P\".\nName: EMEA, No. tokens: 5,377,448, Description: Spanish-side documents extracted from parallel corpora made out of PDF documents from the European Medicines Agency.\nName: mespen\\_Medline, No. tokens: 4,166,077, Description: Spanish-side articles extracted from a collection of Spanish-English parallel corpus consisting of biomedical scientific literature. The collection of parallel resources are aggregated from the MedlinePlus source.\nName: PubMed, No. 
tokens: 1,858,966, Description: Open-access articles from the PubMed repository crawled in 2017.\n\n\nEvaluation\n----------\n\n\nThe model has been evaluated on the Named Entity Recognition (NER) using the following datasets:\n\n\n* PharmaCoNER: is a track on chemical and drug mention recognition from Spanish medical texts (for more info see: URL\n* CANTEMIST: is a shared task specifically focusing on named entity recognition of tumor morphology, in Spanish (for more info see: URL\n* ICTUSnet: consists of 1,006 hospital discharge reports of patients admitted for stroke from 18 different Spanish hospitals. It contains more than 79,000 annotations for 51 different kinds of variables.\n\n\nThe evaluation results are compared against the mBERT and BETO models:\n\n\n\nAdditional information\n----------------------", "### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)", "### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)", "### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)", "### Licensing information\n\n\nApache License, Version 2.0", "### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\ninformation\nIf you use our models, please cite our latest preprint:\n\n\nIf you use our Medical Crawler corpus, please cite the preprint:", "### Disclaimer\n\n\n\nClick to expand\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos." ]
[ "TAGS\n#transformers #pytorch #roberta #fill-mask #biomedical #spanish #es #arxiv-2109.03570 #arxiv-2109.07765 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Tokenization and model pretraining\n\n\nThis model is a RoBERTa-based model trained on a\nbiomedical corpus in Spanish collected from several sources (see next section).\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE)\nused in the original RoBERTA model with a vocabulary size of 52,000 tokens. The pretraining consists of a masked language model training at the subword level following the approach employed for the RoBERTa base model with the same hyperparameters as in the original work. The training lasted a total of 48 hours with 16 NVIDIA V100 GPUs of 16GB DDRAM, using Adam optimizer with a peak learning rate of 0.0005 and an effective batch size of 2,048 sentences.", "### Training corpora and preprocessing\n\n\nThe training corpus is composed of several biomedical corpora in Spanish, collected from publicly available corpora and crawlers.\nTo obtain a high-quality training corpus, a cleaning pipeline with the following operations has been applied:\n\n\n* data parsing in different formats\n\t+ sentence splitting\n\t+ language detection\n\t+ filtering of ill-formed sentences\n\t+ deduplication of repetitive contents\n\t+ keep the original document boundaries\n\n\nFinally, the corpora are concatenated and further global deduplication among the corpora have been applied.\nThe result is a medium-size biomedical corpus for Spanish composed of about 963M tokens. The table below shows some basic statistics of the individual cleaned corpora:\n\n\nName: Medical crawler, No. tokens: 745,705,946, Description: Crawler of more than 3,000 URLs belonging to Spanish biomedical and health domains.\nName: Clinical cases misc., No. tokens: 102,855,267, Description: A miscellany of medical content, essentially clinical cases. Note that a clinical case report is a scientific publication where medical practitioners share patient cases and it is different from a clinical note or document.\nName: Scielo, No. tokens: 60,007,289, Description: Publications written in Spanish crawled from the Spanish SciELO server in 2017.\nName: BARR2\\_background, No. tokens: 24,516,442, Description: Biomedical Abbreviation Recognition and Resolution (BARR2) containing Spanish clinical case study sections from a variety of clinical disciplines.\nName: Wikipedia\\_life\\_sciences, No. tokens: 13,890,501, Description: Wikipedia articles crawled 04/01/2021 with the Wikipedia API python library starting from the \"Ciencias\\_de\\_la\\_vida\" category up to a maximum of 5 subcategories. Multiple links to the same articles are then discarded to avoid repeating content.\nName: Patents, No. tokens: 13,463,387, Description: Google Patent in Medical Domain for Spain (Spanish). The accepted codes (Medical Domain) for Json files of patents are: \"A61B\", \"A61C\",\"A61F\", \"A61H\", \"A61K\", \"A61L\",\"A61M\", \"A61B\", \"A61P\".\nName: EMEA, No. tokens: 5,377,448, Description: Spanish-side documents extracted from parallel corpora made out of PDF documents from the European Medicines Agency.\nName: mespen\\_Medline, No. tokens: 4,166,077, Description: Spanish-side articles extracted from a collection of Spanish-English parallel corpus consisting of biomedical scientific literature. The collection of parallel resources are aggregated from the MedlinePlus source.\nName: PubMed, No. 
tokens: 1,858,966, Description: Open-access articles from the PubMed repository crawled in 2017.\n\n\nEvaluation\n----------\n\n\nThe model has been evaluated on the Named Entity Recognition (NER) using the following datasets:\n\n\n* PharmaCoNER: is a track on chemical and drug mention recognition from Spanish medical texts (for more info see: URL\n* CANTEMIST: is a shared task specifically focusing on named entity recognition of tumor morphology, in Spanish (for more info see: URL\n* ICTUSnet: consists of 1,006 hospital discharge reports of patients admitted for stroke from 18 different Spanish hospitals. It contains more than 79,000 annotations for 51 different kinds of variables.\n\n\nThe evaluation results are compared against the mBERT and BETO models:\n\n\n\nAdditional information\n----------------------", "### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)", "### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)", "### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)", "### Licensing information\n\n\nApache License, Version 2.0", "### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\ninformation\nIf you use our models, please cite our latest preprint:\n\n\nIf you use our Medical Crawler corpus, please cite the preprint:", "### Disclaimer\n\n\n\nClick to expand\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for Digitalization and Artificial Intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos." ]
[ 71, 173, 826, 28, 37, 22, 12, 64, 364 ]
[ "passage: TAGS\n#transformers #pytorch #roberta #fill-mask #biomedical #spanish #es #arxiv-2109.03570 #arxiv-2109.07765 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Tokenization and model pretraining\n\n\nThis model is a RoBERTa-based model trained on a\nbiomedical corpus in Spanish collected from several sources (see next section).\n\n\nThe training corpus has been tokenized using a byte version of Byte-Pair Encoding (BPE)\nused in the original RoBERTA model with a vocabulary size of 52,000 tokens. The pretraining consists of a masked language model training at the subword level following the approach employed for the RoBERTa base model with the same hyperparameters as in the original work. The training lasted a total of 48 hours with 16 NVIDIA V100 GPUs of 16GB DDRAM, using Adam optimizer with a peak learning rate of 0.0005 and an effective batch size of 2,048 sentences.", "passage: ### Training corpora and preprocessing\n\n\nThe training corpus is composed of several biomedical corpora in Spanish, collected from publicly available corpora and crawlers.\nTo obtain a high-quality training corpus, a cleaning pipeline with the following operations has been applied:\n\n\n* data parsing in different formats\n\t+ sentence splitting\n\t+ language detection\n\t+ filtering of ill-formed sentences\n\t+ deduplication of repetitive contents\n\t+ keep the original document boundaries\n\n\nFinally, the corpora are concatenated and further global deduplication among the corpora have been applied.\nThe result is a medium-size biomedical corpus for Spanish composed of about 963M tokens. The table below shows some basic statistics of the individual cleaned corpora:\n\n\nName: Medical crawler, No. tokens: 745,705,946, Description: Crawler of more than 3,000 URLs belonging to Spanish biomedical and health domains.\nName: Clinical cases misc., No. tokens: 102,855,267, Description: A miscellany of medical content, essentially clinical cases. Note that a clinical case report is a scientific publication where medical practitioners share patient cases and it is different from a clinical note or document.\nName: Scielo, No. tokens: 60,007,289, Description: Publications written in Spanish crawled from the Spanish SciELO server in 2017.\nName: BARR2\\_background, No. tokens: 24,516,442, Description: Biomedical Abbreviation Recognition and Resolution (BARR2) containing Spanish clinical case study sections from a variety of clinical disciplines.\nName: Wikipedia\\_life\\_sciences, No. tokens: 13,890,501, Description: Wikipedia articles crawled 04/01/2021 with the Wikipedia API python library starting from the \"Ciencias\\_de\\_la\\_vida\" category up to a maximum of 5 subcategories. Multiple links to the same articles are then discarded to avoid repeating content.\nName: Patents, No. tokens: 13,463,387, Description: Google Patent in Medical Domain for Spain (Spanish). The accepted codes (Medical Domain) for Json files of patents are: \"A61B\", \"A61C\",\"A61F\", \"A61H\", \"A61K\", \"A61L\",\"A61M\", \"A61B\", \"A61P\".\nName: EMEA, No. tokens: 5,377,448, Description: Spanish-side documents extracted from parallel corpora made out of PDF documents from the European Medicines Agency.\nName: mespen\\_Medline, No. tokens: 4,166,077, Description: Spanish-side articles extracted from a collection of Spanish-English parallel corpus consisting of biomedical scientific literature. The collection of parallel resources are aggregated from the MedlinePlus source.\nName: PubMed, No. 
tokens: 1,858,966, Description: Open-access articles from the PubMed repository crawled in 2017.\n\n\nEvaluation\n----------\n\n\nThe model has been evaluated on the Named Entity Recognition (NER) using the following datasets:\n\n\n* PharmaCoNER: is a track on chemical and drug mention recognition from Spanish medical texts (for more info see: URL\n* CANTEMIST: is a shared task specifically focusing on named entity recognition of tumor morphology, in Spanish (for more info see: URL\n* ICTUSnet: consists of 1,006 hospital discharge reports of patients admitted for stroke from 18 different Spanish hospitals. It contains more than 79,000 annotations for 51 different kinds of variables.\n\n\nThe evaluation results are compared against the mBERT and BETO models:\n\n\n\nAdditional information\n----------------------### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)### Licensing information\n\n\nApache License, Version 2.0### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.\n\n\ninformation\nIf you use our models, please cite our latest preprint:\n\n\nIf you use our Medical Crawler corpus, please cite the preprint:" ]
[ -0.040406689047813416, 0.033452391624450684, -0.0025445816572755575, 0.05155792832374573, 0.09375496208667755, 0.016980674117803574, 0.0511016882956028, 0.08072333037853241, -0.008860691450536251, 0.08529702574014664, 0.06452791392803192, 0.012093612924218178, 0.027292925864458084, 0.02944694086909294, 0.04105198383331299, -0.19720429182052612, 0.03355072811245918, -0.02668127603828907, -0.015557845123112202, 0.032573893666267395, 0.06180194392800331, -0.05693170428276062, 0.06019274890422821, -0.01263849064707756, -0.04512697830796242, -0.04673365503549576, 0.05685460940003395, -0.06829077005386353, 0.07934439182281494, 0.011847944930195808, 0.07391409575939178, -0.017370503395795822, 0.08357520401477814, -0.08203292638063431, 0.0015921983867883682, 0.04116520285606384, -0.011953233741223812, 0.08419650793075562, 0.01352475956082344, 0.03447265923023224, 0.13390888273715973, -0.11130661517381668, 0.03809259459376335, 0.016080942004919052, -0.1318027824163437, -0.15174511075019836, -0.08659553527832031, -0.021670328453183174, 0.07423819601535797, 0.07887383550405502, 0.013536319136619568, 0.06165318563580513, -0.022381754592061043, 0.017122671008110046, 0.06297294795513153, -0.19086991250514984, -0.05209570378065109, 0.05593223124742508, 0.1185431033372879, 0.1038619801402092, -0.04432213306427002, 0.03786135092377663, 0.0014627911150455475, 0.0066346595995128155, 0.009016424417495728, -0.061958856880664825, 0.03265736252069473, -0.03002338856458664, -0.04623681306838989, -0.05963685363531113, 0.1396401822566986, -0.030298486351966858, -0.07093891501426697, -0.013657832518219948, -0.0399983674287796, -0.0819828063249588, 0.0055790431797504425, 0.005132324993610382, 0.036691371351480484, -0.01954156532883644, 0.02114991843700409, -0.0664311945438385, -0.05587850511074066, -0.07109519839286804, -0.05773138627409935, 0.10543656349182129, 0.01755802519619465, 0.04601232334971428, 0.030145876109600067, 0.027939602732658386, 0.05602457374334335, -0.045948293060064316, -0.010595018975436687, 0.022296428680419922, -0.09400764107704163, 0.02040296606719494, -0.08148005604743958, -0.09539912641048431, -0.002838113345205784, 0.05566706508398056, -0.02320317178964615, 0.017129529267549515, 0.0799260213971138, 0.021682195365428925, 0.002098243683576584, 0.05539218336343765, -0.026261208578944206, -0.021890738978981972, 0.033994369208812714, 0.01638191007077694, 0.055960722267627716, 0.008827997371554375, -0.097938172519207, 0.00765832606703043, 0.031083805486559868, -0.0002615954726934433, -0.06155595928430557, -0.0010261274874210358, -0.020856717601418495, -0.04308846592903137, 0.13691142201423645, -0.02212589792907238, -0.036080971360206604, 0.0017531746998429298, -0.03860700502991676, -0.0734361782670021, 0.02281991019845009, -0.020600512623786926, -0.031141631305217743, -0.029986578971147537, -0.07755184918642044, -0.03857972472906113, -0.11813269555568695, -0.0703195333480835, 0.016319088637828827, -0.049409277737140656, -0.004668377339839935, -0.08073024451732635, -0.1391298919916153, -0.07743176072835922, 0.055822063237428665, -0.07261993736028671, 0.01630222611129284, -0.04438638687133789, 0.045842111110687256, 0.009098602458834648, -0.010461489669978619, 0.057729411870241165, -0.03272858262062073, 0.07640703022480011, -0.04566510394215584, 0.08598257601261139, -0.12605620920658112, 0.013648184016346931, -0.0693475678563118, 0.009329386055469513, -0.13513359427452087, 0.05696433410048485, -0.015988200902938843, -0.02134713903069496, -0.10235745459794998, -0.05611368641257286, 
-0.07414118945598602, 0.0225088968873024, 0.06436966359615326, 0.017752841114997864, -0.10063976049423218, -0.02313942089676857, 0.08367802947759628, -0.00358765572309494, -0.03241666033864021, 0.11577343195676804, -0.060748811811208725, 0.1021040827035904, 0.0773187130689621, 0.1265542060136795, 0.030827978625893593, -0.10304440557956696, -0.02989865653216839, -0.08470085263252258, 0.010562164708971977, -0.07813561707735062, 0.08960069715976715, -0.019432436674833298, 0.009862028062343597, 0.027314357459545135, -0.03414832055568695, -0.0056677162647247314, -0.0056937579065561295, -0.02434062771499157, 0.04700922593474388, -0.04507327079772949, -0.0756797194480896, 0.03235756233334541, 0.05429678410291672, -0.04234914854168892, -0.050696130841970444, 0.03318394348025322, 0.05768642574548721, -0.04292379319667816, 0.007493443787097931, -0.026676461100578308, -0.04570414125919342, -0.08205950260162354, -0.01225194986909628, -0.10798312723636627, -0.05482887849211693, 0.05052056908607483, -0.026942092925310135, 0.07893364131450653, 0.03458445891737938, 0.008316673338413239, 0.04934006929397583, -0.05653288960456848, 0.03843710198998451, -0.036737099289894104, -0.041825342923402786, -0.09540985524654388, -0.0580420084297657, 0.008471213281154633, -0.04596084728837013, 0.046199023723602295, -0.01889316737651825, -0.010594637133181095, -0.07576946914196014, -0.011885363608598709, 0.02926967851817608, -0.06764204800128937, 0.03863661736249924, 0.041966333985328674, 0.016416817903518677, -0.023316718637943268, 0.06332512199878693, 0.001636630855500698, 0.0012878421694040298, 0.00731297954916954, -0.07749809324741364, -0.04650658741593361, 0.047429975122213364, 0.03807951509952545, -0.03121601603925228, -0.05712762475013733, 0.013655813410878181, -0.005016925744712353, -0.0786353200674057, -0.06269603967666626, 0.2603360414505005, -0.00964520312845707, 0.0923047661781311, -0.09852561354637146, 0.009366555139422417, 0.006475774571299553, -0.02701864391565323, -0.005333820357918739, 0.045258957892656326, 0.04728558659553528, -0.2007235288619995, 0.03850787505507469, -0.03316882997751236, -0.03689071908593178, 0.1378779113292694, 0.034243978559970856, -0.04784317687153816, -0.05215612053871155, -0.041681334376335144, 0.05000494047999382, 0.11223108321428299, -0.028585955500602722, 0.007093348540365696, 0.02072015032172203, 0.02075856551527977, 0.033003874123096466, -0.08986556529998779, 0.02142111212015152, -0.03708891198039055, -0.046581756323575974, -0.04960772395133972, -0.010410505346953869, -0.07961803674697876, 0.08039137721061707, 0.07388818264007568, -0.0017704404890537262, 0.020026372745633125, -0.025321776047348976, -0.05839691311120987, 0.13267050683498383, -0.0991719514131546, -0.17862564325332642, -0.17648199200630188, 0.07730219513177872, 0.007645648904144764, 0.08297288417816162, -0.013731680810451508, -0.04388571158051491, -0.020102713257074356, -0.016387924551963806, 0.010045886039733887, 0.01599574089050293, -0.02511739172041416, -0.03881191834807396, 0.054069507867097855, -0.0020979016553610563, -0.07386486232280731, 0.023503128439188004, -0.08777453750371933, -0.04898998886346817, 0.013077784329652786, -0.07507573068141937, 0.08382878452539444, 0.0872102677822113, 0.006441867910325527, -0.03904573619365692, -0.014889528043568134, 0.12640194594860077, -0.055345285683870316, 0.050636112689971924, 0.11582934856414795, 0.0815047174692154, -0.0046482631005346775, 0.059566762298345566, 0.033545322716236115, -0.09487321227788925, 0.062733493745327, 0.006671161390841007, 
-0.06153006851673126, -0.22351965308189392, -0.04821903258562088, -0.07412952184677124, -0.14403828978538513, 0.05480131506919861, 0.017198123037815094, 0.03569129854440689, 0.061052996665239334, 0.0022074244916439056, 0.1118561252951622, -0.01348883006721735, 0.04599848762154579, 0.1064605563879013, 0.06541870534420013, 0.07844673097133636, -0.036760859191417694, -0.01845579594373703, 0.04272637516260147, 0.0884300246834755, 0.23584458231925964, -0.030156800523400307, 0.22087141871452332, 0.0021939557045698166, 0.006917462218552828, 0.03393252193927765, 0.06389600038528442, -0.021108806133270264, 0.03774559125304222, -0.0850178599357605, -0.013681825250387192, -0.13201886415481567, 0.022135309875011444, 0.002837582491338253, -0.036458320915699005, -0.014176628552377224, -0.01853947713971138, 0.003955793101340532, 0.16225752234458923, 0.03869137167930603, -0.1542164832353592, -0.08844711631536484, 0.02231956645846367, -0.004435028415173292, -0.06866461038589478, 0.038108475506305695, 0.10920728743076324, -0.03060068003833294, 0.010080376639962196, -0.044254619628190994, 0.0717640370130539, -0.10153919458389282, 0.034148529171943665, 0.03453395143151283, 0.04285706579685211, -0.019467884674668312, 0.07266318798065186, -0.1507168859243393, 0.15695635974407196, 0.023182250559329987, 0.07618127018213272, -0.05481705069541931, -0.007927300408482552, -0.01600986160337925, -0.06175187975168228, 0.07349829375743866, 0.021831246092915535, -0.14262287318706512, -0.032151442021131516, -0.09934353828430176, 0.011141750030219555, 0.0554896742105484, -0.02329600602388382, 0.07500377297401428, -0.006175860296934843, 0.003915928304195404, -0.024792823940515518, -0.06502672284841537, -0.05611720681190491, -0.08981899172067642, 0.01985619217157364, -0.04509522020816803, -0.01760431006550789, -0.037418052554130554, -0.016331864520907402, 0.023260436952114105, 0.052193015813827515, -0.07862352579832077, -0.0405559279024601, -0.1070864275097847, 0.032008059322834015, 0.09467902779579163, -0.05535278841853142, 0.050105877220630646, 0.007350924424827099, -0.04107114300131798, -0.007033888250589371, -0.04966498538851738, 0.09526708722114563, -0.04822784662246704, -0.1196666806936264, -0.0973849818110466, 0.057271867990493774, 0.04353884607553482, 0.057263970375061035, 0.001469927839934826, 0.089137002825737, -0.03834393620491028, -0.057332843542099, 0.07016471028327942, -0.01730453595519066, 0.08717633783817291, 0.037182003259658813, -0.09174056351184845, 0.0065851397812366486, -0.01622319594025612, -0.04206298664212227, 0.018983401358127594, 0.14927004277706146, -0.006917656399309635, 0.07343024760484695, 0.16088123619556427, -0.06909418851137161, -0.19217601418495178, -0.010710501112043858, 0.09068969637155533, 0.036134541034698486, -0.08963324129581451, -0.20183709263801575, 0.026115307584404945, 0.12390641868114471, -0.007371925748884678, -0.012264687567949295, -0.13627436757087708, -0.10821135342121124, -0.007995720952749252, 0.07746341824531555, 0.2121838927268982, -0.09515610337257385, -0.052803754806518555, -0.04159994795918465, 0.00035526114515960217, 0.14413338899612427, -0.027754783630371094, 0.12835757434368134, -0.04355470836162567, -0.046149902045726776, -0.0022097390610724688, -0.03756373003125191, 0.10617519915103912, -0.01915876567363739, 0.06984109431505203, 0.015178491361439228, -0.030440248548984528, 0.09348378330469131, -0.0013358964351937175, 0.05436716973781586, 0.03893625736236572, 0.01012324821203947, -0.058496735990047455, -0.054856494069099426, -0.08660172671079636, 
-0.023729564622044563, -0.005159709602594376, -0.04995417594909668, -0.09094810485839844, 0.07329538464546204, -0.0015845410525798798, 0.0021211830899119377, 0.152754545211792, -0.04246429353952408, -0.039305366575717926, 0.12634849548339844, 0.12640224397182465, 0.06493290513753891, -0.030360151082277298, 0.04056639224290848, -0.038764744997024536, 0.08051220327615738, -0.09495960921049118, -0.03802824765443802, 0.05959254503250122, -0.021804440766572952, 0.07121126353740692, 0.015072634443640709, -0.08457019925117493, 0.057178087532520294, 0.0660957396030426, -0.029298268258571625, -0.11631473153829575, 0.00680345855653286, -0.005202464759349823, -0.077272430062294, -0.020473066717386246, 0.10107453167438507, -0.024201810359954834, -0.03780508041381836, -0.04555056244134903, 0.03744718059897423, -0.01613381877541542, 0.1703101247549057, 0.06235938519239426, 0.017751557752490044, -0.057335611432790756, 0.06144331395626068, 0.12288704514503479, -0.03187304735183716, 0.033783845603466034, 0.08534189313650131, -0.08969174325466156, -0.056809574365615845, -0.03234068304300308, 0.0776710957288742, -0.07557854056358337, -0.04872217774391174, -0.11727196723222733, -0.07017500698566437, 0.03204954415559769, 0.054631367325782776, 0.009575264528393745, -0.01358820591121912, -0.04256147891283035, 0.03411396965384483, -0.031010698527097702, 0.0657249316573143, -0.08373893797397614, 0.040500931441783905, 0.04138268157839775, 0.04594646394252777, -0.0032191327773034573, -0.023138469085097313, -0.031265370547771454, 0.038377564400434494, -0.1182883009314537, 0.015284920111298561, -0.07128522545099258, 0.01804783195257187, 0.004690730944275856, -0.029588226228952408, -0.035024747252464294, -0.006892938166856766, -0.024644840508699417, -0.012187795713543892, -0.0343027226626873, -0.013228187337517738, -0.00708876084536314, 0.012016121298074722, -0.01129019446671009, -0.031044892966747284, 0.036807626485824585, -0.0277647003531456, 0.059858180582523346, -0.011578280478715897, 0.0159612949937582, 0.02355947718024254, -0.05674666166305542, 0.08839638531208038, 0.03164280951023102, 0.07203668355941772, -0.009771335870027542, -0.0871056467294693, 0.019883979111909866, 0.00016652164049446583, 0.04478331655263901, 0.014936821535229683, 0.06944644451141357, -0.05685928463935852, 0.03121737763285637, -0.04404442757368088, -0.015816720202565193, -0.057052239775657654, 0.03583066165447235, 0.046229247003793716, 0.03178025037050247, 0.09269185364246368, -0.07110129296779633, -0.024075819179415703, -0.11256136745214462, -0.004891818854957819, 0.013927682302892208, -0.06955012679100037, -0.14746157824993134, -0.0628003478050232, 0.06540848314762115, 0.06474132090806961, 0.09917926788330078, 0.036143627017736435, -0.02558981440961361, -0.006805357523262501, 0.09593318402767181, -0.0016578882932662964, 0.012081664055585861, 0.0496477410197258, 0.041340943425893784, -0.02356840670108795, -0.007086401805281639, 0.04384227469563484, 0.03672447055578232, 0.09103817492723465, 0.14187975227832794, 0.11742092669010162, 0.18675878643989563, 0.04926329106092453, -0.017036832869052887, -0.0578850656747818, -0.028422653675079346, 0.1423359513282776, 0.03470081090927124, -0.001487097004428506, 0.0011153232771903276, -0.05367692559957504, 0.08340774476528168, -0.16677972674369812, 0.07269028574228287, -0.018924497067928314, -0.06365283578634262, -0.04213209077715874, -0.06408881396055222, -0.018246155232191086, -0.03375843167304993, -0.021730199456214905, -0.08164282143115997, -0.021103333681821823, 0.027807950973510742, 
-0.019257934764027596, -0.010973571799695492, 0.07492485642433167, -0.10658500343561172, -0.009443450719118118, 0.032740842550992966, 0.020885493606328964, 0.05403612181544304, -0.017257189378142357, -0.044643718749284744, 0.0024268561974167824, 0.009238570928573608, 0.07363776862621307, 0.03358270600438118, 0.028520870953798294, 0.009683188050985336, 0.03957707807421684, -0.04305535554885864, 0.010745290666818619, 0.010251536965370178, 0.031644389033317566, 0.1679113209247589, 0.06615598499774933, -0.03892984613776207, 0.049091652035713196, 0.13716056942939758, -0.010501720942556858, -0.022290430963039398, -0.07213634252548218, 0.0986863374710083, 0.009787426330149174, 0.024645011872053146, -0.008768514730036259, -0.06212334334850311, 0.006569983437657356, 0.19243505597114563, 0.12892669439315796, -0.07569023221731186, -0.050905272364616394, 0.011983994394540787, -0.01314239390194416, 0.03444401174783707, 0.16546934843063354, 0.022761071100831032, 0.19935505092144012, -0.04314148426055908, 0.008589167147874832, -0.022339196875691414, -0.0003682579845190048, -0.09015734493732452, -0.015935298055410385, -0.036019131541252136, -0.011986781843006611, -0.039215754717588425, 0.03505738824605942, -0.04324851557612419, -0.20000575482845306, 0.08446329832077026, -0.07712659239768982, -0.07763596624135971, -0.03328298032283783, -0.02764933928847313, 0.02036389708518982, 0.04884623736143112, -0.0066619496792554855, 0.06772416830062866, 0.06730619072914124, 0.005292017012834549, -0.10129093378782272, -0.08874170482158661, 0.0828382670879364, 0.007740911096334457, 0.1554678976535797, -0.008966581895947456, 0.08223879337310791, 0.07061001658439636, -0.01793869398534298, -0.07658465206623077, 0.13015297055244446, -0.026620130985975266, 0.015493733808398247, 0.11608802527189255, 0.06569202989339828, -0.006785213015973568, 0.08640982210636139, 0.030642792582511902, 0.014360256493091583, 0.019037948921322823, 0.010231219232082367, 0.010655513033270836, -0.08143097162246704, 0.08475477248430252, -0.09665920585393906, 0.12969355285167694, 0.12410429120063782, -0.004568978678435087, 0.01975591480731964, -0.03278210014104843, 0.02395549602806568, 0.05162658914923668, 0.10079926252365112, -0.03723021596670151, -0.12259751558303833, -0.013960715383291245, -0.03802810609340668, 0.010097164660692215, -0.24340003728866577, -0.04135718569159508, -0.05401157587766647, -0.0402897484600544, -0.039939332753419876, 0.048354826867580414, -0.010013239458203316, 0.005142733454704285, -0.05238542705774307, 0.03676837682723999, -0.03470604866743088, 0.03176712989807129, -0.09039328992366791, -0.06656953692436218 ]
null
null
transformers
# Spanish RoBERTa-base trained on BNE finetuned for CAPITEL Named Entity Recognition (NER) dataset.

## Table of contents
<details>
<summary>Click to expand</summary>

- [Model description](#model-description)
- [Intended uses and limitations](#intended-use)
- [How to use](#how-to-use)
- [Limitations and bias](#limitations-and-bias)
- [Training](#training)
  - [Training data](#training-data)
  - [Training procedure](#training-procedure)
- [Evaluation](#evaluation)
  - [Variable and metrics](#variable-and-metrics)
  - [Evaluation results](#evaluation-results)
- [Additional information](#additional-information)
  - [Author](#author)
  - [Contact information](#contact-information)
  - [Copyright](#copyright)
  - [Licensing information](#licensing-information)
  - [Funding](#funding)
  - [Citing information](#citing-information)
  - [Disclaimer](#disclaimer)

</details>

## Model description
The **roberta-base-bne-capitel-ner-plus** is a Named Entity Recognition (NER) model for the Spanish language fine-tuned from the [roberta-base-bne](https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne) model, a [RoBERTa](https://arxiv.org/abs/1907.11692) base model pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text, processed for this work, compiled from the web crawlings performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019. This model is a more robust version of the [roberta-base-bne-capitel-ner](https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne-capitel-ner) model that is better at recognizing lowercased Named Entities (NEs).

## Intended uses and limitations
The **roberta-base-bne-capitel-ner-plus** model can be used to recognize Named Entities (NEs). The model is limited by its training dataset and may not generalize well for all use cases.

## How to use
```python
from transformers import pipeline
from pprint import pprint

# NER pipeline built from the fine-tuned model
nlp = pipeline("ner", model="PlanTL-GOB-ES/roberta-base-bne-capitel-ner-plus")
example = "Me llamo francisco javier y vivo en madrid."

ner_results = nlp(example)
pprint(ner_results)
```

## Limitations and bias
At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.

## Training
The dataset used for training and evaluation is the one from the [CAPITEL competition at IberLEF 2020](https://sites.google.com/view/capitel2020) (sub-task 1). We lowercased and uppercased the dataset and added the resulting sentences to the training set.

### Training procedure
The model was trained with a batch size of 16 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.

## Evaluation

### Variable and metrics
This model was finetuned maximizing the F1 score.
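For illustration, a minimal sketch of this fine-tuning setup with the Transformers `Trainer` is shown below. It is not the authors' original training script: the number of labels, the dataset objects and the `compute_metrics` helper are placeholders that would have to be supplied (e.g. from the tokenized CAPITEL sub-task 1 splits and an entity-level F1 function).

```python
# Sketch of the fine-tuning setup described above: batch size 16,
# learning rate 5e-5, 5 epochs, keeping the checkpoint with the best
# F1 on the development set. Dataset objects and compute_metrics are
# placeholders (assumed to be prepared from CAPITEL sub-task 1).
from transformers import (AutoModelForTokenClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base_model = "PlanTL-GOB-ES/roberta-base-bne"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForTokenClassification.from_pretrained(base_model, num_labels=9)  # 9 BIO labels assumed

args = TrainingArguments(
    output_dir="roberta-base-bne-capitel-ner-plus",
    per_device_train_batch_size=16,
    learning_rate=5e-5,
    num_train_epochs=5,
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,     # select the best checkpoint ...
    metric_for_best_model="f1",      # ... by F1 on the development set
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,      # placeholder: tokenized CAPITEL train split
    eval_dataset=dev_dataset,         # placeholder: tokenized CAPITEL dev split
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,  # placeholder: entity-level F1 (e.g. via seqeval)
)
trainer.train()
```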
## Evaluation results We evaluated the **roberta-base-bne-capitel-ner-plus** on the CAPITEL-NERC test set against standard multilingual and monolingual baselines: | Model | CAPITEL-NERC (F1) | | ------------|:----| | roberta-large-bne-capitel-ner | **90.51** | | roberta-base-bne-capitel-ner | 89.60| | roberta-base-bne-capitel-ner-plus | 89.60| | BETO | 87.72 | | mBERT | 88.10 | | BERTIN | 88.56 | | ELECTRA | 80.35 | For more details, check the fine-tuning and evaluation scripts in the official [GitHub repository](https://github.com/PlanTL-GOB-ES/lm-spanish). ## Additional information ### Author Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected]) ### Contact information For further information, send an email to <[email protected]> ### Copyright Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022) ### Licensing information [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0) ### Funding This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL. ### Citing information If you use this model, please cite our [paper](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6405): ``` @article{, abstract = {We want to thank the National Library of Spain for such a large effort on the data gathering and the Future of Computing Center, a Barcelona Supercomputing Center and IBM initiative (2020). This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.}, author = {Asier Gutiérrez Fandiño and Jordi Armengol Estapé and Marc Pàmies and Joan Llop Palao and Joaquin Silveira Ocampo and Casimiro Pio Carrino and Carme Armentano Oller and Carlos Rodriguez Penagos and Aitor Gonzalez Agirre and Marta Villegas}, doi = {10.26342/2022-68-3}, issn = {1135-5948}, journal = {Procesamiento del Lenguaje Natural}, keywords = {Artificial intelligence,Benchmarking,Data processing.,MarIA,Natural language processing,Spanish language modelling,Spanish language resources,Tractament del llenguatge natural (Informàtica),Àrees temàtiques de la UPC::Informàtica::Intel·ligència artificial::Llenguatge natural}, publisher = {Sociedad Española para el Procesamiento del Lenguaje Natural}, title = {MarIA: Spanish Language Models}, volume = {68}, url = {https://upcommons.upc.edu/handle/2117/367156#.YyMTB4X9A-0.mendeley}, year = {2022}, } ``` ### Disclaimer The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models. Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. 
Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables. Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial. En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
{"language": ["es"], "license": "apache-2.0", "tags": ["national library of spain", "spanish", "bne", "capitel", "ner"], "datasets": ["bne", "capitel"], "metrics": ["f1"], "inference": {"parameters": {"aggregation_strategy": "first"}}, "widget": ["Me llamo francisco javier y vivo en madrid.", "Mi hermano ram\u00f3n y su mejor amigo luis trabajan en el bsc."], "model-index": [{"name": "roberta-base-bne-capitel-ner-plus", "results": [{"task": {"type": "token-classification"}, "dataset": {"name": "CAPITEL-NERC", "type": "ner"}, "metrics": [{"type": "f1", "value": 0.896, "name": "F1"}]}]}]}
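The `inference` block in the metadata record above sets `aggregation_strategy` to `"first"`. As a sketch (assuming the standard `transformers` token-classification pipeline), this is how that parameter and one of the widget sentences map onto a pipeline call:

```python
# Reproduces the hosted-inference settings from the metadata above:
# the "first" aggregation strategy merges sub-word pieces into whole
# entity spans instead of returning one prediction per token.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="PlanTL-GOB-ES/roberta-base-bne-capitel-ner-plus",
    aggregation_strategy="first",
)
print(ner("Me llamo francisco javier y vivo en madrid."))  # widget example from the metadata
```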
token-classification
PlanTL-GOB-ES/roberta-base-bne-capitel-ner-plus
[ "transformers", "pytorch", "roberta", "token-classification", "national library of spain", "spanish", "bne", "capitel", "ner", "es", "dataset:bne", "dataset:capitel", "arxiv:1907.11692", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
2022-03-02T23:29:04+00:00
[ "1907.11692" ]
[ "es" ]
TAGS #transformers #pytorch #roberta #token-classification #national library of spain #spanish #bne #capitel #ner #es #dataset-bne #dataset-capitel #arxiv-1907.11692 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
Spanish RoBERTa-base trained on BNE finetuned for CAPITEL Named Entity Recognition (NER) dataset. ================================================================================================= Table of contents ----------------- Click to expand * Model description * Intended uses and limitations * How to use * Limitations and bias * Training * Training + Training data + Training procedure * Evaluation * Evaluation + Variable and metrics + Evaluation results * Additional information + Author + Contact information + Copyright + Licensing information + Funding + Citing information + Disclaimer Model description ----------------- The roberta-base-bne-capitel-ner-plus is a Named Entity Recognition (NER) model for the Spanish language fine-tuned from the roberta-base-bne model, a RoBERTa base model pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text, processed for this work, compiled from the web crawlings performed by the National Library of Spain (Biblioteca Nacional de España) from 2009 to 2019. This model is a more robust version of the roberta-base-bne-capitel-ner model that recognizes better lowercased Named Entities (NE). Intended uses and limitations ----------------------------- roberta-base-bne-capitel-ner-plus model can be used to recognize Named Entities (NE). The model is limited by its training dataset and may not generalize well for all use cases. How to use ---------- Limitations and bias -------------------- At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated. Training -------- The dataset used for training and evaluation is the one from the CAPITEL competition at IberLEF 2020 (sub-task 1). We lowercased and uppercased the dataset, and added the additional sentences to the training. ### Training procedure The model was trained with a batch size of 16 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set. Evaluation ---------- ### Variable and metrics This model was finetuned maximizing F1 score. Evaluation results ------------------ We evaluated the roberta-base-bne-capitel-ner-plus on the CAPITEL-NERC test set against standard multilingual and monolingual baselines: For more details, check the fine-tuning and evaluation scripts in the official GitHub repository. Additional information ---------------------- ### Author Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL) ### Contact information For further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL) ### Copyright Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022) ### Licensing information Apache License, Version 2.0 ### Funding This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL. ### Citing information If you use this model, please cite our paper: ### Disclaimer The models published in this repository are intended for a generalist purpose and are available to third parties. 
These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models. Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables. Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial. En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
[ "### Training procedure\n\n\nThe model was trained with a batch size of 16 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.\n\n\nEvaluation\n----------", "### Variable and metrics\n\n\nThis model was finetuned maximizing F1 score.\n\n\nEvaluation results\n------------------\n\n\nWe evaluated the roberta-base-bne-capitel-ner-plus on the CAPITEL-NERC test set against standard multilingual and monolingual baselines:\n\n\n\nFor more details, check the fine-tuning and evaluation scripts in the official GitHub repository.\n\n\nAdditional information\n----------------------", "### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)", "### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)", "### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)", "### Licensing information\n\n\nApache License, Version 2.0", "### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.", "### Citing information\n\n\nIf you use this model, please cite our paper:", "### Disclaimer\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos." ]
[ "TAGS\n#transformers #pytorch #roberta #token-classification #national library of spain #spanish #bne #capitel #ner #es #dataset-bne #dataset-capitel #arxiv-1907.11692 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n", "### Training procedure\n\n\nThe model was trained with a batch size of 16 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.\n\n\nEvaluation\n----------", "### Variable and metrics\n\n\nThis model was finetuned maximizing F1 score.\n\n\nEvaluation results\n------------------\n\n\nWe evaluated the roberta-base-bne-capitel-ner-plus on the CAPITEL-NERC test set against standard multilingual and monolingual baselines:\n\n\n\nFor more details, check the fine-tuning and evaluation scripts in the official GitHub repository.\n\n\nAdditional information\n----------------------", "### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)", "### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)", "### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)", "### Licensing information\n\n\nApache License, Version 2.0", "### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.", "### Citing information\n\n\nIf you use this model, please cite our paper:", "### Disclaimer\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos." ]
[ 90, 65, 95, 28, 37, 22, 12, 34, 16, 363 ]
[ "passage: TAGS\n#transformers #pytorch #roberta #token-classification #national library of spain #spanish #bne #capitel #ner #es #dataset-bne #dataset-capitel #arxiv-1907.11692 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training procedure\n\n\nThe model was trained with a batch size of 16 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.\n\n\nEvaluation\n----------### Variable and metrics\n\n\nThis model was finetuned maximizing F1 score.\n\n\nEvaluation results\n------------------\n\n\nWe evaluated the roberta-base-bne-capitel-ner-plus on the CAPITEL-NERC test set against standard multilingual and monolingual baselines:\n\n\n\nFor more details, check the fine-tuning and evaluation scripts in the official GitHub repository.\n\n\nAdditional information\n----------------------### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)### Licensing information\n\n\nApache License, Version 2.0### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.### Citing information\n\n\nIf you use this model, please cite our paper:" ]
[ -0.09112697094678879, 0.2061859518289566, -0.007116315420717001, 0.060418304055929184, 0.10925377905368805, -0.02744254842400551, 0.054168254137039185, 0.10058938711881638, -0.033164579421281815, 0.11895637214183807, -0.007907130755484104, 0.02651689015328884, 0.1055295467376709, 0.12139017879962921, 0.04440098628401756, -0.18448565900325775, -0.005181805696338415, -0.08319763839244843, 0.012987390160560608, 0.0940588191151619, 0.10325399786233902, -0.051770247519016266, 0.0537923239171505, -0.02706151269376278, -0.022398320958018303, 0.08082573860883713, -0.052149564027786255, -0.07340878993272781, 0.06262774765491486, 0.05892898887395859, 0.053778331726789474, 0.014170731417834759, 0.05220125988125801, -0.22473610937595367, 0.014843208715319633, 0.04189411178231239, 0.003607149701565504, 0.05036337301135063, 0.09752913564443588, -0.05037582293152809, 0.2019963562488556, -0.10614059865474701, 0.02300858311355114, 0.05116229131817818, -0.10388264805078506, -0.12099780142307281, -0.12733906507492065, 0.07272732257843018, 0.08010253310203552, 0.0359574556350708, -0.028357408940792084, 0.05627906695008278, -0.04622181877493858, 0.02053951658308506, 0.07912283390760422, -0.1947201043367386, -0.044980719685554504, 0.04641071334481239, 0.02263891138136387, 0.11549275368452072, -0.07942313700914383, -0.002393128117546439, 0.05516815558075905, -0.007340461481362581, -0.007318523246794939, -0.03445806726813316, -0.052913885563611984, 0.01565847545862198, -0.1151580885052681, -0.09855931252241135, 0.12952297925949097, 0.0021285242401063442, -0.08725208044052124, -0.1065700426697731, -0.011100498028099537, 0.0005517840036191046, 0.025418303906917572, -0.03380696102976799, 0.04087809473276138, -0.016692711040377617, 0.07966048270463943, -0.05516752973198891, -0.08640013635158539, -0.031026504933834076, -0.04614649713039398, 0.06271513551473618, 0.014904397539794445, -0.01953759789466858, 0.024456236511468887, 0.1282351166009903, 0.01177962776273489, -0.13117876648902893, -0.02565925568342209, -0.0012751109898090363, -0.04188673570752144, -0.04324659705162048, 0.02518981322646141, -0.029536889865994453, 0.0838603526353836, 0.19309470057487488, -0.06508082151412964, 0.01828148029744625, -0.04160792753100395, 0.007098658476024866, 0.08638431876897812, 0.17239384353160858, -0.07150911539793015, -0.09675587713718414, 0.010472306050360203, -0.0005946484743617475, 0.03366006165742874, 0.02286977879703045, -0.03353220596909523, 0.013964925892651081, 0.04461466148495674, 0.13484326004981995, 0.07661838084459305, -0.048661790788173676, -0.07781057804822922, -0.02013920433819294, 0.14337502419948578, -0.15881948173046112, 0.03635789826512337, 0.012886721640825272, -0.07107267528772354, 0.03064403124153614, -0.04507945850491524, -0.04253923147916794, -0.11051461845636368, 0.03851979970932007, -0.04875322803854942, -0.024912577122449875, -0.07013747841119766, -0.042026933282613754, 0.07502549886703491, -0.06337332725524902, -0.004716438241302967, -0.08947337418794632, -0.09979084134101868, -0.08461041748523712, 0.0510016605257988, -0.12648771703243256, 0.006784600671380758, -0.033335890620946884, -0.0013731474755331874, -0.00506769772619009, -0.04301827400922775, 0.03455880284309387, -0.05771946534514427, 0.05453266575932503, 0.042877063155174255, 0.024617277085781097, 0.04704038053750992, 0.0030474707018584013, -0.11680560559034348, 0.02015712298452854, -0.14243903756141663, 0.10479564964771271, -0.10222534090280533, 0.015824852511286736, -0.18229131400585175, -0.03996868059039116, -0.004398934077471495, 
0.015411607921123505, 0.053236979991197586, 0.18004904687404633, -0.11400097608566284, -0.04454183951020241, 0.14219272136688232, -0.0510394349694252, -0.05479849874973297, 0.11122985184192657, 0.0077991848811507225, 0.061950456351041794, 0.07512775808572769, 0.10059116035699844, 0.12185774743556976, -0.16239725053310394, -0.0482940673828125, -0.0038859995547682047, -0.003525911830365658, 0.06216239556670189, 0.11369277536869049, -0.09194499999284744, 0.0640566274523735, 0.04347720369696617, -0.12767736613750458, -0.018728842958807945, -0.00747328158468008, -0.04895400628447533, 0.029465774074196815, 0.0019477964378893375, -0.042782243341207504, 0.005865011364221573, -0.02493557333946228, -0.026427481323480606, -0.09646731615066528, 0.0045586745254695415, 0.04252546653151512, 0.012526100501418114, 0.013598969206213951, -0.08940213918685913, 0.09584171324968338, -0.048640232533216476, 0.004708399064838886, -0.18948107957839966, -0.010259279981255531, 0.039888475090265274, -0.06377775967121124, 0.11581522971391678, -0.06955910474061966, 0.011996341869235039, 0.01699262112379074, -0.03890584036707878, -0.008912904188036919, -0.022056004032492638, -0.0520445741713047, 0.0170583538711071, -0.1277799904346466, 0.000794443127233535, -0.03464295715093613, 0.07227494567632675, -0.10021688789129257, 0.007434483151882887, 0.10369998961687088, 0.09836732596158981, -0.013783726841211319, -0.03872453421354294, 0.012458426877856255, 0.027946259826421738, -0.02861768938601017, -0.07484593987464905, 0.027687542140483856, 0.00749233877286315, -0.0226204302161932, -0.027049893513321877, -0.039812881499528885, -0.04898912087082863, 0.06637782603502274, 0.11726260185241699, -0.058575037866830826, -0.04169940948486328, -0.03515502065420151, 0.00749369990080595, -0.053947705775499344, -0.01268444862216711, 0.20598292350769043, 0.03045041300356388, 0.0708274096250534, -0.1392478495836258, -0.07550720125436783, -0.003966492135077715, -0.04697105661034584, -0.08438792079687119, 0.14631201326847076, 0.03878752887248993, -0.07037755846977234, 0.07807903736829758, -0.006375063676387072, 0.033016137778759, 0.18134596943855286, -0.008210322819650173, -0.10718948394060135, -0.03720734640955925, 0.08532542735338211, 0.03477053344249725, 0.08941623568534851, -0.03900495544075966, 0.004354084841907024, 0.04065676033496857, 0.007463670335710049, 0.06848215311765671, -0.12353929877281189, 0.037740662693977356, -0.0019804879557341337, -0.06414930522441864, 0.0027997849974781275, 0.010249366983771324, -0.02472294308245182, 0.08681842684745789, 0.046591442078351974, 0.04137550666928291, -0.05824293568730354, -0.03406575322151184, -0.10586759448051453, 0.15253585577011108, -0.08396021276712418, -0.24801252782344818, -0.22632844746112823, 0.024567367509007454, -0.046150390058755875, 0.02624339796602726, 0.02967611886560917, -0.08666859567165375, -0.06582240760326385, -0.07252821326255798, 0.0219639353454113, 0.07637052983045578, -0.08564534038305283, -0.00824929028749466, 0.06056159734725952, 0.0023689314257353544, -0.1068461611866951, 0.0005932206404395401, 0.046920496970415115, -0.04640811309218407, -0.04320667311549187, 0.0053501492366194725, 0.12088555842638016, 0.10468312352895737, 0.017614832147955894, -0.007465899456292391, -0.006920242682099342, 0.17511819303035736, -0.13654042780399323, 0.0497802309691906, 0.20540469884872437, 0.05817288160324097, 0.022160280495882034, 0.17308585345745087, 0.016026586294174194, -0.05995793640613556, 0.018147319555282593, 0.02017156593501568, -0.03864450752735138, -0.266353040933609, 
-0.07280082255601883, -0.04270970821380615, -0.03922086954116821, 0.06543619185686111, 0.08570592850446701, -0.033090703189373016, 0.014649906195700169, -0.05038994550704956, -0.058369964361190796, 0.05052901804447174, 0.1007663682103157, 0.06224183365702629, 0.015188999474048615, 0.01659628376364708, -0.06353580951690674, -0.04280141368508339, 0.1365024596452713, 0.09876120835542679, 0.12297643721103668, 0.005780115257948637, 0.1466882824897766, 0.05362912267446518, 0.06734991818666458, -0.050948914140462875, 0.02407618798315525, 0.027300117537379265, 0.014350623823702335, -0.024763453751802444, -0.06949585676193237, -0.03767178952693939, 0.0011175917461514473, 0.018425988033413887, -0.0233681108802557, -0.026106324046850204, -0.12017059326171875, 0.08195877075195312, 0.10930019617080688, 0.0018330555176362395, -0.16227412223815918, -0.05376619100570679, 0.0462622307240963, -0.0689353197813034, -0.08037366718053818, -0.021276719868183136, 0.02624598704278469, -0.1566636711359024, 0.04006215184926987, -0.009161097928881645, 0.10575587302446365, -0.05360174551606178, -0.0351642370223999, -0.017567278817296028, 0.055503182113170624, 0.006804733071476221, 0.10848482698202133, -0.12648940086364746, 0.15006770193576813, 0.006318855565041304, 0.10185223072767258, -0.03151356428861618, 0.05727093666791916, -0.026676803827285767, 0.004292423836886883, 0.1492619812488556, -0.004278406500816345, -0.0244572963565588, -0.05496477708220482, -0.05344241484999657, 0.011387803591787815, 0.06372113525867462, -0.1185331642627716, 0.10652391612529755, -0.008130788803100586, -0.016082096844911575, -0.11141948401927948, -0.13052873313426971, -0.092549167573452, -0.17367996275424957, 0.028076862916350365, -0.1265762448310852, 0.06871381402015686, -0.04670174792408943, -0.047150641679763794, -0.007093179505318403, 0.20436161756515503, -0.21101835370063782, -0.08873933553695679, -0.13199245929718018, 0.03132874518632889, 0.1371905654668808, -0.07903839647769928, 0.034828636795282364, -0.03742855414748192, 0.10094605386257172, 0.013179694302380085, -0.03272523358464241, 0.015763696283102036, -0.06566587835550308, -0.11628324538469315, -0.02058948203921318, 0.16330420970916748, 0.0629810094833374, 0.037047117948532104, 0.009642683900892735, -0.014124813489615917, 0.016555726528167725, -0.10399042814970016, -0.0527927428483963, 0.08242202550172806, 0.11313358694314957, 0.07790064811706543, -0.025008808821439743, -0.16215579211711884, -0.12155571579933167, -0.06367981433868408, 0.06001809611916542, 0.20076937973499298, -0.012345152907073498, 0.11391651630401611, 0.1788400560617447, -0.12784117460250854, -0.15010584890842438, -0.06833113729953766, 0.0565389059484005, 0.02492602914571762, 0.028529798611998558, -0.18026010692119598, 0.011251947842538357, 0.08833948522806168, -0.00977535080164671, -0.0006869551143608987, -0.27488550543785095, -0.11397083103656769, 0.020489215850830078, 0.03468368947505951, -0.09143227338790894, -0.12466559559106827, -0.10510551929473877, -0.05899382755160332, -0.11503510177135468, 0.06510811299085617, 0.005424415227025747, 0.04416501522064209, -0.0025236522778868675, 0.012869518250226974, 0.03898682817816734, -0.019669340923428535, 0.19402305781841278, -0.039500169456005096, 0.025319227948784828, -0.03152862563729286, 0.007422268856316805, 0.08367548882961273, -0.01312339212745428, 0.11385000497102737, 0.0027204137295484543, 0.022722367197275162, -0.10392941534519196, -0.051296330988407135, -0.04423043876886368, 0.04739794135093689, -0.04995914548635483, -0.009215393103659153, 
-0.1001385971903801, 0.10122208297252655, 0.040972113609313965, -0.012699730694293976, 0.03306340053677559, -0.07637126743793488, 0.002574209589511156, 0.1674869805574417, 0.14837400615215302, 0.05822983756661415, -0.01810234971344471, -0.00226244586519897, -0.0030353365000337362, 0.02342010661959648, -0.12831351161003113, 0.010791908949613571, 0.1287940889596939, 0.015270148403942585, 0.08014126867055893, -0.03414410725235939, -0.14154021441936493, 0.007082516327500343, 0.1432572901248932, -0.04292788729071617, -0.13997311890125275, -0.015842502936720848, -0.01431675348430872, -0.1191687360405922, -0.00516933761537075, 0.10984545946121216, 0.012602003291249275, -0.06998580694198608, 0.02168862335383892, 0.06094641610980034, -0.02349698171019554, 0.1344301700592041, 0.03808518126606941, 0.03778773546218872, -0.0603540875017643, 0.1251625418663025, 0.11546049267053604, -0.11650906503200531, -0.02012203261256218, 0.09438339620828629, -0.05469078570604324, -0.031370192766189575, 0.04599589481949806, 0.006687017623335123, -0.10208122432231903, -0.06793679296970367, -0.07620003819465637, -0.04250721633434296, 0.005176382139325142, 0.08327198773622513, 0.03384115546941757, 0.02438836544752121, 0.004578042309731245, 0.027366958558559418, -0.038996994495391846, 0.0764109194278717, 0.09534681588411331, 0.00047575923963449895, -0.08093134313821793, 0.006109477486461401, 0.0018741160165518522, -0.014964071102440357, -0.019569991156458855, -0.021055392920970917, -0.11100919544696808, 0.007969198748469353, -0.05815833434462547, 0.034062135964632034, -0.08254610002040863, -0.0031586005352437496, -0.0112357959151268, -0.04141169413924217, -0.05056013539433479, 0.002967549953609705, -0.030721591785550117, -0.0409758985042572, -0.05322936177253723, 0.1286332607269287, -0.148588627576828, 0.04948563501238823, 0.08917153626680374, -0.07202188670635223, 0.07027487456798553, -0.020213626325130463, 0.012885733507573605, 0.09377377480268478, -0.20324404537677765, 0.03931770101189613, 0.004040885251015425, 0.05034280940890312, 0.031396057456731796, -0.1262587159872055, 0.04540609195828438, 0.04632434621453285, -0.03761855885386467, 0.02334734797477722, 0.022727636620402336, -0.10588371753692627, -0.003915701061487198, 0.0027417351957410574, -0.07691916078329086, -0.05862939730286598, 0.10963963717222214, 0.1212296262383461, 0.023854641243815422, 0.10570109635591507, -0.07154373824596405, -0.00017462098912801594, -0.14043985307216644, -0.013948681764304638, -0.0055777085945010185, 0.020876388996839523, -0.021823683753609657, -0.05020487308502197, 0.04778190702199936, 0.033774346113204956, 0.1627235859632492, 0.05392376706004143, 0.1113784983754158, 0.03345813974738121, 0.010781792923808098, 0.04421745985746384, 0.03257979825139046, 0.07118694484233856, 0.018210280686616898, 0.019366102293133736, -0.029008768498897552, -0.01692226715385914, -0.05167979747056961, -0.07521656900644302, 0.06202341243624687, 0.13623642921447754, 0.08455754071474075, 0.05092949792742729, 0.004594515543431044, -0.04821505770087242, -0.016291197389364243, 0.029384544119238853, -0.0020870023872703314, 0.003289958694949746, -0.044964663684368134, 0.10445351898670197, 0.20238737761974335, -0.2034783810377121, 0.10233981162309647, -0.004120959434658289, -0.0679164007306099, -0.05859906226396561, -0.21188966929912567, -0.024682551622390747, -0.08615938574075699, 0.032924335449934006, -0.10576701164245605, 0.07972642779350281, 0.010513235814869404, 0.0007115605403669178, -0.07346808910369873, 0.07876816391944885, -0.057018883526325226, 
-0.12976133823394775, 0.06988676637411118, 0.023399045690894127, 0.09430777281522751, 0.010229373350739479, 0.09154006093740463, -0.003285105573013425, 0.07116828113794327, 0.08967239409685135, 0.10877791047096252, 0.04269862547516823, 0.009787523187696934, -0.0645608976483345, -0.027666155248880386, 0.02474087104201317, -0.00644198851659894, 0.00938275083899498, 0.20251347124576569, 0.03202233090996742, -0.02787255309522152, 0.016217077150940895, 0.25953447818756104, -0.017587119713425636, -0.060830824077129364, -0.14110100269317627, 0.13705042004585266, 0.031210167333483696, 0.05638589709997177, 0.019288064911961555, -0.13668347895145416, -0.027159569784998894, 0.1267654299736023, 0.09036093205213547, 0.008921097032725811, -0.034691549837589264, -0.007946508936583996, 0.017361091449856758, 0.00511509645730257, 0.05497952550649643, 0.03526405245065689, 0.25677111744880676, -0.05712917447090149, 0.04805218055844307, -0.03422003984451294, 0.003915251698344946, -0.06013514846563339, 0.11173897236585617, -0.0440741702914238, -0.013882238417863846, -0.0642915815114975, 0.15932932496070862, -0.08023998141288757, -0.2797945439815521, 0.06227142736315727, -0.03749296814203262, -0.147197425365448, -0.006476788315922022, 0.03427110239863396, -0.0000989549225778319, 0.03975997865200043, 0.04543504863977432, -0.024731766432523727, 0.08297226577997208, 0.034724682569503784, -0.030588394030928612, -0.07404611259698868, 0.046877454966306686, -0.07148244976997375, 0.23828017711639404, -0.008865988813340664, 0.08331696689128876, 0.10342565178871155, -0.033795345574617386, -0.1524919867515564, 0.036912620067596436, 0.06494002044200897, -0.02724897488951683, 0.11802791059017181, 0.07128777354955673, 0.028809893876314163, 0.014113359153270721, 0.07479315251111984, 0.0623483769595623, 0.04052765294909477, 0.01998472400009632, 0.06303168088197708, -0.16122423112392426, 0.11485278606414795, -0.13922932744026184, 0.08481567353010178, 0.09458992630243301, -0.039204709231853485, 0.06526210159063339, -0.07092761993408203, 0.08790867775678635, 0.002947186352685094, 0.1891356259584427, 0.03621111437678337, -0.17093442380428314, 0.02187504991889, -0.004733717069029808, 0.03666037321090698, -0.2048819363117218, -0.025189029052853584, 0.03829265758395195, 0.0010294589446857572, -0.05767408013343811, 0.1397324502468109, 0.013019619509577751, 0.012473754584789276, -0.008078934624791145, -0.1814258098602295, -0.0050729080103337765, 0.07623080909252167, -0.10661470144987106, -0.006292674224823713 ]
null
null
transformers
# Spanish RoBERTa-base trained on BNE finetuned for CAPITEL Named Entity Recognition (NER) dataset.

## Table of contents
<details>
<summary>Click to expand</summary>

- [Model description](#model-description)
- [Intended uses and limitations](#intended-use)
- [How to use](#how-to-use)
- [Limitations and bias](#limitations-and-bias)
- [Training](#training)
  - [Training data](#training-data)
  - [Training procedure](#training-procedure)
- [Evaluation](#evaluation)
  - [Variable and metrics](#variable-and-metrics)
  - [Evaluation results](#evaluation-results)
- [Additional information](#additional-information)
  - [Author](#author)
  - [Contact information](#contact-information)
  - [Copyright](#copyright)
  - [Licensing information](#licensing-information)
  - [Funding](#funding)
  - [Citing information](#citing-information)
  - [Disclaimer](#disclaimer)

</details>

## Model description
The **roberta-base-bne-capitel-ner** is a Named Entity Recognition (NER) model for the Spanish language fine-tuned from the [roberta-base-bne](https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne) model, a [RoBERTa](https://arxiv.org/abs/1907.11692) base model pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text, processed for this work, compiled from the web crawlings performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019.

## Intended uses and limitations
The **roberta-base-bne-capitel-ner** model can be used to recognize Named Entities (NEs). The model is limited by its training dataset and may not generalize well for all use cases.

## How to use
```python
from transformers import pipeline
from pprint import pprint

# NER pipeline built from the fine-tuned model
nlp = pipeline("ner", model="PlanTL-GOB-ES/roberta-base-bne-capitel-ner")
example = "Me llamo Francisco Javier y vivo en Madrid."

ner_results = nlp(example)
pprint(ner_results)
```

## Limitations and bias
At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated.

## Training
The dataset used for training and evaluation is the one from the [CAPITEL competition at IberLEF 2020](https://sites.google.com/view/capitel2020) (sub-task 1).

### Training procedure
The model was trained with a batch size of 16 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.

## Evaluation

### Variable and metrics
This model was finetuned maximizing the F1 score.

## Evaluation results
We evaluated the **roberta-base-bne-capitel-ner** on the CAPITEL-NERC test set against standard multilingual and monolingual baselines:

| Model | CAPITEL-NERC (F1) |
| ------------|:----|
| roberta-large-bne-capitel-ner | **90.51** |
| roberta-base-bne-capitel-ner | 89.60|
| BETO | 87.72 |
| mBERT | 88.10 |
| BERTIN | 88.56 |
| ELECTRA | 80.35 |

For more details, check the fine-tuning and evaluation scripts in the official [GitHub repository](https://github.com/PlanTL-GOB-ES/lm-spanish).
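Complementing the pipeline example in "How to use" above, the following is a minimal sketch of running the model without the pipeline wrapper, using the standard `transformers` token-classification API; it prints one predicted label per sub-word token, with no entity aggregation.

```python
# Sketch: direct inference with tokenizer + model, one label per sub-word token.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

name = "PlanTL-GOB-ES/roberta-base-bne-capitel-ner"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForTokenClassification.from_pretrained(name)

text = "Me llamo Francisco Javier y vivo en Madrid."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred_ids = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
for token, label_id in zip(tokens, pred_ids):
    print(token, model.config.id2label[label_id])  # map label id to its BIO tag
```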
## Additional information ### Author Text Mining Unit (TeMU) at the Barcelona Supercomputing Center ([email protected]) ### Contact information For further information, send an email to <[email protected]> ### Copyright Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022) ### Licensing information [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0) ### Funding This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL. ### Citing information If you use this model, please cite our [paper](http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/6405): ``` @article{, abstract = {We want to thank the National Library of Spain for such a large effort on the data gathering and the Future of Computing Center, a Barcelona Supercomputing Center and IBM initiative (2020). This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.}, author = {Asier Gutiérrez Fandiño and Jordi Armengol Estapé and Marc Pàmies and Joan Llop Palao and Joaquin Silveira Ocampo and Casimiro Pio Carrino and Carme Armentano Oller and Carlos Rodriguez Penagos and Aitor Gonzalez Agirre and Marta Villegas}, doi = {10.26342/2022-68-3}, issn = {1135-5948}, journal = {Procesamiento del Lenguaje Natural}, keywords = {Artificial intelligence,Benchmarking,Data processing.,MarIA,Natural language processing,Spanish language modelling,Spanish language resources,Tractament del llenguatge natural (Informàtica),Àrees temàtiques de la UPC::Informàtica::Intel·ligència artificial::Llenguatge natural}, publisher = {Sociedad Española para el Procesamiento del Lenguaje Natural}, title = {MarIA: Spanish Language Models}, volume = {68}, url = {https://upcommons.upc.edu/handle/2117/367156#.YyMTB4X9A-0.mendeley}, year = {2022}, } ``` ### Disclaimer The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models. Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables. Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial. 
En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
{"language": ["es"], "license": "apache-2.0", "tags": ["national library of spain", "spanish", "bne", "capitel", "ner"], "datasets": ["bne", "capitel"], "metrics": ["f1"], "inference": {"parameters": {"aggregation_strategy": "first"}}, "widget": ["Me llamo Francisco Javier y vivo en Madrid.", "Mi hermano Ram\u00f3n y su mejor amigo Luis trabajan en el BSC."], "model-index": [{"name": "roberta-base-bne-capiter-ner", "results": [{"task": {"type": "token-classification"}, "dataset": {"name": "CAPITEL-NERC", "type": "ner"}, "metrics": [{"type": "f1", "value": 0.896, "name": "F1"}]}]}]}
token-classification
PlanTL-GOB-ES/roberta-base-bne-capitel-ner
[ "transformers", "pytorch", "roberta", "token-classification", "national library of spain", "spanish", "bne", "capitel", "ner", "es", "dataset:bne", "dataset:capitel", "arxiv:1907.11692", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
2022-03-02T23:29:04+00:00
[ "1907.11692" ]
[ "es" ]
TAGS #transformers #pytorch #roberta #token-classification #national library of spain #spanish #bne #capitel #ner #es #dataset-bne #dataset-capitel #arxiv-1907.11692 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #has_space #region-us
Spanish RoBERTa-base trained on BNE finetuned for CAPITEL Named Entity Recognition (NER) dataset. ================================================================================================= Table of contents ----------------- Click to expand * Model description * Intended uses and limitations * How to use * Limitations and bias * Training * Training + Training data + Training procedure * Evaluation * Evaluation + Variable and metrics + Evaluation results * Additional information + Author + Contact information + Copyright + Licensing information + Funding + Citing information + Disclaimer Model description ----------------- The roberta-base-bne-capitel-ner is a Named Entity Recognition (NER) model for the Spanish language fine-tuned from the roberta-base-bne model, a RoBERTa base model pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text, processed for this work, compiled from the web crawlings performed by the National Library of Spain (Biblioteca Nacional de España) from 2009 to 2019. Intended uses and limitations ----------------------------- roberta-base-bne-capitel-ner model can be used to recognize Named Entities (NE). The model is limited by its training dataset and may not generalize well for all use cases. How to use ---------- Limitations and bias -------------------- At the time of submission, no measures have been taken to estimate the bias embedded in the model. However, we are well aware that our models may be biased since the corpora have been collected using crawling techniques on multiple web sources. We intend to conduct research in these areas in the future, and if completed, this model card will be updated. Training -------- The dataset used for training and evaluation is the one from the CAPITEL competition at IberLEF 2020 (sub-task 1). ### Training procedure The model was trained with a batch size of 16 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set. Evaluation ---------- ### Variable and metrics This model was finetuned maximizing F1 score. Evaluation results ------------------ We evaluated the roberta-base-bne-capitel-ner on the CAPITEL-NERC test set against standard multilingual and monolingual baselines: For more details, check the fine-tuning and evaluation scripts in the official GitHub repository. Additional information ---------------------- ### Author Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL) ### Contact information For further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL) ### Copyright Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022) ### Licensing information Apache License, Version 2.0 ### Funding This work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL. ### Citing information If you use this model, please cite our paper: ### Disclaimer The models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions. 
When third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence. In no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models. Los modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables. Cuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial. En ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos.
[ "### Training procedure\n\n\nThe model was trained with a batch size of 16 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.\n\n\nEvaluation\n----------", "### Variable and metrics\n\n\nThis model was finetuned maximizing F1 score.\n\n\nEvaluation results\n------------------\n\n\nWe evaluated the roberta-base-bne-capitel-ner on the CAPITEL-NERC test set against standard multilingual and monolingual baselines:\n\n\n\nFor more details, check the fine-tuning and evaluation scripts in the official GitHub repository.\n\n\nAdditional information\n----------------------", "### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)", "### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)", "### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)", "### Licensing information\n\n\nApache License, Version 2.0", "### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.", "### Citing information\n\n\nIf you use this model, please cite our paper:", "### Disclaimer\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos." ]
[ "TAGS\n#transformers #pytorch #roberta #token-classification #national library of spain #spanish #bne #capitel #ner #es #dataset-bne #dataset-capitel #arxiv-1907.11692 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "### Training procedure\n\n\nThe model was trained with a batch size of 16 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.\n\n\nEvaluation\n----------", "### Variable and metrics\n\n\nThis model was finetuned maximizing F1 score.\n\n\nEvaluation results\n------------------\n\n\nWe evaluated the roberta-base-bne-capitel-ner on the CAPITEL-NERC test set against standard multilingual and monolingual baselines:\n\n\n\nFor more details, check the fine-tuning and evaluation scripts in the official GitHub repository.\n\n\nAdditional information\n----------------------", "### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)", "### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)", "### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)", "### Licensing information\n\n\nApache License, Version 2.0", "### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.", "### Citing information\n\n\nIf you use this model, please cite our paper:", "### Disclaimer\n\n\nThe models published in this repository are intended for a generalist purpose and are available to third parties. These models may have bias and/or any other undesirable distortions.\n\n\nWhen third parties, deploy or provide systems and/or services to other parties using any of these models (or using systems based on these models) or become users of the models, they should note that it is their responsibility to mitigate the risks arising from their use and, in any event, to comply with applicable regulations, including regulations regarding the use of artificial intelligence.\n\n\nIn no event shall the owner of the models (SEDIA – State Secretariat for digitalization and artificial intelligence) nor the creator (BSC – Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties of these models.\n\n\nLos modelos publicados en este repositorio tienen una finalidad generalista y están a disposición de terceros. Estos modelos pueden tener sesgos y/u otro tipo de distorsiones indeseables.\n\n\nCuando terceros desplieguen o proporcionen sistemas y/o servicios a otras partes usando alguno de estos modelos (o utilizando sistemas basados en estos modelos) o se conviertan en usuarios de los modelos, deben tener en cuenta que es su responsabilidad mitigar los riesgos derivados de su uso y, en todo caso, cumplir con la normativa aplicable, incluyendo la normativa en materia de uso de inteligencia artificial.\n\n\nEn ningún caso el propietario de los modelos (SEDIA – Secretaría de Estado de Digitalización e Inteligencia Artificial) ni el creador (BSC – Barcelona Supercomputing Center) serán responsables de los resultados derivados del uso que hagan terceros de estos modelos." ]
[ 94, 65, 93, 28, 37, 22, 12, 34, 16, 363 ]
[ "passage: TAGS\n#transformers #pytorch #roberta #token-classification #national library of spain #spanish #bne #capitel #ner #es #dataset-bne #dataset-capitel #arxiv-1907.11692 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #has_space #region-us \n### Training procedure\n\n\nThe model was trained with a batch size of 16 and a learning rate of 5e-5 for 5 epochs. We then selected the best checkpoint using the downstream task metric in the corresponding development set and then evaluated it on the test set.\n\n\nEvaluation\n----------### Variable and metrics\n\n\nThis model was finetuned maximizing F1 score.\n\n\nEvaluation results\n------------------\n\n\nWe evaluated the roberta-base-bne-capitel-ner on the CAPITEL-NERC test set against standard multilingual and monolingual baselines:\n\n\n\nFor more details, check the fine-tuning and evaluation scripts in the official GitHub repository.\n\n\nAdditional information\n----------------------### Author\n\n\nText Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@URL)### Contact information\n\n\nFor further information, send an email to [plantl-gob-es@URL](mailto:plantl-gob-es@URL)### Copyright\n\n\nCopyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)### Licensing information\n\n\nApache License, Version 2.0### Funding\n\n\nThis work was funded by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) within the framework of the Plan-TL.### Citing information\n\n\nIf you use this model, please cite our paper:" ]
[ -0.07585026323795319, 0.19944019615650177, -0.006682501174509525, 0.06405288726091385, 0.09793897718191147, -0.021713469177484512, 0.06707217544317245, 0.10520128905773163, -0.03451083227992058, 0.12209078669548035, 0.0003762955020647496, 0.028087778016924858, 0.1104251891374588, 0.11339676380157471, 0.04493899643421173, -0.20550237596035004, -0.006204560864716768, -0.09211728721857071, -0.020818721503019333, 0.08619023114442825, 0.10802435129880905, -0.04921187460422516, 0.0625118836760521, -0.03191784769296646, -0.01746099628508091, 0.07886631786823273, -0.051285892724990845, -0.06616541743278503, 0.057007767260074615, 0.06577404588460922, 0.049342408776283264, -0.0026068990118801594, 0.05684199929237366, -0.24295833706855774, 0.014916972257196903, 0.0396483913064003, 0.010322573594748974, 0.05992455035448074, 0.07996100932359695, -0.05583393573760986, 0.19140228629112244, -0.11514552682638168, 0.03187524154782295, 0.052925676107406616, -0.11134178191423416, -0.11360352486371994, -0.12658563256263733, 0.0762806385755539, 0.08710607141256332, 0.032403044402599335, -0.021022915840148926, 0.056204456835985184, -0.036252494901418686, 0.020464936271309853, 0.07826410233974457, -0.20077362656593323, -0.05551319569349289, 0.06265673786401749, 0.026825416833162308, 0.13292396068572998, -0.08461416512727737, -0.0013750351499766111, 0.05852971598505974, -0.0027175219729542732, -0.007494792342185974, -0.025637099519371986, -0.05035655200481415, 0.03377259150147438, -0.11329955607652664, -0.09505710005760193, 0.13809825479984283, 0.006185778416693211, -0.07439353317022324, -0.10590095818042755, -0.02539384551346302, -0.018944459035992622, 0.012893849052488804, -0.03501284122467041, 0.04477957636117935, -0.010219691321253777, 0.08960756659507751, -0.06675466895103455, -0.0872432291507721, -0.025726620107889175, -0.03870460391044617, 0.06127926707267761, 0.019626138731837273, -0.012401538901031017, 0.035782426595687866, 0.12896083295345306, 0.02531169354915619, -0.13481462001800537, -0.013180273585021496, -0.002744769910350442, -0.03813215717673302, -0.0354873463511467, 0.029849043115973473, -0.026628997176885605, 0.0759790763258934, 0.19004519283771515, -0.04397980123758316, 0.017667805776000023, -0.043100278824567795, 0.009658127091825008, 0.08845023810863495, 0.16849687695503235, -0.07367128133773804, -0.10121163725852966, 0.00982630904763937, 0.008319588378071785, 0.040276773273944855, 0.013887946493923664, -0.043044354766607285, 0.01937648095190525, 0.023631639778614044, 0.14053058624267578, 0.08516260981559753, -0.043677959591150284, -0.07883226871490479, -0.02995966002345085, 0.1408863216638565, -0.1621503382921219, 0.036628760397434235, 0.00583074102178216, -0.06448525190353394, 0.048446089029312134, -0.04443033039569855, -0.0376027412712574, -0.10323477536439896, 0.02252546139061451, -0.057630475610494614, -0.022449875250458717, -0.08332353085279465, -0.037592675536870956, 0.07314154505729675, -0.041403040289878845, -0.018696457147598267, -0.07413917779922485, -0.08383583277463913, -0.07991402596235275, 0.06740909814834595, -0.12507112324237823, 0.0001823700004024431, -0.04043208062648773, -0.0045099398121237755, -0.000005684605639544316, -0.03868614509701729, 0.031017113476991653, -0.05598249286413193, 0.04961547628045082, 0.029244927689433098, 0.036934949457645416, 0.032827991992235184, -0.0005628662765957415, -0.10251867771148682, 0.030035782605409622, -0.17451776564121246, 0.0925409123301506, -0.10398765653371811, 0.027697637677192688, -0.17587757110595703, -0.027809642255306244, 
0.002363471081480384, 0.013329656794667244, 0.04402531683444977, 0.16394993662834167, -0.11771085858345032, -0.030228696763515472, 0.14335481822490692, -0.04818069934844971, -0.060825277119874954, 0.11337515711784363, 0.010046698153018951, 0.06817108392715454, 0.07011285424232483, 0.10241924226284027, 0.1004207655787468, -0.16185343265533447, -0.04139413684606552, -0.0022737092804163694, -0.002476379508152604, 0.07083342969417572, 0.10808823257684708, -0.0809101089835167, 0.0595017708837986, 0.047192685306072235, -0.11772511154413223, -0.014463867992162704, -0.00037408931530080736, -0.057664766907691956, 0.026856306940317154, 0.00005824077743454836, -0.05371987074613571, -0.001933784456923604, -0.02637973055243492, -0.01701435074210167, -0.09693819284439087, 0.017532063648104668, 0.04836418107151985, 0.006094496231526136, 0.02548881806433201, -0.08261232078075409, 0.08734011650085449, -0.045834410935640335, 0.004869792144745588, -0.17576290667057037, -0.0126394834369421, 0.03480374813079834, -0.06425464153289795, 0.12291926145553589, -0.04635351896286011, 0.013903282582759857, 0.021173903718590736, -0.04326007515192032, -0.001815057243220508, -0.010816811583936214, -0.05025020241737366, 0.006740070413798094, -0.15286138653755188, 0.00018754025222733617, -0.030152413994073868, 0.07781127840280533, -0.09780008345842361, 0.01403548289090395, 0.08233197778463364, 0.10001329332590103, -0.0047377776354551315, -0.031246135011315346, 0.002098320983350277, 0.014097527600824833, -0.03343329206109047, -0.07137409597635269, 0.021076921373605728, -0.0010206765728071332, -0.016419019550085068, -0.012944799847900867, -0.06756024807691574, -0.0542423315346241, 0.07314679026603699, 0.11093080043792725, -0.04930313304066658, -0.03878707438707352, -0.0420723631978035, 0.00521907489746809, -0.042055968195199966, -0.02269301377236843, 0.1803993135690689, 0.031776752322912216, 0.07089346647262573, -0.13086311519145966, -0.08094072341918945, -0.011304505169391632, -0.05002589151263237, -0.08762402087450027, 0.1378975510597229, 0.04137279465794563, -0.07151485979557037, 0.089409738779068, 0.011476491577923298, 0.03489702194929123, 0.18597787618637085, -0.006811691448092461, -0.10572934150695801, -0.04389631748199463, 0.08307233452796936, 0.03613656386733055, 0.1103813499212265, -0.028161536902189255, 0.009472728706896305, 0.03654564917087555, 0.016905467957258224, 0.06952062249183655, -0.11590322107076645, 0.04135248437523842, -0.005970723461359739, -0.07030492275953293, 0.026116574183106422, -0.0021865402814000845, -0.02938210591673851, 0.09254081547260284, 0.06257469952106476, 0.05612826347351074, -0.06288626044988632, -0.03604915365576744, -0.10248776525259018, 0.1438489705324173, -0.10786846280097961, -0.22789457440376282, -0.22550047934055328, 0.017564062029123306, -0.043215230107307434, 0.02551569603383541, 0.020235909149050713, -0.0767107829451561, -0.07272879779338837, -0.08261466771364212, 0.03815311938524246, 0.0762040764093399, -0.08774708211421967, -0.020287346094846725, 0.06570103019475937, -0.003599913790822029, -0.1167655736207962, 0.0019148854771628976, 0.04557669907808304, -0.044183265417814255, -0.05802958831191063, 0.004867394920438528, 0.1159953773021698, 0.11168216168880463, 0.02080259844660759, -0.013244129717350006, -0.0028774007223546505, 0.1648399531841278, -0.12865345180034637, 0.06814020872116089, 0.19301851093769073, 0.042662203311920166, 0.02874954417347908, 0.18023616075515747, 0.017803726717829704, -0.06369276344776154, 0.017332863062620163, 0.022091850638389587, 
-0.03404783457517624, -0.27412357926368713, -0.08157455176115036, -0.05348722264170647, -0.01621156930923462, 0.0644913837313652, 0.08493402600288391, -0.03499647229909897, 0.011186523362994194, -0.06137993931770325, -0.057241640985012054, 0.0446554534137249, 0.10399623960256577, 0.0740344449877739, -0.0008647937793284655, 0.012194592505693436, -0.07509259134531021, -0.044021401554346085, 0.1448148638010025, 0.11523284018039703, 0.12077178061008453, 0.011845240369439125, 0.14551134407520294, 0.060639578849077225, 0.06955868750810623, -0.048863038420677185, 0.019780585542321205, 0.021131953224539757, 0.014750504866242409, -0.03353323042392731, -0.07411467283964157, -0.03579387068748474, 0.01779487542808056, 0.021021690219640732, -0.019953252747654915, -0.007899974472820759, -0.12337327748537064, 0.09194313734769821, 0.11869660764932632, 0.014940442517399788, -0.15232983231544495, -0.0534730926156044, 0.059178080409765244, -0.07630640268325806, -0.08420472592115402, -0.020814258605241776, 0.03490643948316574, -0.1478179544210434, 0.03003797121345997, -0.012162730097770691, 0.10318569839000702, -0.0577067956328392, -0.03429019823670387, -0.016587233170866966, 0.06861754506826401, 0.0029799763578921556, 0.10488098114728928, -0.12228834629058838, 0.14600379765033722, 0.016474705189466476, 0.09163491427898407, -0.03575599938631058, 0.0613151378929615, -0.02753518521785736, -0.002794809639453888, 0.12790825963020325, 0.0023126129526644945, -0.04637187346816063, -0.05758501589298248, -0.06381627917289734, 0.016409872099757195, 0.06470810621976852, -0.11537963896989822, 0.09746254980564117, -0.010236983187496662, -0.016474194824695587, -0.09863288700580597, -0.13173803687095642, -0.08491448312997818, -0.19121596217155457, 0.020793942734599113, -0.1286533921957016, 0.058373965322971344, -0.044042620807886124, -0.036981116980314255, 0.0072448039427399635, 0.21188600361347198, -0.1930503100156784, -0.07893691211938858, -0.13413099944591522, 0.02455674484372139, 0.13976697623729706, -0.07225734740495682, 0.03356736898422241, -0.043031059205532074, 0.099089115858078, 0.013842334970831871, -0.035672158002853394, 0.03265281021595001, -0.05841933190822601, -0.10836657136678696, -0.024545026943087578, 0.16125456988811493, 0.0697517916560173, 0.037770092487335205, -0.00032509106677025557, -0.0139307277277112, 0.010005746968090534, -0.11185013502836227, -0.05493148788809776, 0.1015656441450119, 0.10872893780469894, 0.07361307740211487, -0.03970262408256531, -0.15518982708454132, -0.11454053968191147, -0.05619730427861214, 0.058497652411460876, 0.2070065587759018, -0.013846853747963905, 0.11739419400691986, 0.1771131008863449, -0.12836487591266632, -0.15695755183696747, -0.06097878888249397, 0.039218802005052567, 0.02401205524802208, 0.0384063757956028, -0.1735159158706665, 0.023378022015094757, 0.08099231123924255, -0.014320947229862213, 0.01591769978404045, -0.2677444815635681, -0.11312411725521088, 0.010731781832873821, 0.040999073535203934, -0.08996421098709106, -0.11841261386871338, -0.09117050468921661, -0.06487035751342773, -0.11862403899431229, 0.07399848103523254, 0.018559908494353294, 0.041266124695539474, -0.0013793305261060596, 0.01815945841372013, 0.033637017011642456, -0.017930680885910988, 0.19808362424373627, -0.04020993411540985, 0.032538462430238724, -0.028485557064414024, 0.011226329021155834, 0.09372975677251816, -0.021316971629858017, 0.11664418131113052, -0.018892118707299232, 0.017469950020313263, -0.1017269566655159, -0.04389113187789917, -0.029075291007757187, 0.03963639214634895, 
-0.054093584418296814, -0.02218053862452507, -0.10954071581363678, 0.09886884689331055, 0.0458267480134964, -0.007649402599781752, 0.03518182039260864, -0.07191547751426697, -0.00242570205591619, 0.16016998887062073, 0.14060072600841522, 0.05389503017067909, -0.003242855193093419, -0.0022414252161979675, -0.0009872263763099909, 0.020910052582621574, -0.15916608273983002, 0.015496646985411644, 0.12077596038579941, 0.0010228469036519527, 0.07126455754041672, -0.034613821655511856, -0.1447468250989914, -0.0024254280142486095, 0.13586477935314178, -0.048222173005342484, -0.13435843586921692, -0.007788360584527254, -0.007606444414705038, -0.13196690380573273, -0.01441681943833828, 0.11096881330013275, 0.014618377201259136, -0.06603560596704483, 0.031250983476638794, 0.06281685829162598, -0.030223144218325615, 0.11939958482980728, 0.04250190779566765, 0.03419806435704231, -0.054333675652742386, 0.11735662817955017, 0.11284443736076355, -0.12664556503295898, -0.021054886281490326, 0.10562072694301605, -0.04814106598496437, -0.034869372844696045, 0.049440305680036545, 0.013907073065638542, -0.09419026225805283, -0.053757958114147186, -0.06452915817499161, -0.04080101475119591, 0.005903239361941814, 0.06893083453178406, 0.028379397466778755, 0.029084425419569016, -0.002142922254279256, 0.03441896662116051, -0.04286827892065048, 0.08812115341424942, 0.06932207196950912, 0.011916532181203365, -0.07393143326044083, -0.010104500688612461, 0.0019975544419139624, -0.023929277434945107, -0.017334170639514923, -0.019154272973537445, -0.1152363047003746, -0.0018830554326996207, -0.059756021946668625, 0.01885136403143406, -0.08209840208292007, -0.0014174104435369372, -0.00847305916249752, -0.04958464205265045, -0.05295393988490105, -0.0014355508610606194, -0.03213031217455864, -0.04044793173670769, -0.0585670992732048, 0.12258478254079819, -0.14323855936527252, 0.04607253521680832, 0.09345274418592453, -0.08153221756219864, 0.07532739639282227, -0.020697709172964096, 0.014926855452358723, 0.08737293630838394, -0.2028064876794815, 0.03949010372161865, -0.004053490702062845, 0.05734117701649666, 0.034624531865119934, -0.11919920891523361, 0.03636706620454788, 0.04255774989724159, -0.01730029098689556, 0.02403072454035282, 0.023563895374536514, -0.10776413977146149, 0.018730027601122856, -0.009508254937827587, -0.0847579762339592, -0.05247718095779419, 0.09818419069051743, 0.11461865901947021, 0.011481794528663158, 0.09675277769565582, -0.06572552025318146, 0.006582938134670258, -0.14895540475845337, -0.007437614258378744, -0.004424565471708775, 0.021068399772047997, -0.014512828551232815, -0.057229120284318924, 0.04089944809675217, 0.02663634903728962, 0.16831158101558685, 0.055185940116643906, 0.09116563200950623, 0.043594226241111755, 0.014654128812253475, 0.03885287418961525, 0.03764617070555687, 0.06491342931985855, 0.01627633534371853, 0.02503541298210621, -0.025628022849559784, -0.015659336000680923, -0.05003977194428444, -0.08271032571792603, 0.06168888881802559, 0.14093546569347382, 0.08382178097963333, 0.046939704567193985, 0.010057950392365456, -0.04566868022084236, -0.03166453540325165, 0.0540592260658741, -0.008550037629902363, 0.001806596526876092, -0.04603080078959465, 0.10079946368932724, 0.19575220346450806, -0.2069500833749771, 0.09265398979187012, -0.013010391965508461, -0.06745318323373795, -0.06554412096738815, -0.21677646040916443, -0.01913437992334366, -0.09936034679412842, 0.020223550498485565, -0.10314922034740448, 0.07938658446073532, 0.02711280807852745, 0.0032040709629654884, 
-0.05906523019075394, 0.07129096984863281, -0.06008422374725342, -0.11316995322704315, 0.04666478931903839, 0.0401359498500824, 0.09546343982219696, 0.005648995749652386, 0.08209114521741867, -0.010463215410709381, 0.06880250573158264, 0.0841967836022377, 0.10947348922491074, 0.04387378692626953, 0.010884223505854607, -0.07125426083803177, -0.032989367842674255, 0.021008197218179703, -0.006973247975111008, 0.0024008434265851974, 0.2104199230670929, 0.03041822277009487, -0.034293338656425476, 0.01572011038661003, 0.26597610116004944, -0.023242240771651268, -0.06923571974039078, -0.14621539413928986, 0.12951985001564026, 0.021901048719882965, 0.05498005822300911, 0.008646456524729729, -0.12766912579536438, -0.030306916683912277, 0.12902122735977173, 0.112310990691185, -0.003097795881330967, -0.03517073020339012, -0.00566820940002799, 0.017555981874465942, -0.006801809649914503, 0.05945684760808945, 0.041921216994524, 0.25988101959228516, -0.058032404631376266, 0.04153900593519211, -0.03267686814069748, -0.006086878478527069, -0.07095721364021301, 0.1066814437508583, -0.04545094072818756, -0.016768047586083412, -0.050098568201065063, 0.15328457951545715, -0.0753398984670639, -0.25804540514945984, 0.05919024348258972, -0.04894154891371727, -0.14627033472061157, -0.001861223136074841, 0.07418020814657211, -0.006354881450533867, 0.036948300898075104, 0.04154213145375252, -0.03029300644993782, 0.10181120038032532, 0.03148200735449791, -0.03693930059671402, -0.07388776540756226, 0.05647333338856697, -0.08312361687421799, 0.24065138399600983, -0.006466851569712162, 0.06782388687133789, 0.10661477595567703, -0.03680048882961273, -0.15277965366840363, 0.020701702684164047, 0.0631227120757103, -0.03918559476733208, 0.1043403297662735, 0.06870556622743607, 0.02976716123521328, 0.030928295105695724, 0.07769449055194855, 0.060722481459379196, 0.03502471372485161, 0.009661269374191761, 0.05716433748602867, -0.14755448698997498, 0.10213761776685715, -0.13863739371299744, 0.09533240646123886, 0.09407467395067215, -0.043388646095991135, 0.06154349818825722, -0.06644996255636215, 0.08044268935918808, -0.0037116724997758865, 0.18300174176692963, 0.03428064286708832, -0.17441345751285553, 0.010566381737589836, -0.010480799712240696, 0.03296803683042526, -0.19906340539455414, -0.03334329277276993, 0.033688850700855255, 0.006687354762107134, -0.06438753753900528, 0.1363740712404251, 0.028165196999907494, 0.010065476410090923, -0.011561654508113861, -0.18516546487808228, -0.0103403739631176, 0.07677952200174332, -0.0960560142993927, -0.005999766290187836 ]